nep-cmp New Economics Papers
on Computational Economics
Issue of 2010‒07‒31
fifteen papers chosen by
Stan Miles
Thompson Rivers University

  1. Robust Portfolio Optimization with a Hybrid Heuristic Algorithm By Björn Fastrich; Peter Winker
  2. Predicting bank loan recovery rates with neural networks By Joao A. Bastos
  3. Effect of perceived discrimination on career trajectories (Effet du sentiment de discrimination sur les trajectoires professionnelles) By Olivier Joseph; Séverine Lemière; Laurence Lizé; Patrick Rousset
  4. Economics of export taxation in a context of food crisis By Bouet, Antoine; Laborde Debucquet, David
  5. The EAGLE. A model for policy analysis of macroeconomic interdependence in the euro area By Sandra Gomes; P. Jacquinot; M. Pisani
  6. Lagrangean decomposition for large-scale two-stage stochastic mixed 0-1 problems. By Laureano F. Escudero Bueno; María Araceli Garín Martín; Gloria Pérez Sainz de Rozas; Aitziber Unzueta Inchaurbe
  7. An organization that transmits opinion to newcomers By Juliette Rouchier; Paola Tubaro
  8. Speeding Up the Estimation of Expected Maximum Flows Through Reliable Networks By Megha Sharma; Diptesh Ghosh
  9. Eight years of Doha trade talks By Bouet, Antoine; Laborde Debucquet, David
  10. On a learning precedence graph concept for the automotive industry By Christian Otto; Armin Scholl; H. Klindworth
  11. Evolutionary Stability of First Price Auctions By Fernando Louge; Frank Riedel
  12. A Time-Invariant Duration Policy under the Zero Lower Bound By Kozo Ueda
  13. Hedonic Wage Equilibrium: Theory, Evidence and Policy By Kniesner, Thomas J.; Leeth, John D.
  14. The tax treatment of company cars, commuting and optimal congestion taxes By De Borger B.; Wuyts B.
  15. World Food Prices and Monetary Policy By Roberto Chang; Luis Catão

  1. By: Björn Fastrich; Peter Winker
    Abstract: Estimation errors in both the expected returns and the covariance matrix hamper the construction of reliable portfolios within the Markowitz framework. Robust techniques that incorporate the uncertainty about the unknown parameters have been suggested in the literature. We propose a modification as well as an extension of such a technique and compare both with another robust approach. In order to eliminate oversimplifications of Markowitz’ portfolio theory, we generalize the optimization framework to better emulate a realistic investment environment. Because the adjusted optimization problem is no longer solvable with standard algorithms, we employ a hybrid heuristic to tackle it. Our empirical analysis is conducted with a moving time window for returns of the German stock index DAX100. All three robust approaches yield more stable portfolio compositions than the original Markowitz framework. Moreover, the out-of-sample risk of the robust approaches is lower and less volatile while their returns are not necessarily smaller.
    Keywords: Hybrid heuristic algorithm, Markowitz, Robust optimization, Uncertainty sets.
    Date: 2010–07–27
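As an illustrative aside not drawn from the paper: the abstract does not specify the hybrid heuristic, but the general idea of heuristic search over a cardinality-constrained Markowitz portfolio can be sketched with a much simpler random local search. All data, parameter values, and names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: expected returns and a covariance matrix for 10 assets
n = 10
mu = rng.normal(0.05, 0.02, n)
S = rng.normal(size=(n, n))
Sigma = S @ S.T / n                         # symmetric positive semidefinite

def objective(w, risk_aversion=5.0):
    # Classic Markowitz trade-off: penalised variance minus expected return
    return risk_aversion * w @ Sigma @ w - mu @ w

def local_search(k=4, iters=5000):
    """Toy heuristic for a cardinality-constrained portfolio (hold exactly k
    assets, equal weights): randomly swap one held asset for an unheld one
    and keep the move whenever it improves the objective."""
    held = list(rng.choice(n, size=k, replace=False))
    def weights(h):
        w = np.zeros(n); w[h] = 1.0 / k; return w
    best = objective(weights(held))
    for _ in range(iters):
        cand = held.copy()
        cand[rng.integers(k)] = rng.choice([a for a in range(n) if a not in held])
        val = objective(weights(cand))
        if val < best:
            held, best = cand, val
    return weights(held), best

w, val = local_search()
```

The paper's hybrid heuristic is certainly more elaborate; this sketch only shows why a heuristic is needed at all once a cardinality constraint makes the problem non-convex.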
  2. By: Joao A. Bastos (CEMAPRE, School of Economics and Management (ISEG), Technical University of Lisbon)
    Abstract: This study evaluates the performance of feed-forward neural networks to model and forecast recovery rates of defaulted bank loans. In order to guarantee that the predictions are mapped into the unit interval, the neural networks are implemented with a logistic activation function in the output neuron. The statistical relevance of explanatory variables is assessed using the bootstrap technique. The results indicate that the variables which the neural network models use to derive their output coincide, to a great extent, with those that are significant in parametric fractional regression models. Out-of-sample estimates of prediction errors are evaluated. The results suggest that neural networks may have better predictive ability than fractional regression models, provided the number of observations is sufficiently large.
    Keywords: Loss given default, Recovery rate, Forecasting, Bank loan, Fractional regression, Neural network
    JEL: G21 G33
    Date: 2010–07
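The logistic-output construction mentioned in the abstract can be illustrated with a minimal sketch (toy data and network sizes are invented, not the paper's): a feed-forward pass whose output neuron uses the logistic function, so every predicted recovery rate lies in the unit interval by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    # Logistic activation: maps any real number into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    # One hidden layer with tanh units and a logistic output neuron
    h = np.tanh(X @ W1 + b1)
    return sigmoid(h @ W2 + b2)

# Toy data: 5 loans, 3 explanatory variables (hypothetical)
X = rng.normal(size=(5, 3))
W1 = rng.normal(size=(3, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)

preds = forward(X, W1, b1, W2, b2)
# Every predicted recovery rate lies strictly inside (0, 1)
assert np.all((preds > 0) & (preds < 1))
```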
  3. By: Olivier Joseph (Céreq); Séverine Lemière (IUT Paris Descartes et Centre d'Economie de la Sorbonne); Laurence Lizé (Centre d'Economie de la Sorbonne); Patrick Rousset (Céreq)
    Abstract: This paper deals with young people who report being discriminated against because of their ethnicity or colour. The aim is to evaluate the effect of perceived discrimination on the professional paths of young people seven years after leaving school. We use the Céreq Génération 97 survey (at seven years) and a clustering method based on self-organizing maps (the Kohonen algorithm). Eight classes of professional paths are identified, and two kinds of segmentation emerge. In the "inter-class" segmentation, young people who report discrimination are over-represented in the temporary-work and unemployment classes. This finding is corroborated by qualitative interviews with young people who report discrimination, in which they explain their refusal of victimization in the labour market. In the "intra-class" segmentation, many inequalities exist within classes: working full time or becoming a manager is more difficult for young people who report discrimination, even when their professional path is otherwise good.
    Keywords: Professional path, feeling of discrimination, classification, segmentation, ethnicity.
    JEL: J71
    Date: 2010–07
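A minimal sketch of the clustering technique named in the abstract, a one-dimensional Kohonen self-organizing map, applied to artificial two-cluster data. All parameters and the toy data are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def train_som(data, n_nodes=8, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Minimal 1-D Kohonen self-organizing map (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    nodes = rng.normal(size=(n_nodes, data.shape[1]))
    for t in range(epochs):
        lr = lr0 * (1 - t / epochs)                   # decaying learning rate
        sigma = max(sigma0 * (1 - t / epochs), 0.5)   # shrinking neighbourhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(nodes - x, axis=1))  # best-matching unit
            # Gaussian neighbourhood on the 1-D grid of node indices
            h = np.exp(-((np.arange(n_nodes) - bmu) ** 2) / (2 * sigma ** 2))
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

def classify(data, nodes):
    # Each observation is assigned to its best-matching node (its "class")
    return np.array([np.argmin(np.linalg.norm(nodes - x, axis=1)) for x in data])

# Two artificial clusters standing in for distinct career-path profiles
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
nodes = train_som(data)
labels = classify(data, nodes)
```

The survey's eight classes of professional paths correspond, in this analogy, to groups of observations sharing a best-matching node.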
  4. By: Bouet, Antoine; Laborde Debucquet, David
    Abstract: This paper aims to assess the rationales for the use of export taxes, in particular in the context of a food crisis. First, we summarize the effects of export taxes using both partial and general equilibrium theoretical models. When large countries have an objective of constant domestic food prices, in the event of an increase in world agricultural prices the optimal response is to decrease import tariffs in net food-importing countries and to increase export tariffs in net food-exporting countries. The latter decision is welfare improving while the former is welfare reducing: it is the price to pay to keep domestic food prices constant. Small countries are harmed by both decisions. Second, we illustrate the costs of a lack of cooperation on, and binding regulation of, such policies in a time of crisis, using a global computable general equilibrium (CGE) model that mimics the mechanisms at work during the recent food price surge. We conclude with a call for international regulation, in particular because small net food-importing countries may be substantially harmed by these beggar-thy-neighbor policies, which amplify the already negative impact of the food crisis.
    Keywords: Computable general equilibrium (CGE), export taxes, Food crisis, optimum tariff,
    Date: 2010
  5. By: Sandra Gomes; P. Jacquinot; M. Pisani
    Abstract: Building on the New Area Wide Model, we develop a 4-region macroeconomic model of the euro area and the world economy. The model (EAGLE, Euro Area and Global Economy model) is microfounded and designed for conducting quantitative policy analysis of macroeconomic interdependence across regions belonging to the euro area and between euro area regions and the world economy. Simulation analysis shows the transmission mechanism of region-specific or common shocks, originating in the euro area and abroad.
    JEL: C53 E32 E52 F47
    Date: 2010
  6. By: Laureano F. Escudero Bueno (Dpto. Estadística e Investigación Operativa. Universidad Rey Juan Carlos, Madrid.); María Araceli Garín Martín (Dpto. Economía Aplicada III. Universidad del País Vasco, Bilbao.); Gloria Pérez Sainz de Rozas (Dpto. Matemática Aplicada Estadística e I.O. Universidad del País Vasco, Leioa.); Aitziber Unzueta Inchaurbe (Dpto. Economía Aplicada III. Universidad del País Vasco, Bilbao.)
    Abstract: In this paper we study solution methods for the dual problem arising from the Lagrangean Decomposition of two-stage stochastic mixed 0-1 models. We represent the two-stage stochastic mixed 0-1 problem by a splitting variable representation of the deterministic equivalent model, where 0-1 and continuous variables appear at any stage. Lagrangean Decomposition is proposed for satisfying both the integrality constraints for the 0-1 variables and the non-anticipativity constraints. We compare the performance of four iterative algorithms based on dual Lagrangean Decomposition schemes: the Subgradient method, the Volume algorithm, the Progressive Hedging algorithm and the Dynamic Constrained Cutting Plane scheme. We test the conditions and properties of convergence on medium- and large-scale stochastic problems. Computational results are reported.
    Keywords: Lagrangean Decomposition, Subgradient method, Volume algorithm, Progressive Hedging algorithm and Dynamic Constrained Cutting Plane method
    JEL: C6 C61 C63
    Date: 2010–07–20
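One of the four schemes compared, the subgradient method, can be illustrated on a deliberately tiny Lagrangean dual: a relaxed binary knapsack rather than a stochastic mixed 0-1 model. The example problem, step rule, and numbers are assumptions for illustration only.

```python
import numpy as np

# Tiny illustrative knapsack: maximise p.x subject to a.x <= b, x binary.
# Relaxing the capacity constraint gives the dual function
#   D(lam) = lam*b + sum_i max(p_i - lam*a_i, 0),
# which we minimise over lam >= 0 with a projected subgradient method.
p = np.array([10.0, 7.0, 5.0, 3.0])
a = np.array([4.0, 3.0, 2.0, 1.0])
b = 5.0

def dual(lam):
    x = (p - lam * a > 0).astype(float)   # inner maximiser for this multiplier
    return lam * b + np.sum(np.maximum(p - lam * a, 0.0)), x

lam, best = 0.0, np.inf
for t in range(1, 201):
    val, x = dual(lam)
    best = min(best, val)                 # best (smallest) dual bound found
    g = b - a @ x                         # subgradient of D at lam
    lam = max(0.0, lam - (1.0 / t) * g)   # diminishing step, projected on lam >= 0

# best is an upper bound on the optimal knapsack value (13 for this instance)
```

The Volume, Progressive Hedging, and cutting-plane schemes differ in how they update the multipliers, but all iterate on this same dual function.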
  7. By: Juliette Rouchier (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - Ecole des Hautes Etudes en Sciences Sociales (EHESS) - CNRS : UMR6579); Paola Tubaro (Department of International Business and Economics - University of Greenwich)
    Abstract: We aim to identify the conditions under which social influence enables the emergence of a shared opinion orientation among members of an organization over time, when membership is subject to continuous but partial turnover. We study an intra-organizational advice network that channels social influence over time, with a flow of joiners and leavers at regular intervals. We have been particularly inspired by a study of the Commercial Court of Paris, a judicial institution whose members are peer-elected businesspeople and are partly replaced every year. We develop an agent-based simulation of advice network evolution which incorporates a model of opinion dynamics based on a refinement of Deffuant's "relative agreement" model, combining opinion with a measure of "uncertainty", or openness to social influence. We focus on the effects on opinion of three factors, namely criteria for advisor selection, duration of membership in the organization, and new members' uncertainty. We show that criteria for interlocutor choice matter: a shared opinion is sustained over time if members select colleagues at least as experienced as themselves. Convergence of opinions appears in other configurations too, but the impact of initial opinion fades in time. Duration has an impact in that the longer the time spent in the group, the stronger the convergence towards a common opinion. Finally, higher uncertainty reinforces convergence while lower uncertainty leads to the coexistence of multiple opinions.
    Keywords: social influence, advice networks, intra-organizational networks, opinion dynamics, agent-based simulation
    Date: 2010–07–16
  8. By: Megha Sharma; Diptesh Ghosh
    Abstract: In this paper we present a strategy for speeding up the estimation of expected maximum flows through reliable networks. Our strategy tries to minimize the repetition of computational effort while evaluating network states sampled using the crude Monte Carlo method. Computational experiments with this strategy on three types of randomly generated networks show that it reduces the number of flow augmentations required for evaluating the states in the sample by as much as 52% on average with a standard deviation of 7% compared to the conventional strategy. This leads to an average time saving of about 71% with a standard deviation of about 8%. [W.P. No. 2009-04-05]
    Keywords: Network Flows; Reliable Networks; Cold Start; Warm Start; Reliable Network Evaluation Strategy
    Date: 2010
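For orientation, the conventional crude Monte Carlo baseline that the paper speeds up can be sketched as follows. Note that this sketch solves every sampled network state from a cold start; the paper's contribution is precisely to avoid repeating that computational effort across states. The network, reliability, and sample size below are invented.

```python
import random
from collections import deque

def max_flow(n, cap, s, t):
    """Edmonds-Karp maximum flow on an n-node capacity matrix."""
    flow = 0
    cap = [row[:] for row in cap]                 # residual capacities
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:              # BFS for an augmenting path
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow
        v, aug = t, float("inf")                  # bottleneck on the path
        while v != s:
            aug = min(aug, cap[parent[v]][v]); v = parent[v]
        v = t
        while v != s:                             # update residual capacities
            cap[parent[v]][v] -= aug
            cap[v][parent[v]] += aug
            v = parent[v]
        flow += aug

def expected_max_flow(n, edges, s, t, reliability, samples=2000, seed=0):
    """Crude Monte Carlo: sample which edges survive, average the max flows."""
    rnd = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        cap = [[0] * n for _ in range(n)]
        for (u, v, c) in edges:
            if rnd.random() < reliability:        # edge operates in this state
                cap[u][v] += c
        total += max_flow(n, cap, s, t)
    return total / samples

# Two parallel unit-capacity arcs from node 0 to node 1, each up with prob 0.9;
# the exact expected maximum flow is 2 * 0.9 = 1.8.
est = expected_max_flow(2, [(0, 1, 1), (0, 1, 1)], 0, 1, 0.9)
```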
  9. By: Bouet, Antoine; Laborde Debucquet, David
    Abstract: In 2001, the World Trade Organization launched a highly ambitious program of multilateral liberalization. Eight years later, concluding the negotiations is uncertain, though an opportunity still exists. Since 2001, many proposals on market access have been brought to the negotiating table by the E.U., the United States, and the G-20. Because it is politically and economically acceptable to many parties, the final December 2008 package could be the basis of an agreement. An evaluation of these various proposals shows how trade negotiations have been following countries’ strategic interests. In eight years, the ambition of the formula to reduce agricultural market access tariffs has increased, but flexibilities added to accommodate domestic political constraints have offset delivered market access. The December 2008 package would reduce these average tariffs by 25 percent, a reduction very close to the one implied by the Harbinson and Girard proposals of 2003. This has to be compared with the 73 percent reduction in world agricultural protection by the very ambitious 2005 U.S. proposal. The 2005 G-20 and E.U. proposals were intermediate outcomes. The December 2008 package implies a reduction of agricultural protection by 6 percentage points in high-income countries and 0.5 percentage points in middle-income countries. If the U.S. proposal had been applied, these figures would have been 12.4 and 4.7, respectively. Different scenarios imply losses for developing countries, reflecting eroding preferences and rising terms of trade for imported commodities, including food products. We study how this trade reform can be more development-friendly.
    Keywords: Computable general equilibrium (CGE) modeling, Developing countries, Trade negotiations, WTO Doha round,
    Date: 2010
  10. By: Christian Otto (School of Economics and Business Administration, Friedrich-Schiller-University Jena); Armin Scholl (School of Economics and Business Administration, Friedrich-Schiller-University Jena); H. Klindworth
    Abstract: Assembly line balancing problems (ALBP) consist in assigning the total workload for manufacturing a product to the stations of an assembly line, as typically applied in the automotive industry. The assignment of tasks to stations is subject to restrictions that can be expressed in a precedence graph. However, automotive manufacturers usually do not know the complete precedence graphs describing the production processes of their models. Unfortunately, the known approaches for graph generation are not suitable for the conditions in the automotive industry. We describe a new graph generation approach that is based on learning from past production sequences and forms a sufficient precedence graph. This graph restricts the ALBP instance more than necessary but guarantees feasible line balances. Computational experiments indicate that the proposed procedure approximates the real precedence graph well enough to detect optimal or nearly optimal solutions for all instances of a benchmark data set. The new approach is thus applicable and effective and might be a major step towards closing the gap between theoretical line balancing research and the practice of assembly line planning.
    Keywords: assembly, line balancing, auto industry, manufacturing process, precedence graph, learning approach, decision support
    Date: 2010–07–16
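A minimal sketch of one plausible learning rule consistent with the abstract's idea (the paper's exact procedure is not specified in the abstract): keep a precedence arc (i, j) only if task i preceded task j in every observed past sequence. The result is a sufficient, possibly over-restrictive graph, so any line balance respecting it is consistent with all observations.

```python
def learn_precedence(sequences):
    """Infer a sufficient precedence relation from observed task sequences:
    keep arc (i, j) only if task i occurred before task j in EVERY sequence."""
    tasks = set(sequences[0])
    arcs = {(i, j) for i in tasks for j in tasks if i != j}
    for seq in sequences:
        pos = {task: k for k, task in enumerate(seq)}   # position of each task
        arcs = {(i, j) for (i, j) in arcs if pos[i] < pos[j]}
    return arcs

# Hypothetical sequences observed in past production runs
obs = [("a", "b", "c", "d"), ("a", "c", "b", "d"), ("a", "b", "d", "c")]
arcs = learn_precedence(obs)
# "a" precedes everything in every run, so arcs from "a" survive;
# "b" and "c" appear in both orders, so no arc between them is learned.
```

Note the learned graph may contain spurious arcs (such as a precedence the data never happened to contradict); more observed sequences shrink it towards the real graph, which is the sense in which the approach "learns".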
  11. By: Fernando Louge (Institute of Mathematical Economics, Bielefeld University); Frank Riedel (Institute of Mathematical Economics, Bielefeld University)
    Abstract: This paper studies the evolutionary stability of the unique Nash equilibrium of a first price sealed bid auction. It is shown that the Nash equilibrium is not asymptotically stable under payoff monotonic dynamics for arbitrary initial populations. In contrast, when the initial population includes a continuum of strategies around the equilibrium, the replicator dynamic does converge to the Nash equilibrium. Simulations are presented for the replicator and Brown-von Neumann-Nash dynamics. They suggest that convergence for the replicator dynamic is slow compared to the Brown-von Neumann-Nash dynamics.
    JEL: C73 D44
    Date: 2010–06
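A hedged sketch of the replicator dynamic simulated in the paper, applied here to a simple Hawk-Dove game with an interior equilibrium rather than to an auction. The payoff matrix, step size, and starting point are illustrative assumptions.

```python
import numpy as np

# Hawk-Dove payoff matrix for the row player; the interior Nash equilibrium
# plays Hawk with probability 0.5.
A = np.array([[-1.0, 2.0],
              [ 0.0, 1.0]])

def replicator_step(p, dt=0.05):
    """One Euler step of the replicator dynamic: each strategy's share grows
    in proportion to its fitness advantage over the population average."""
    f = A @ p                      # fitness of each pure strategy
    return p + dt * p * (f - p @ f)

p = np.array([0.9, 0.1])           # start far from equilibrium
for _ in range(2000):
    p = replicator_step(p)
# p converges to the mixed equilibrium (0.5, 0.5)
```

In the auction setting studied in the paper the strategy space is a continuum of bids rather than two pure strategies, which is exactly why the composition of the initial population matters for convergence.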
  12. By: Kozo Ueda (Director and Senior Economist, Institute for Monetary and Economic Studies, Bank of Japan)
    Abstract: Optimal commitment policy under the zero lower bound entails a high degree of complexity and time-inconsistency in a stochastic economy. This paper proposes a time-invariant duration policy that mitigates those problems and facilitates policy implementation and communication while retaining effectiveness in inflation stabilization. Under the time-invariant duration policy, a central bank commits itself to maintaining low interest rates for some duration even after adverse shocks disappear, but unlike the optimal commitment policy, the duration is independent of the ex post spell of the adverse shocks. Consequently, the time-inconsistency problem does not increase even if the ex post spell of the adverse shocks lengthens, and policy rates are expressed in an extremely simple, explicit form. Simulation results suggest that the time-invariant duration policy performs virtually as effectively as the optimal commitment policy in stabilizing inflation, and far better than a discretionary policy and simple interest rate rules with or without inertia.
    Keywords: Zero lower bound on nominal interest rates, optimal monetary policy, liquidity trap, time-inconsistency
    JEL: E31 E52
    Date: 2010–07
  13. By: Kniesner, Thomas J. (Syracuse University); Leeth, John D. (Bentley University)
    Abstract: We examine theoretically and empirically the properties of the equilibrium wage function and its implications for policy. Our emphasis is on how the researcher approaches economic and policy questions when there is labor market heterogeneity leading to a set of wages. We focus on the application where hedonic models have been most successful at clarifying policy-relevant outcomes and policy effects: the wage premia for fatal injury risk. Estimates of the overall hedonic locus we discuss imply the so-called value of a statistical life (VSL), which is useful as the benefit value in a cost-effectiveness calculation of government programs to enhance personal safety. Additional econometric results described are the multiple dimensions of heterogeneity in VSL, including by age and consumption plans, the latent trait that affects wages and job safety setting choice, and family income. Simulations of hedonic market outcomes are also valuable research tools. To demonstrate the additional usefulness of giving detail to the underlying structure, we not only develop the issue of welfare comparisons theoretically but also illustrate how numerical simulations of the underlying structure can be informative. Using a reasonable set of primitives, we see that job safety regulations are much more limited in their potential for improving workplace safety efficiently than mandatory injury insurance with experience-rated premiums. The simulations reveal how regulations incent some workers to take more dangerous jobs, while workers’ compensation insurance does not (or does so to a lesser extent).
    Keywords: quantile regression, panel data, VSL, job safety, hedonic labor market equilibrium, OSHA, workers’ compensation insurance
    JEL: J2 J3
    Date: 2010–07
  14. By: De Borger B.; Wuyts B.
    Abstract: In Europe, the preferential tax treatment of company cars implies that many employees receive a company car as part of their compensation package. In this paper, we consider a model in which wages and the decision whether or not to provide a company car are the result of direct negotiation between employer and employee. Using this framework, we theoretically and numerically study first- and second-best optimal tax policies on labour and transport markets, focusing on the role of the tax treatment of company cars. We show that higher labour taxes and a more favourable tax treatment of company cars raise the fraction of employees who receive a company car; congestion and congestion tolls reduce it. More importantly, we find that earlier models that ignored the preferential tax treatment of company cars may have substantially underestimated optimal congestion tolls in Europe. The numerical illustration, calibrated using Belgian data, suggests that about one third of the optimal congestion toll is due to the current tax treatment of company cars. We further find that eliminating the preferential tax treatment of company cars is an imperfect -- but easy to implement -- substitute for currently unavailable congestion tolls: it yields about half the welfare gain attainable through optimal congestion taxes. Finally, the favourable tax treatment of company cars justifies large public transport subsidies; the numerical results are consistent with zero public transport fares.
    Date: 2010–07
  15. By: Roberto Chang; Luis Catão
    Abstract: The large swings in world food prices in recent years renew interest in the question of how monetary policy in small open economies should react to such imported price shocks. We examine this issue in a canonical open economy setting with sticky prices and where food plays a distinctive role in utility. We show how world food price shocks affect natural output and other aggregates, and derive a second-order approximation to welfare. Numerical calibrations show broad CPI targeting to be welfare-superior to alternative policy rules once the variance of food price shocks is sufficiently large, as in real-world data.
    Date: 2010–07–13

This nep-cmp issue is ©2010 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.