nep-cmp New Economics Papers
on Computational Economics
Issue of 2010‒05‒02
fourteen papers chosen by
Stan Miles
Thompson Rivers University

  1. The Computation of Perfect and Proper Equilibrium for Finite Games via Simulated Annealing By Stuart McDonald; Liam Wagner
  2. Long-Run Effects of Post-Kyoto Policies: Applying a Fully Dynamic CGE model with Heterogeneous Capital By Lucas Bretschger; Roger Ramer; Florentine Schwark
  3. Land Conversion and Market Equilibrium – Insights from a Simulated Landscape By Richard Iovanna; Colin Vance
  4. Educational Support and Individual Ability with Endogenous Fertility By OGURO Kazumasa; OSHIO Takashi; TAKAHATA Junichiro
  5. Spatial distribution of innovative activities and economic performances: A geographical-friendly model By Eric BROUILLAT (GREThA UMR CNRS 5113); Yannick LUNG (GREThA UMR CNRS 5113)
  6. Classical vs wavelet-based filters: Comparative study and application to business cycle By Ibrahim Ahamada; Philippe Jolivaldt
  7. Competition under Dynamic Lot Sizing Costs with Capacity Acquisition By Hongyan Li; Joern Meissner
  8. Convenient Multiple Directions of Stratification By Benjamin Jourdain; Bernard Lapeyre; Piergiacomo Sabino
  9. A Spatial Agent-Based Model to Explore Scenarios of Adaptation to Climate Change in an Alpine Tourism Destination By Stefano Balbi; Pascal Perez; Carlo Giupponi
  10. Optimal closing of a pair trade with a model containing jumps By Stig Larsson; Carl Lindberg; Marcus Warfheimer
  11. Quantitative Properties of Sovereign Default Models: Solution Methods Matter By Leonardo Martinez; Juan Carlos Hatchondo; Horacio Sapriza
  12. Modularity and Optimality in Social Choice By Gennaro Amendola; Simona Settepanella
  14. Vast Volatility Matrix Estimation using High Frequency Data for Portfolio Selection By Jianqing Fan; Yingying Li; Ke Yu

  1. By: Stuart McDonald (Department of Economics, University of Queensland); Liam Wagner (Department of Economics, University of Queensland)
    Abstract: This paper exploits an analogy between the “trembles” that underlie the functioning of simulated annealing and the player “trembles” that underlie the Nash refinements known as perfect and proper equilibrium. It shows that this relationship can be used to provide a method for computing perfect and proper equilibria of n-player strategic games, and demonstrates, by example, that simulated annealing can also locate a perfect equilibrium in an extensive form game.
    Keywords: Game Theory
    JEL: C72 C73
    Date: 2010–01
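    The "tremble" analogy can be illustrated with a minimal simulated-annealing sketch: perturb a mixed-strategy profile and accept worse profiles with a temperature-controlled probability, driving total regret (the players' incentive to deviate) to zero. This is a generic illustration of simulated annealing on a toy game (matching pennies), not the authors' algorithm; the payoff matrix, cooling schedule, and regret objective are assumptions chosen for the example.

    ```python
    import math
    import random

    # Matching pennies: row player's payoffs (column player receives the negative).
    A = [[1.0, -1.0], [-1.0, 1.0]]

    def regret(p, q):
        """Total incentive to deviate from the mixed profile ((p,1-p),(q,1-q));
        zero exactly at a Nash equilibrium. For this game it equals |2p-1|+|2q-1|."""
        row_h = q * A[0][0] + (1 - q) * A[0][1]
        row_t = q * A[1][0] + (1 - q) * A[1][1]
        row_payoff = p * row_h + (1 - p) * row_t
        col_h = -(p * A[0][0] + (1 - p) * A[1][0])
        col_t = -(p * A[0][1] + (1 - p) * A[1][1])
        col_payoff = -row_payoff
        return (max(row_h, row_t) - row_payoff) + (max(col_h, col_t) - col_payoff)

    def anneal(steps=20000, temp=1.0, cooling=0.9995):
        rng = random.Random(0)
        p, q = rng.random(), rng.random()
        e = regret(p, q)
        for _ in range(steps):
            # Propose a small "tremble" of the current strategy profile.
            p2 = min(1.0, max(0.0, p + rng.gauss(0.0, 0.05)))
            q2 = min(1.0, max(0.0, q + rng.gauss(0.0, 0.05)))
            e2 = regret(p2, q2)
            # Always accept improvements; accept worse profiles with Boltzmann probability.
            if e2 < e or rng.random() < math.exp((e - e2) / temp):
                p, q, e = p2, q2, e2
            temp *= cooling
        return p, q, e

    p, q, e = anneal()  # converges near the unique equilibrium p = q = 0.5
    ```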
  2. By: Lucas Bretschger (CER-ETH - Center of Economic Research at ETH Zurich, Switzerland); Roger Ramer (CER-ETH - Center of Economic Research at ETH Zurich, Switzerland); Florentine Schwark (CER-ETH - Center of Economic Research at ETH Zurich, Switzerland)
    Abstract: The paper develops a new type of CGE model to predict the effects of carbon policies on consumption, welfare, and sectoral development in the long run. Growth is fully endogenous, based on increasing specialization in capital varieties, and specific in each sector of the economy. The benchmark scenario is calculated based on the endogenous gains from specialization, which carry over to policy simulation. Applying the model to the Swiss economy we find that a carbon policy following the Copenhagen Accord entails a moderate but not negligible welfare loss compared to development without any negative effects of climate change. Energy extensive as well as capital and knowledge intensive sectors profit in the form of increased growth rates.
    Keywords: Carbon policy, CGE models, energy and endogenous growth, heterogeneous capital
    JEL: Q54 C63 O41 Q43 Q56
    Date: 2010–04
  3. By: Richard Iovanna; Colin Vance
    Abstract: open space at equilibrium. Although simple, the system is exceedingly flexible and allows for household and parcel heterogeneity. We derive an empirical model directly from the structural equations and contrast this using a simulated landscape with the econometric specification most often found in the literature. We then show how the model can be used to project land-use change into the future and for policy simulation. Finally, we use the model to examine the impact of common land conservation policies in Europe.
    Keywords: Land-use change; urban sprawl; simulation
    JEL: R15 Q24
    Date: 2010–04
  4. By: OGURO Kazumasa; OSHIO Takashi; TAKAHATA Junichiro
    Abstract: In this paper, we present an OLG simulation model with transmission of individual ability and endogenous fertility in order to capture the effects that strengthening income redistribution, expansion of child benefit, and expansion of educational support have on economic disparity and economic growth. Our simulation results show that expansion of educational support will achieve a reduction in inequality and maintenance or an increase in economic growth. In addition, the effects of expanded educational support are greater with a stronger correlation between parent and child ability. On the other hand, our findings show that policies increasing child benefit or expanded minimum income cannot be expected to lead to reduction in inequality or improvement in economic growth.
    Date: 2010–04
  5. By: Eric BROUILLAT (GREThA UMR CNRS 5113); Yannick LUNG (GREThA UMR CNRS 5113)
    Abstract: The paper identifies five stylized facts that characterize the geographic distribution of innovative activities in France (mainly their high concentration in the Ile-de-France region). It proposes an original model of regional growth in a knowledge-based economy, considering the density of R&D activities and the connectivity to the other regions. The model is computed and run in two different configurations: equidistribution and overconcentration. The simulation results lead to the conclusion that the equidistribution configuration is Pareto-efficient compared with overconcentration (higher growth rate of the national economy, lower spatial income inequalities). Policy implications are discussed in conclusion.
    Keywords: France; Geography of innovation; Knowledge spillover; Regional growth; Simulation
    JEL: O33 R11
    Date: 2010
  6. By: Ibrahim Ahamada (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I); Philippe Jolivaldt (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I)
    Abstract: In this article, we compare the performance of the Hodrick-Prescott and Baxter-King filters with a filtering method based on the multi-resolution properties of wavelets. We show that overall the three methods remain comparable if the theoretical cyclical component is defined in the usual waveband of between six and thirty-two quarters. However, the wavelet-based approach provides information about the business cycle, for example its stability over time, that the other two filters do not. Based on Monte Carlo simulation experiments, our method applied to US GDP growth-rate data shows that the estimated business-cycle component is richer in information than that deduced from the level of GDP, and includes additional information about the post-1980 Great Moderation period.
    Keywords: Filters, HP, BK, wavelets, Monte Carlo simulation, break, business cycles.
    Date: 2010–03
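    For readers unfamiliar with the classical benchmark in this comparison, the Hodrick-Prescott trend solves a penalized least-squares problem, (I + λK'K)τ = y with K the second-difference operator and λ = 1600 the conventional value for quarterly data. A minimal NumPy sketch of that classical filter (not the authors' wavelet method; the toy series is made up):

    ```python
    import numpy as np

    def hp_filter(y, lam=1600.0):
        """Hodrick-Prescott filter: the trend tau solves (I + lam * K'K) tau = y,
        where K is the (n-2) x n second-difference operator."""
        y = np.asarray(y, dtype=float)
        n = len(y)
        K = np.zeros((n - 2, n))
        for i in range(n - 2):
            K[i, i], K[i, i + 1], K[i, i + 2] = 1.0, -2.0, 1.0
        trend = np.linalg.solve(np.eye(n) + lam * K.T @ K, y)
        return trend, y - trend

    # Toy quarterly series: linear trend plus a 20-quarter cycle.
    t = np.arange(100)
    y = 0.5 * t + np.sin(2 * np.pi * t / 20)
    trend, cycle = hp_filter(y)  # lam = 1600 is the conventional quarterly value
    ```

    A 20-quarter cycle sits inside the six-to-thirty-two-quarter waveband, so it passes into the cyclical component nearly untouched, while a purely linear series yields a zero cycle by construction.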
  7. By: Hongyan Li (Aarhus School of Business, Aarhus University, Denmark); Joern Meissner (Department of Management Science, Lancaster University Management School)
    Abstract: Lot-sizing and capacity planning are important supply chain decisions, and competition and cooperation affect the performance of these decisions. In this paper, we look into the dynamic lot-sizing and resource competition problem of an industry consisting of multiple firms. A capacity competition model combining the complexity of time-varying demand with cost functions and economies of scale arising from dynamic lot-sizing costs is developed. Each firm can replenish inventory at the beginning of each period in a finite planning horizon. Fixed as well as variable production costs are incurred for each production setup, along with inventory carrying costs. The individual production lots of each firm are limited by a constant capacity restriction, which is purchased up front for the planning horizon. The capacity can be purchased from a spot market, and the capacity acquisition cost fluctuates with the total capacity demand of all the competing firms. We solve the competition model and establish the existence of a capacity equilibrium over the firms and the associated optimal dynamic lot-sizing plan for each firm under mild conditions.
    Keywords: computational economics, industrial competition, operations research, game theory, capacity optimization, supply chain management, lot sizing, heuristics, equilibrium
    JEL: C61
    Date: 2010–04
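    The dynamic lot-sizing cost structure underlying this model is the classical uncapacitated Wagner-Whitin problem: choose in which periods to set up production so that total setup plus holding cost is minimized. A minimal dynamic-programming sketch (the demand and cost figures are made-up example data, not from the paper):

    ```python
    def wagner_whitin(demand, setup_cost, holding_cost):
        """Uncapacitated dynamic lot-sizing DP: a production run in period i
        covers demand for periods i..j-1; best[t] is the minimum cost of
        satisfying demand[0:t]."""
        n = len(demand)
        best = [0.0] + [float("inf")] * n
        for t in range(1, n + 1):
            for i in range(t):  # last production run starts in period i
                holding = sum(holding_cost * (j - i) * demand[j] for j in range(i, t))
                best[t] = min(best[t], best[i] + setup_cost + holding)
        return best[n]

    # Four periods, setup cost 3, holding cost 1 per unit per period.
    cost = wagner_whitin([1, 3, 2, 2], setup_cost=3.0, holding_cost=1.0)  # → 11.0
    ```

    Here the optimum batches periods together when holding is cheaper than an extra setup; the paper layers capacity constraints and inter-firm competition on top of this cost structure.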
  8. By: Benjamin Jourdain; Bernard Lapeyre; Piergiacomo Sabino
    Abstract: This paper investigates the use of multiple directions of stratification as a variance reduction technique for Monte Carlo simulations of path-dependent options driven by Gaussian vectors. The precision of the method depends on the choice of the directions of stratification and the allocation rule within each stratum. Several choices have been proposed but, even if they provide variance reduction, their implementation is computationally intensive and not applicable to realistic payoffs, in particular not to Asian options with a barrier. Moreover, all these previously published methods employ orthogonal directions for multiple stratification. In this work we investigate the use of algorithms producing convenient directions, generally non-orthogonal, combining a lower computational cost with a comparable variance reduction. In addition, we study the accuracy of optimal allocation in terms of variance reduction compared to Latin Hypercube Sampling. We consider the directions obtained by the Linear Transformation and the Principal Component Analysis. We introduce a new procedure based on the Linear Approximation of the explained variance of the payoff using the law of total variance. In addition, we exhibit a novel algorithm that correctly generates normal vectors stratified along non-orthogonal directions. Finally, we illustrate the efficiency of these algorithms in the computation of the price of different path-dependent options with and without barriers in the Black-Scholes and in the Cox-Ingersoll-Ross markets.
    Date: 2010–04
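    The basic device the paper builds on, stratification of a Gaussian factor, is easy to sketch in one dimension: split the standard normal into equal-probability strata via the inverse CDF and sample uniformly within each. This toy (single direction, proportional allocation, made-up payoff) illustrates only the baseline technique, not the paper's multi-directional, non-orthogonal method.

    ```python
    import random
    from statistics import NormalDist

    def stratified_mc(f, n_strata, per_stratum, rng):
        """Stratify a single N(0,1) factor into equal-probability strata and
        sample uniformly within each (proportional allocation)."""
        nd = NormalDist()
        total = 0.0
        for i in range(n_strata):
            for _ in range(per_stratum):
                u = (i + rng.random()) / n_strata  # uniform draw inside stratum i
                total += f(nd.inv_cdf(u))
        return total / (n_strata * per_stratum)

    def plain_mc(f, n, rng):
        """Unstratified benchmark using the same inverse-CDF sampling."""
        nd = NormalDist()
        return sum(f(nd.inv_cdf(rng.random())) for _ in range(n)) / n

    # A call-like payoff on a single Gaussian factor; E[max(Z,0)] = 1/sqrt(2*pi) ~ 0.3989.
    payoff = lambda z: max(z, 0.0)
    est = stratified_mc(payoff, n_strata=100, per_stratum=10, rng=random.Random(1))
    ```

    With 100 strata the estimator's variance comes only from within-stratum variation, which is why stratification can beat plain Monte Carlo at the same sample count.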
  9. By: Stefano Balbi (Department of Economics, University Of Venice Cà Foscari); Pascal Perez (RMAP, Australian National University, Canberra); Carlo Giupponi (Department of Economics, University Of Venice Cà Foscari)
    Abstract: A vast body of literature suggests that the European Alpine region may be one of the most sensitive to climate change impacts. Adaptation to climate change of Alpine socio-ecosystems is increasingly becoming an issue of interest for the scientific community, while the people of the Alps are often unaware of or simply ignore the problem. ClimAlpTour is a European research project of the Alpine Space Programme, bringing together institutions and scholars from all countries of the Alpine arc to deal with the expected decrease in snow and ice cover, which may lead to a rethinking of tourism development beyond the traditional vision of winter sports. The research reported herein analyses the municipality of Auronzo di Cadore (22,000 ha) in the Dolomites, under the famous peaks of the “Tre Cime di Lavaredo”. The local economy depends on tourism, which is currently focused on the summer season, while the winter season is weak. As a whole, the destination receives approximately 65,000 guests per year with a resident population of 3,600 inhabitants. The Community Council has recently been considering options for stimulating further development of winter tourism. This paper refers to a prototype agent-based model, called AuronzoWinSim, for the assessment of alternative scenarios of future local development strategies, taking into account complex spatial and social dynamics and interactions. Different typologies of winter tourists compose the set of human agents. Climate change scenarios are used to produce temperature and snow cover projections. The model is mainly informed by secondary sources, including demographic and economic time series, and biophysical data, which feed its spatial dimension. Primary data from field surveys are used to calibrate the main parameters. AuronzoWinSim is planned for use in a participatory context with groups of local stakeholders.
    Keywords: Alpine Winter Tourism, Spatial Agent-Based Model, Climate Change Adaptation
    JEL: Q
    Date: 2010
  10. By: Stig Larsson; Carl Lindberg; Marcus Warfheimer
    Abstract: A pair trade is a portfolio consisting of a long position in one asset and a short position in another, and it is a widely applied investment strategy in the financial industry. Recently, Ekström, Lindberg and Tysk studied the problem of optimally closing a pair trading strategy when the difference of the two assets is modelled by an Ornstein-Uhlenbeck process. In this paper we study the same problem, but the model is generalized to also include jumps. More precisely we assume that the above difference is an Ornstein-Uhlenbeck type process, driven by a Lévy process of finite activity. We prove a verification theorem and analyze a numerical method for the associated free boundary problem. We prove rigorous error estimates, which are used to draw some conclusions from numerical simulations.
    Date: 2010–04
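    The driving dynamics in the generalized model, an Ornstein-Uhlenbeck type process with finite-activity jumps, can be simulated with a simple Euler scheme in which the jump part is a compound Poisson process. A sketch of those dynamics only (all parameter values are made up; the paper's contribution is the optimal-stopping analysis, not this simulation):

    ```python
    import math
    import random

    def simulate_ou_jump(x0, theta, mu, sigma, jump_rate, jump_scale, T, n, rng):
        """Euler scheme for dX = theta*(mu - X) dt + sigma dW + dJ, where J is a
        compound Poisson process (finite activity) with N(0, jump_scale^2) jumps.
        At most one jump per step is allowed, a standard small-dt approximation."""
        dt = T / n
        x = x0
        path = [x]
        for _ in range(n):
            jump = rng.gauss(0.0, jump_scale) if rng.random() < jump_rate * dt else 0.0
            x += theta * (mu - x) * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0) + jump
            path.append(x)
        return path

    # Spread starting at 1.0, reverting to 0, with about two jumps expected over [0, T].
    path = simulate_ou_jump(1.0, theta=5.0, mu=0.0, sigma=0.2,
                            jump_rate=2.0, jump_scale=0.5, T=1.0, n=1000,
                            rng=random.Random(7))
    ```

    Mean reversion pulls the spread back toward mu after each jump, which is what makes "wait or close now" a genuine optimal-stopping trade-off.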
  11. By: Leonardo Martinez; Juan Carlos Hatchondo; Horacio Sapriza
    Abstract: We study the sovereign default model that has been used to account for the cyclical behavior of interest rates in emerging market economies. This model is often solved using the discrete state space technique with evenly spaced grid points. We show that this method necessitates a large number of grid points to avoid generating spurious interest rate movements. This makes the discrete state technique significantly more inefficient than using Chebyshev polynomials or cubic spline interpolation to approximate the value functions. We show that the inefficiency of the discrete state space technique is more severe for parameterizations that feature a high sensitivity of the bond price to the borrowing level for the borrowing levels that are observed more frequently in the simulations. In addition, we find that the efficiency of the discrete state space technique can be greatly improved by (i) finding the equilibrium as the limit of the equilibrium of the finite-horizon version of the model, instead of iterating separately on the value and bond price functions and (ii) concentrating grid points in asset levels at which the bond price is more sensitive to the borrowing level and in levels that are observed more often in the model simulations. Our analysis questions the robustness of results in the sovereign default literature and is also relevant for the study of other credit markets.
    Date: 2010–04–16
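    The contrast the abstract draws, evenly spaced grid points versus Chebyshev approximation of a value function, can be seen in a toy example: interpolate a smooth function that is steep in one region (as bond prices are near the default threshold) and compare against a lookup table with the same number of points. The function and sizes here are illustrative assumptions, not the paper's model.

    ```python
    import numpy as np

    def chebyshev_fit(f, a, b, degree):
        """Interpolate f at Chebyshev nodes mapped to [a, b]."""
        nodes = np.cos(np.pi * (np.arange(degree + 1) + 0.5) / (degree + 1))
        x = 0.5 * (b - a) * nodes + 0.5 * (a + b)
        return np.polynomial.chebyshev.Chebyshev.fit(x, f(x), degree, domain=[a, b])

    # A smooth stand-in for a value function that is steep near s = 0.
    v = lambda s: np.log(1.0 + np.exp(-10.0 * s))
    approx = chebyshev_fit(v, -1.0, 1.0, 30)

    # Evenly spaced nearest-neighbor lookup table with the same number of points (31).
    grid = np.linspace(-1.0, 1.0, 31)
    table = v(grid)

    pts = np.linspace(-1.0, 1.0, 201)
    cheb_err = np.max(np.abs(approx(pts) - v(pts)))
    grid_err = np.max(np.abs(table[np.abs(grid[:, None] - pts[None, :]).argmin(axis=0)] - v(pts)))
    ```

    With the same 31 points, the Chebyshev interpolant is orders of magnitude more accurate than the coarse table, which is the spirit of the paper's finding that solution methods matter.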
  12. By: Gennaro Amendola; Simona Settepanella
    Abstract: In recent years, Marengo and the second author have developed a geometric model of social choice when this takes place among bundles of interdependent elements, showing that, by bundling and unbundling the same set of constituent elements, an authority has the power to determine the social outcome. In this paper we tie the model above to tournament theory, solving some of the mathematical problems arising in their work and opening new questions that are interesting not only from a mathematical and a social choice point of view, but also from an economic and a genetic one. In particular, we introduce the notion of u-local optima and study it both from a theoretical and a numerical/probabilistic point of view; we also describe an algorithm that computes the universal basin of attraction of a social outcome in O(M^3 log M) time (where M is the number of social outcomes).
    Keywords: Social rule, modularity, object, optimum, hyperplane arrangement, tournament, algorithm
    JEL: D71 D72
    Date: 2010–04–22
  13.
    Abstract: We develop a new method for deriving minimal state variable (MSV) equilibria of a general class of Markov switching rational expectations models and a new algorithm for computing these equilibria. We compare our approach to previously known algorithms, and we demonstrate that ours is both efficient and more reliable than previous methods in the sense that it is able to find MSV equilibria that previously known algorithms cannot. Further, our algorithm can find all possible MSV equilibria in models where there are multiple MSV equilibria. This feature is essential if one is interested in using a likelihood-based approach to estimation.
    Date: 2010–04
  14. By: Jianqing Fan; Yingying Li; Ke Yu
    Abstract: Portfolio allocation with a gross-exposure constraint is an effective method to increase the efficiency and stability of selected portfolios among a vast pool of assets, as demonstrated in Fan et al (2008). The required high-dimensional volatility matrix can be estimated using high frequency financial data. This enables us to better adapt to the local volatilities and local correlations among a vast number of assets and to increase significantly the sample size for estimating the volatility matrix. This paper studies volatility matrix estimation using high-dimensional high-frequency data from the perspective of portfolio selection. Specifically, we propose the use of the "pairwise-refresh time" and "all-refresh time" methods of Barndorff-Nielsen et al (2008) for the estimation of vast covariance matrices and compare their merits in portfolio selection. We also establish concentration inequalities for the estimates, which guarantee desirable properties of the estimated volatility matrix in vast asset allocation with gross-exposure constraints. Extensive numerical studies are conducted via carefully designed simulations. Compared with methods based on low-frequency daily data, our methods can capture the most recent trend of time-varying volatility and correlation, and hence provide more accurate guidance for portfolio allocation in the next time period. The advantage of using high-frequency data is significant in our simulation and empirical studies, which consist of 50 simulated assets and 30 constituent stocks of the Dow Jones Industrial Average index.
    Date: 2010–04
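    The pairwise-refresh-time idea the paper adopts synchronizes two asynchronously traded assets: each refresh time is the first instant at which both assets have traded at least once since the previous refresh time, and realized covariances are then computed on the synchronized prices. A minimal sketch of the synchronization step (the tick times are made-up example data):

    ```python
    def refresh_times(times_a, times_b):
        """Pairwise refresh-time sampling: each refresh time is the first instant
        at which both assets have traded at least once since the previous one."""
        refreshed = []
        i = j = 0
        while i < len(times_a) and j < len(times_b):
            t = max(times_a[i], times_b[j])
            refreshed.append(t)
            while i < len(times_a) and times_a[i] <= t:  # discard stale ticks
                i += 1
            while j < len(times_b) and times_b[j] <= t:
                j += 1
        return refreshed

    # Made-up asynchronous trade times (in seconds) for two assets.
    rt = refresh_times([1, 4, 6, 9, 12], [2, 3, 7, 8, 13])  # → [2, 4, 7, 9, 13]
    ```

    Pairwise refreshing keeps more data per asset pair than refreshing all assets jointly, which is the trade-off the paper compares.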

This nep-cmp issue is ©2010 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.