nep-cmp New Economics Papers
on Computational Economics
Issue of 2010‒02‒05
seven papers chosen by
Stan Miles
Thompson Rivers University

  1. Do we believe in climate change? A multi-agent climate-economic model By Sylvie Geisendorf
  2. Credit money and macroeconomic instability in the agent-based model and simulator Eurace By Cincotti, Silvano; Raberto, Marco; Teglio, Andrea
  3. Learning and filtering via simulation: smoothly jittered particle filters By Thomas Flury; Neil Shephard
  4. High-Speed Inter-City Transport System in Japan Past, Present and the Future By Katsuhiro Yamaguchi
  5. A Macrocounterfactual Analysis of Group Differences: An application to an analysis of the gender wage gap in Japan By YAMAGUCHI Kazuo
  6. Segmentation algorithm for non-stationary compound Poisson processes By Bence Toth; Fabrizio Lillo; J. Doyne Farmer
  7. Catching Growth Determinants with the Adaptive Lasso By Ulrike Schneider; Martin Wagner

  1. By: Sylvie Geisendorf (Department of Economics, University of Kassel)
    Abstract: Climate-economic models are mainly intertemporal cost-benefit analyses that try to balance the damages from climate change against mitigation costs and derive an optimal climate policy. In recent years, the great importance of the uncertainty about climate behaviour that affects such models has been emphasized, and it has been argued that one-shot intertemporal optimization is an unrealistic venture. However, only a few authors have tried to explicitly model the impact of uncertainty on agents' beliefs and the resulting behaviour. Janssen's (1996) "battle of perspectives" multi-agent climate-economy model is a notable exception. Building on a macro-economic climate-economy model, he implemented adaptive agents holding different perspectives on the dynamics of climate change and the necessary preventive action. The present paper aims to make a case for this model, which has gone largely unnoticed by climate economics. It argues for more multi-agent-based research in climate economics to analyse the importance of human beliefs. Finally, the paper updates the "battle of perspectives" with current data to investigate the significance of uncertain data for economic climate change models.
    Keywords: climate change, climate-economy models, multi-agent modelling, mitigation, perceptions, bounded rationality, learning
    Date: 2009–12
  2. By: Cincotti, Silvano; Raberto, Marco; Teglio, Andrea
    Abstract: The paper presents a study of the relationship between credit money and economic instability. The issue is of primary importance because, as is generally accepted, lower variability of output and inflation has numerous economic benefits. We address this problem by means of an agent-based model and simulator, called Eurace, which is characterized by a complete set of interrelated markets and different types of interacting agents, modelled according to a rigorous balance-sheet approach. The dynamics of credit money is endogenous and depends on the supply of credit from the banking system, which is constrained by its equity base, and the demand for credit from firms to finance their production activity. Alternative dynamic paths for credit money have been produced by setting different firms' dividend policies. Results show the emergence of endogenous business cycles, which are mainly due to the interplay between real economic activity and its financing through the credit market. In particular, the amplitude of the business cycles rises sharply when the fraction of earnings paid out by firms as dividends is higher, that is, when firms are more constrained to borrow credit money to fund their activity. This evidence can be explained by the fact that the level of firms' leverage, defined as the debt-equity ratio, can be considered a proxy for the likelihood of bankruptcy, an event which causes mass layoffs and a decrease in supply.
    Keywords: Macroeconomic policy design, agent-based computational economics, credit money, economic instability
    JEL: E42 E2 E32
    Date: 2010
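The causal chain in this abstract — a higher dividend payout leaves less retained earnings, forcing firms to borrow more and raising their debt-equity ratio — can be illustrated with a toy balance-sheet loop. This is a stylised sketch with hypothetical parameters, not the Eurace model:

```python
def simulate_leverage(dividend_fraction, periods=50):
    """Toy firm: pays out a fraction of earnings and borrows any funding
    shortfall from a bank. Illustrates why a higher payout ratio raises
    the debt-equity ratio; all numbers are hypothetical."""
    equity, debt = 10.0, 0.0
    for _ in range(periods):
        earnings = 1.0                       # fixed operating profit
        retained = earnings - dividend_fraction * earnings
        investment = 1.0                     # funding need each period
        shortfall = max(investment - retained, 0.0)
        debt += shortfall                    # covered by bank credit
        equity += retained
    return debt / equity                     # leverage (debt-equity ratio)

low = simulate_leverage(0.2)                 # low payout -> low leverage
high = simulate_leverage(0.8)                # high payout -> high leverage
print(low < high)
```

In the paper's terms, the higher-leverage path is the one more exposed to bankruptcy, mass layoffs and supply contraction.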
  3. By: Thomas Flury; Neil Shephard
    Abstract: A key ingredient of many particle filters is the sampling importance resampling (SIR) algorithm, which transforms a sample of weighted draws from a prior distribution into equally weighted draws from a posterior distribution. We give a novel analysis of the SIR algorithm and of its jittered generalisation, showing that existing implementations of jittering lead to markedly inferior behaviour relative to the base SIR algorithm. We show how jittering can be designed to improve the performance of the SIR algorithm, and we illustrate its performance in practice in the context of three filtering problems.
    Keywords: Importance sampling, Particle filter, Random numbers, Sampling importance resampling, State space models
    JEL: C14 C32
    Date: 2009
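The basic SIR step the abstract refers to — turning weighted prior draws into equally weighted posterior draws — can be sketched with multinomial resampling. This is a generic textbook implementation, not the authors' jittered variant, and the toy prior/likelihood pair is hypothetical:

```python
import numpy as np

def sir_resample(particles, log_weights, rng):
    """SIR: resample particles with probability proportional to their weights,
    yielding an equally weighted sample from the posterior."""
    # Normalise the weights in a numerically stable way.
    w = np.exp(log_weights - log_weights.max())
    w /= w.sum()
    # Multinomial resampling: draw indices proportional to the weights.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

rng = np.random.default_rng(0)
prior = rng.normal(0.0, 2.0, size=5000)      # draws from a wide N(0, 4) prior
loglik = -0.5 * (prior - 1.0) ** 2           # Gaussian likelihood centred at 1
posterior = sir_resample(prior, loglik, rng)
print(posterior.mean())                      # close to the exact posterior mean 0.8
```

Resampling discards low-weight particles and duplicates high-weight ones; the jittering the paper studies perturbs the duplicates to restore sample diversity.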
  4. By: Katsuhiro Yamaguchi
    Abstract: With the advent of the Shinkansen in 1964, a unique inter-city transport network in which high-speed rail and air transport developed simultaneously emerged in Japan, and modal choice between them based on price and speed has been manifest. Looking ahead, the next generation of high-speed transport, the Maglev, is on the horizon. To capture the full impacts of Maglev technology, a simulation analysis with a dynamic spatial nested logit model was conducted. From this, we identified a significant opportunity for a Maglev super-express between Tokyo, Nagoya and Osaka, but with net benefits exceeding net costs only if annual economic growth of approximately 2%–3% is achieved in Japan over the next 65 years. If such economic conditions were realized, the total air transport market would also continue to grow despite strong competition from the Shinkansen/Maglev system. Another point of interest is the Maglev's impact on global warming. CO2 emissions from the Maglev are one third those of air transport. Introducing a Maglev super-express into inter-city transport, however, also attracts passengers from the Shinkansen, which has a CO2 emission intensity five times lower than the Maglev's. Indeed, our simulation analysis shows that total CO2 emissions from high-speed inter-city transport increase when the Maglev super-express is introduced. The increase in total CO2 emissions from electricity users, including the Maglev super-express, could be mitigated by the energy conversion sector's efforts to reduce the CO2 content of the electric power supply, for instance by increasing the use of nuclear energy. Further research assessing the possible impact of capacity constraints in the existing network, not considered in this paper, would facilitate a deeper understanding of the future high-speed inter-city transport system.
    Date: 2009–12
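The emissions result has a simple arithmetic core: whether the Maglev raises or lowers total CO2 depends on how many of its passengers switch from air versus from the Shinkansen. A back-of-the-envelope check using only the intensity ratios stated in the abstract (the modal split is hypothetical, not the paper's estimate):

```python
# Relative CO2 intensities per passenger, from the abstract's ratios:
shinkansen = 1.0
maglev = 5.0 * shinkansen   # Shinkansen intensity is five times lower than Maglev's
air = 3.0 * maglev          # Maglev emits one third of air transport

# Hypothetical split of Maglev passengers by previous mode (illustrative only).
from_air, from_shinkansen = 0.2, 0.8

# Change in total emissions per Maglev passenger: switching from air saves
# CO2, switching from the Shinkansen adds CO2.
delta = from_air * (maglev - air) + from_shinkansen * (maglev - shinkansen)
print(delta > 0)   # positive: total emissions rise, as the paper finds
```

With these ratios, emissions fall only if more than roughly a quarter of Maglev riders are diverted from air rather than from the Shinkansen.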
  5. By: YAMAGUCHI Kazuo
    Abstract: This paper introduces a new method for the statistical simulation of macrosocietal counterfactual situations. In particular, the method is concerned with decomposing group differences in the mean of a variable into various within-group and between-group components with respect to the group categories of intermediary variables. In modeling counterfactual situations, I juxtapose two different mechanisms: realizing the counterfactual state that deviates least from the existing state, and holding other, irrelevant-to-counterfactual relations among variables unchanged. I demonstrate that, despite the large difference between the mechanisms, the two counterfactual models generally yield highly consistent outcomes. As an illustrative example, the paper analyzes gender inequality in hourly wages in Japan and thereby demonstrates the usefulness of the new method for deriving policy implications.
    Date: 2010–01
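For context, the classical way to split a group difference in mean wages into an "explained" (endowment) part and an "unexplained" (coefficient) part is the Oaxaca-Blinder decomposition; the paper's macrocounterfactual method generalises this kind of exercise. A minimal sketch on synthetic data (all numbers hypothetical, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
# Hypothetical data: years of education and log hourly wage for two groups.
edu_a = rng.normal(14, 2, n)
edu_b = rng.normal(12, 2, n)
wage_a = 1.0 + 0.10 * edu_a + rng.normal(0, 0.3, n)
wage_b = 0.8 + 0.08 * edu_b + rng.normal(0, 0.3, n)

# Fit group-specific linear wage equations by OLS.
Xa = np.column_stack([np.ones(n), edu_a])
Xb = np.column_stack([np.ones(n), edu_b])
ba, *_ = np.linalg.lstsq(Xa, wage_a, rcond=None)
bb, *_ = np.linalg.lstsq(Xb, wage_b, rcond=None)

gap = wage_a.mean() - wage_b.mean()
explained = (Xa.mean(axis=0) - Xb.mean(axis=0)) @ ba   # endowment differences
unexplained = Xb.mean(axis=0) @ (ba - bb)              # coefficient differences
print(gap, explained + unexplained)                    # the two parts sum to the gap
```

The decomposition is an exact identity for OLS fits with an intercept, which is why the two components reproduce the raw gap.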
  6. By: Bence Toth; Fabrizio Lillo; J. Doyne Farmer
    Abstract: We introduce an algorithm for the segmentation of a class of regime-switching processes. The segmentation algorithm is a non-parametric statistical method able to identify the regimes (patches) of the time series. The process is composed of consecutive patches of variable length, each patch being described by a stationary compound Poisson process, i.e. a Poisson process where each count is associated with a fluctuating signal. The parameters of the process differ across patches, and therefore the time series is non-stationary. Our method is a generalization of the algorithm introduced by Bernaola-Galvan et al., Phys. Rev. Lett., 87, 168105 (2001). We show that the new algorithm outperforms the original one for regime-switching compound Poisson processes. As an application, we use the algorithm to segment the time series of the inventory of market members of the London Stock Exchange, and we observe that our method finds almost three times more patches than the original one.
    Date: 2010–01
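A regime-switching compound Poisson series of the kind described — consecutive stationary patches with different rates and signal means — is easy to simulate, and even a crude standardised mean-difference statistic locates a single change point. The paper's likelihood-based recursive algorithm is more general; this sketch (with hypothetical parameters) only illustrates the process structure:

```python
import numpy as np

rng = np.random.default_rng(1)

def compound_poisson_patch(n_steps, rate, signal_mean, rng):
    """One stationary patch: Poisson counts, each count carrying a random signal."""
    counts = rng.poisson(rate, size=n_steps)
    # Each time step sums `k` fluctuating signals around signal_mean.
    return np.array([rng.normal(signal_mean, 1.0, size=k).sum() for k in counts])

# Two patches with different parameters make the series non-stationary.
series = np.concatenate([
    compound_poisson_patch(300, rate=2.0, signal_mean=0.5, rng=rng),
    compound_poisson_patch(300, rate=5.0, signal_mean=-0.5, rng=rng),
])

# Crude single change-point estimate: maximise the standardised difference of
# the means to the left and right of each candidate split (not the paper's test).
n = len(series)
cums = np.cumsum(series)
t = np.arange(1, n)
left_mean = cums[:-1] / t
right_mean = (cums[-1] - cums[:-1]) / (n - t)
stat = np.abs(left_mean - right_mean) * np.sqrt(t * (n - t) / n)
split = int(np.argmax(stat)) + 1
print(split)   # close to the true change point at 300
```

The segmentation algorithms in the paper apply such a split test recursively, with a significance criterion deciding when to stop subdividing.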
  7. By: Ulrike Schneider; Martin Wagner
    Abstract: This paper uses the adaptive Lasso estimator to determine the variables important for economic growth. The adaptive Lasso is a computationally very simple procedure that performs model selection and consistent parameter estimation at the same time. The methodology is applied to three data sets: the data used in Sala-i-Martin et al. (2004), the data used in Fernandez et al. (2001), and a data set for the regions of the European Union. The results for the former two data sets are similar in several respects to those found in the published papers, yet are obtained at a negligible fraction of the computational cost. Furthermore, the results for the European regional data highlight the importance of human capital for economic growth.
    Keywords: adaptive Lasso, economic convergence, growth regressions, model selection
    JEL: C31 C52 O11 O18 O47
    Date: 2009–06
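The adaptive Lasso is a two-step procedure: fit a pilot estimate (e.g. OLS), then solve a Lasso in which each coefficient is penalised in inverse proportion to its pilot magnitude — equivalently, a plain Lasso on regressor columns rescaled by the pilot coefficients. A minimal sketch on synthetic growth-regression-style data (the data and tuning constants are hypothetical, not the paper's):

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Plain Lasso via cyclic coordinate descent for the objective
    (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ beta + X[:, j] * beta[j]    # partial residual
            beta[j] = soft_threshold(X[:, j] @ r, n * lam) / col_sq[j]
    return beta

def adaptive_lasso(X, y, lam, gamma=1.0):
    """Two-step adaptive Lasso: OLS pilot estimate, then a reweighted Lasso
    implemented by rescaling each column by the pilot coefficient's magnitude."""
    pilot, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = np.abs(pilot) ** gamma
    beta_scaled = lasso_cd(X * w, y, lam)
    return beta_scaled * w                          # undo the rescaling

rng = np.random.default_rng(0)
n, p = 200, 8
X = rng.normal(size=(n, p))
true_beta = np.array([3.0, 0.0, 0.0, 1.5, 0.0, 0.0, 0.0, 2.0])
y = X @ true_beta + rng.normal(size=n)
beta = adaptive_lasso(X, y, lam=0.1)
selected = np.flatnonzero(np.abs(beta) > 1e-6)
print(selected)   # ideally recovers the truly non-zero coefficients 0, 3, 7
```

The data-driven penalty weights are what give the adaptive Lasso its oracle property: large pilot coefficients are penalised lightly, small ones heavily, so irrelevant regressors are driven exactly to zero.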

This nep-cmp issue is ©2010 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at the NEP website. For comments, please write to the director of NEP, Marco Novarese at <>. Put "NEP" in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.