nep-cmp New Economics Papers
on Computational Economics
Issue of 2016‒04‒09
nine papers chosen by



  1. The Carbon Footprint of European Households and Income Distribution By Mark Sommer; Kurt Kratena
  2. Monetary Policy and Large Crises in a Financial Accelerator Agent-Based Model By Giri, Federico; Riccetti, Luca; Russo, Alberto; Gallegati, Mauro
  3. A new genetic algorithm for the tool indexing problem By Ghosh, Diptesh
  4. Tractable Likelihood-Based Estimation of Non-Linear DSGE Models Using Higher-Order Approximations By Robert Kollmann
  5. When more flexibility yields more fragility: the microfoundations of Keynesian aggregate unemployment By G. Dosi; M.C. Pereira; A. Roventini; M.E. Virgillito
  6. Reducing the role of random numbers in matching algorithms for school admission By Hulsbergen, Wouter
  7. Betting and Belief: Prediction Markets and Attribution of Climate Change By John J. Nay; Martin Van der Linden; Jonathan M. Gilligan
  8. Tractable Likelihood-Based Estimation of Non-Linear DSGE Models Using Higher-Order Approximations By Kollmann, Robert
  9. Neural Nets for Indirect Inference By Michael Creel

  1. By: Mark Sommer; Kurt Kratena
    Abstract: This paper calculates the CO2e (CO2 equivalents) footprint of private consumption in the EU27 by five groups of household income, using a fully fledged macroeconomic input-output model covering 59 industries and five groups of household income for the EU27. Due to macroeconomic feedback mechanisms, this methodology not only takes into account intermediate demand induced by the demand of a household group, but also: (i) private consumption induced in the other household groups, (ii) impacts on other endogenous final demand components, and (iii) negative feedback effects due to output price effects of household demand. Direct household emissions from household energy consumption are taken into account in a non-linear specification. Emissions embodied in imports are calculated using the results of a static MRIO (Multi-Regional Input-Output) model. The footprint is calculated separately for the consumption vector of each of the five income groups. The simulation results yield an income elasticity of direct and indirect emissions at each income level that takes all macroeconomic feedbacks of consumption into account and differs from the ceteris paribus emission elasticity in the literature. The results further reveal that a small structural ‘Kuznets effect’ exists.
    Keywords: Carbon footprint, CGE modeling, income distribution
    JEL: C67 Q52 Q54
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:feu:wfewop:y:2016:m:3:d:0:i:113&r=cmp
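    As a concrete illustration of the footprint arithmetic underlying models of this kind, the following Python sketch computes a Leontief-based CO2e footprint for two hypothetical household groups. The two-industry setup and all numbers are illustrative assumptions; the paper's 59-industry model with macroeconomic feedbacks is not reproduced here.

      import numpy as np

      # Toy two-industry economy (the paper's model covers 59 industries
      # for the EU27 and adds macroeconomic feedback mechanisms).
      A = np.array([[0.10, 0.20],        # inter-industry coefficient matrix
                    [0.30, 0.10]])
      e = np.array([0.5, 1.2])           # CO2e intensity of output (kg/EUR)
      L = np.linalg.inv(np.eye(2) - A)   # Leontief inverse (I - A)^-1

      # Consumption vectors of two hypothetical income groups
      y = {"low income":  np.array([100.0, 40.0]),
           "high income": np.array([180.0, 150.0])}

      for group, demand in y.items():
          output = L @ demand            # gross output induced by the group
          print(group, round(float(e @ output), 1), "kg CO2e")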
  2. By: Giri, Federico; Riccetti, Luca; Russo, Alberto; Gallegati, Mauro
    Abstract: An accommodating monetary policy followed by a sudden increase of the short term interest rate often leads to a bubble burst and to an economic slowdown. Two examples are the Great Depression of 1929 and the Great Recession of 2008. Through the implementation of an Agent Based Model with a financial accelerator mechanism we are able to study the relationship between monetary policy and large scale crisis events. The main results can be summarized as follows: a) sudden and sharp increases of the policy rate can generate recessions; b) after a crisis, returning too soon and too quickly to a normal monetary policy regime can generate a "double dip" recession; and c) keeping the short term interest rate anchored to the zero lower bound in the short run can successfully avoid a further slowdown.
    Keywords: Monetary Policy; Large Crises; Agent Based Model; Financial Accelerator; Zero Lower Bound.
    JEL: C63 E32 E44 E58
    Date: 2016–03–30
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:70371&r=cmp
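    The policy experiments in the paper above turn on how the short-term rate is set. As a minimal illustration, here is a Taylor-type rule truncated at the zero lower bound; the functional form and parameter values are assumptions of this sketch, not the authors' specification.

      def policy_rate(inflation, output_gap, r_star=0.02, pi_target=0.02,
                      phi_pi=1.5, phi_y=0.5, lower_bound=0.0):
          """Taylor-type rule truncated at the zero lower bound
          (illustrative parameters, not the paper's calibration)."""
          r = (r_star + inflation
               + phi_pi * (inflation - pi_target) + phi_y * output_gap)
          return max(r, lower_bound)     # the ZLB binds below zero

      print(policy_rate(0.04, 0.02))     # sharp tightening after a boom
      print(policy_rate(-0.01, -0.05))   # slump: the rule hits the ZLB -> 0.0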
  3. By: Ghosh, Diptesh
    Abstract: The tool indexing problem is one of allocating tools to slots in a tool magazine so as to minimize the tool change time in automated machining. This problem has been widely studied in the literature. A genetic algorithm has been suggested in the literature to solve this problem, but its implementation is non-standard. In this paper we describe a permutation based genetic algorithm for the tool indexing problem and compare its performance with an existing genetic algorithm.
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:14437&r=cmp
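    To make the permutation encoding concrete, below is a minimal permutation-based GA for a toy version of the tool indexing problem. The order crossover and swap mutation operators are common choices for permutation problems and are assumptions of this sketch, not necessarily the paper's operators.

      import random

      def cost(perm, jobs):
          """Total tool-change time: circular slot distance between the
          slots of consecutive tools in the machining sequence (toy)."""
          n = len(perm)
          slot = {tool: s for s, tool in enumerate(perm)}
          d = 0
          for a, b in zip(jobs, jobs[1:]):
              step = abs(slot[a] - slot[b])
              d += min(step, n - step)   # magazine rotates both ways
          return d

      def order_crossover(p1, p2):
          """OX: keep a slice of p1, fill the rest in p2's relative order."""
          i, j = sorted(random.sample(range(len(p1)), 2))
          middle = p1[i:j]
          rest = [t for t in p2 if t not in middle]
          return rest[:i] + middle + rest[i:]

      def swap_mutation(p, rate=0.2):
          p = p[:]
          if random.random() < rate:
              i, j = random.sample(range(len(p)), 2)
              p[i], p[j] = p[j], p[i]
          return p

      def ga(jobs, pop_size=50, generations=200):
          tools = sorted(set(jobs))
          pop = [random.sample(tools, len(tools)) for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=lambda p: cost(p, jobs))
              elite = pop[:pop_size // 2]          # truncation selection
              pop = elite + [swap_mutation(order_crossover(
                  *random.sample(elite, 2))) for _ in elite]
          return min(pop, key=lambda p: cost(p, jobs))

      jobs = [0, 3, 1, 4, 2, 0, 5, 3, 1, 5, 2, 4]  # tool usage sequence
      best = ga(jobs)
      print(best, cost(best, jobs))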
  4. By: Robert Kollmann
    Abstract: This paper discusses a tractable approach for computing the likelihood function of non-linear Dynamic Stochastic General Equilibrium (DSGE) models that are solved using second- and third-order accurate approximations. In contrast to particle filters, the method requires no stochastic simulations; it is hence much faster, which makes it suitable for the estimation of medium-scale models. The method assumes that the number of exogenous innovations equals the number of observables. Given an assumed vector of initial states, the exogenous innovations can then be inferred recursively from the observables, which makes the likelihood function easy to compute. Initial states and model parameters are estimated by maximizing the likelihood function. Numerical examples suggest that the method provides reliable estimates of model parameters and of latent state variables, even for highly non-linear economies with big shocks.
    Keywords: likelihood-based estimation of non-linear DSGE models; higher-order approximations; pruning; latent state variables
    JEL: C63 C68 E37
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/228887&r=cmp
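    The central trick is that, with as many shocks as observables, the innovations can be read off the data recursively. The toy nonlinear state-space model below illustrates this; the quadratic law of motion and Gaussian innovations are assumptions of this sketch, not the paper's DSGE application.

      import numpy as np
      from scipy.stats import norm

      def log_likelihood(params, y, s0):
          """Toy model  s_t = a*s_{t-1} + b*s_{t-1}^2 + eps_t,  y_t = c*s_t
          (one shock, one observable). Innovations are inferred recursively
          from the observables, given the initial state s0."""
          a, b, c, sigma = params
          s_prev, ll = s0, 0.0
          for y_t in y:
              s_t = y_t / c                           # invert the observation equation
              eps = s_t - a * s_prev - b * s_prev**2  # implied innovation
              ll += norm.logpdf(eps, scale=sigma) - np.log(abs(c))  # Jacobian term
              s_prev = s_t
          return ll

      # Simulate data and evaluate the likelihood at the true parameters;
      # estimation would maximize ll over the parameters and s0.
      rng = np.random.default_rng(0)
      a, b, c, sigma = 0.9, 0.05, 2.0, 0.1
      s, data = 0.0, []
      for eps in rng.normal(0.0, sigma, size=200):
          s = a * s + b * s**2 + eps
          data.append(c * s)
      print(log_likelihood((a, b, c, sigma), data, s0=0.0))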
  5. By: G. Dosi (Scuola Superiore Sant'Anna); M.C. Pereira (University of Campinas); A. Roventini (OFCE, Sciences Po, Scuola Superiore Sant'Anna); M.E. Virgillito (Scuola Superiore Sant'Anna)
    Abstract: Wages are an element of cost crucially affecting the competitiveness of individual firms. But the wage bill is also a crucial element of aggregate demand. Hence it could be that more "flexible" and fluid labour markets, while allowing for faster inter-firm reallocation of labour, may also render the whole economic system more fragile, more prone to recession, more volatile. In this work we investigate some conditions under which such a conjecture applies. The paper presents an agent-based model that investigates the effects of two "archetypes of capitalism", in terms of regimes of labour governance (defined by the mechanisms of wage determination, firing, labour protection and productivity gains sharing), upon (i) labour market regularities and (ii) macroeconomic dynamics (long-term rates of growth, GDP fluctuations, unemployment rates, inequality, etc.). The model is built upon the "Keynes meets Schumpeter" family of models (Dosi et al., 2010), explicitly incorporating different microfounded labour market regimes. Our results show that seemingly more rigid labour markets and labour relations are conducive to coordination successes with higher and smoother growth.
    Keywords: Involuntary Unemployment, Aggregate Demand, Wage determination, Labour market regimes, Keynesian coordination failures, agent-based models.
    JEL: C63 E02 E12 E24
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:fce:doctra:1607&r=cmp
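    The regimes contrasted in the paper above differ chiefly in how wages respond to slack and to productivity. A deliberately stylised sketch of two wage-update rules follows; the functional forms and numbers are assumptions of this sketch, not the authors' K+S model.

      import random

      def wage_update(w, unemployment, productivity_growth, regime):
          """Toy wage rules: 'rigid' indexes wages to productivity gains;
          'flexible' lets unemployment bid wages down (illustrative only)."""
          if regime == "rigid":
              return w * (1 + productivity_growth)
          return w * (1 + productivity_growth - 0.5 * unemployment)

      w_rigid = w_flexible = 1.0
      for t in range(50):
          u = random.uniform(0.02, 0.15)   # labour-market slack (exogenous here)
          g = random.gauss(0.02, 0.01)     # productivity growth
          w_rigid = wage_update(w_rigid, u, g, "rigid")
          w_flexible = wage_update(w_flexible, u, g, "flexible")
      # A lower wage bill under the flexible regime weakens aggregate
      # demand in the full model, which is the paper's fragility channel.
      print(round(w_rigid, 2), round(w_flexible, 2))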
  6. By: Hulsbergen, Wouter
    Abstract: New methods for solving the college admissions problem with indifference are presented and characterised with a Monte Carlo simulation in a variety of simple scenarios. Based on a quality measure defined as the average rank, these methods are found to be more efficient than the Boston and Deferred Acceptance algorithms. The improvement in efficiency is directly related to the reduced role of random tie-breakers. The strategy-proofness of the new methods is assessed as well.
    Keywords: college admission problem; deferred acceptance algorithm; Boston algorithm; Zeeburg algorithm; pairwise exchange algorithm; strategic behaviour
    JEL: I2
    Date: 2016–03–12
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:70374&r=cmp
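    The role of the random tie-breaker is easiest to see in student-proposing deferred acceptance under a single lottery. The sketch below implements that baseline and reports the average-rank measure; the toy preferences are assumptions here, and the paper's Zeeburg and pairwise-exchange variants are not implemented.

      import random

      def deferred_acceptance(prefs, capacity, priority):
          """Student-proposing DA. priority[s] ranks students at school s;
          here a single random lottery breaks indifferent priorities."""
          unmatched = list(prefs)
          nxt = {st: 0 for st in prefs}        # next school to propose to
          held = {s: [] for s in capacity}
          while unmatched:
              st = unmatched.pop()
              if nxt[st] >= len(prefs[st]):
                  continue                     # list exhausted: stays unmatched
              s = prefs[st][nxt[st]]; nxt[st] += 1
              held[s].append(st)
              held[s].sort(key=priority[s].index)   # best priority first
              for reject in held[s][capacity[s]:]:
                  unmatched.append(reject)
              held[s] = held[s][:capacity[s]]
          return {st: s for s, sts in held.items() for st in sts}

      students = ["a", "b", "c", "d"]
      schools = {"X": 1, "Y": 1, "Z": 2}
      prefs = {st: random.sample(list(schools), 3) for st in students}
      lottery = random.sample(students, len(students))  # single tie-breaker
      priority = {s: lottery for s in schools}
      match = deferred_acceptance(prefs, schools, priority)
      avg_rank = sum(prefs[st].index(s) + 1 for st, s in match.items()) / len(match)
      print(match, avg_rank)   # average rank: the paper's efficiency measure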
  7. By: John J. Nay; Martin Van der Linden; Jonathan M. Gilligan
    Abstract: Despite much scientific evidence, a large fraction of the American public doubts that greenhouse gases are causing global warming. We present a simulation model as a computational test-bed for climate prediction markets. Traders adapt their beliefs about future temperatures based on the profits of other traders in their social network. We simulate two alternative climate futures, in which global temperatures are primarily driven either by carbon dioxide or by solar irradiance. These represent, respectively, the scientific consensus and a hypothesis advanced by prominent skeptics. We conduct sensitivity analyses to determine how a variety of factors describing both the market and the physical climate may affect traders' beliefs about the cause of global climate change. Market participation causes most traders to converge quickly toward believing the "true" climate model, suggesting that a climate market could be useful for building public consensus.
    Date: 2016–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1603.08961&r=cmp
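    The adaptive core of such a market, traders imitating more profitable peers in their network, can be sketched in a few lines. The ring network, quadratic scoring, and imitation rate below are simplifying assumptions, not the authors' specification.

      import random

      N, TRUE_P = 100, 0.8     # TRUE_P stands in for the "true" climate model
      belief = [random.random() for _ in range(N)]
      profit = [0.0] * N

      for t in range(500):
          outcome = random.random() < TRUE_P
          for i in range(N):
              # Quadratic (Brier) payoff: expected profit is maximised by
              # holding a belief equal to the true probability.
              profit[i] += 1 - (int(outcome) - belief[i]) ** 2
          updated = belief[:]
          for i in range(N):
              j = (i + random.choice([-1, 1])) % N   # ring-network neighbour
              if profit[j] > profit[i]:              # imitate the richer trader
                  updated[i] += 0.3 * (belief[j] - belief[i])
          belief = updated

      print(sum(belief) / N)   # mean belief drifts toward TRUE_P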
  8. By: Kollmann, Robert
    Abstract: This paper discusses a tractable approach for computing the likelihood function of non-linear Dynamic Stochastic General Equilibrium (DSGE) models that are solved using second- and third-order accurate approximations. In contrast to particle filters, the method requires no stochastic simulations; it is hence much faster, which makes it suitable for the estimation of medium-scale models. The method assumes that the number of exogenous innovations equals the number of observables. Given an assumed vector of initial states, the exogenous innovations can then be inferred recursively from the observables, which makes the likelihood function easy to compute. Initial states and model parameters are estimated by maximizing the likelihood function. Numerical examples suggest that the method provides reliable estimates of model parameters and of latent state variables, even for highly non-linear economies with big shocks.
    Keywords: Likelihood-based estimation of non-linear DSGE models, higher-order approximations, pruning, latent state variables
    JEL: C6 E3
    Date: 2016
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:70350&r=cmp
  9. By: Michael Creel
    Abstract: This paper shows how neural networks may be used to approximate the limited information posterior mean of a simulable model. Because the model is simulable, training and testing samples may be generated with sizes large enough to train a net that is sufficiently large, in terms of hidden layers and neurons, to learn the limited information posterior mean with good accuracy. The output of the net can be used as an estimator of the parameter or, following Jiang et al. (2015), as an input to subsequent classical or Bayesian indirect inference estimation. Targeting the limited information posterior mean using neural nets is simpler, faster, and more successful than targeting the full information posterior mean. Code to replicate the examples and to use the methods for other models is available at https://github.com/mcreel/NeuralNetsForIndirectInference.jl. This code uses the Mocha.jl package for the Julia language, which provides easy access to GPU computing and greatly accelerates training the net.
    Keywords: indirect inference; neural networks; approximate Bayesian Computing; machine learning
    Date: 2016–04–01
    URL: http://d.repec.org/n?u=RePEc:aub:autbar:960.16&r=cmp
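    The replication code linked above is in Julia (Mocha.jl). To convey the idea independently of that stack, here is a rough scikit-learn sketch: a net is trained on simulated (statistics, parameter) pairs drawn from the prior, so that it approximates the limited information posterior mean. The AR(1) model, the choice of statistics, and all settings are assumptions of this sketch, not the paper's.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(1)

      def simulate_stats(theta, T=200):
          """Simulate an AR(1) and return limited-information statistics."""
          y = np.zeros(T)
          for t in range(1, T):
              y[t] = theta * y[t - 1] + rng.normal()
          return [np.corrcoef(y[:-1], y[1:])[0, 1], y.var()]

      # Draw parameters from a flat prior, simulate, and build a training
      # set; the net then learns E[theta | statistics], the posterior mean.
      thetas = rng.uniform(-0.9, 0.9, size=5000)
      X = np.array([simulate_stats(th) for th in thetas])
      net = MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=500)
      net.fit(X, thetas)

      # Point estimate for an "observed" sample generated at theta = 0.5
      print(net.predict([simulate_stats(0.5)]))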

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.