nep-cmp New Economics Papers
on Computational Economics
Issue of 2017‒05‒14
eleven papers chosen by
Stan Miles
Thompson Rivers University

  1. The Creative Response and the Endogenous Dynamics of Pecuniary Knowledge Externalities: An Agent Based Simulation Model. By Antonelli, Cristiano; Ferraris, Gianluigi
  2. Capital Requirements, Risk-Taking and Welfare in a Growing Economy By Pierre-Richard Agénor; Luiz A. Pereira da Silva
  3. Would a euro's depreciation improve the French economy? By Riccardo Magnani; Luca Piccoli; Martine Carré; Amedeo Spadaro
  4. Machine Learning for Better Models for Predicting Bond Prices By Swetava Ganguli; Jared Dunnmon
  5. Clustering and forecasting inflation expectations using the World Economic Survey: the case of the 2014 oil price shock on inflation targeting countries By Hector M. Zarate-Solano; Daniel R. Zapata-Sanabria
  6. Causes and Consequences of Hysteresis: Aggregate Demand, Productivity and Employment By Dosi, G.; Pereira, M. C.; Roventini, A.; Virgillito, M. E.
  7. Machine Learning Techniques for Mortality Modeling By Philippe Deprez; Pavel V. Shevchenko; Mario V. Wüthrich
  8. Economic Geography in R: Introduction to the EconGeo package By Pierre-Alexandre Balland
  9. The marriage gap: Optimal aging and death in partnerships By Schünemann, Johannes; Strulik, Holger; Trimborn, Timo
  10. Double machine learning for treatment and causal parameters By Victor Chernozhukov; Denis Chetverikov; Mert Demirer; Esther Duflo; Christian Hansen; Whitney K. Newey
  11. Reassessing Railroads and Growth: Accounting for Transport Network Endogeneity By Swisher IV, S. N.

  1. By: Antonelli, Cristiano; Ferraris, Gianluigi (University of Turin)
    Abstract: The paper elaborates an agent-based simulation model (ABM) to explore the endogenous long-term dynamics of knowledge externalities. ABMs, as a form of artificial cliometrics, allow the analysis of the effects of the reactivity of firms caught in out-of-equilibrium conditions, conditional on the levels of endogenous knowledge externalities stemming from the levels of knowledge connectivity of the system. The simulation results confirm the powerful effects of endogenous knowledge externalities. At the micro-level, the reactions of firms caught in out-of-equilibrium conditions yield successful effects in the form of productivity-enhancing innovations only in the presence of high levels of knowledge connectivity and strong pecuniary knowledge externalities. At the meso-level, the introduction of innovations changes the structural characteristics of the system in terms of knowledge connectivity, which affect the availability of knowledge externalities. Endogenous centrifugal and centripetal forces continually reshape the structure of the system and its knowledge connectivity. At the macro system level, an out-of-equilibrium process leads to a step-wise increase in productivity combined with non-linear patterns of output growth characterized by significant oscillations typical of the long waves in Schumpeterian business cycles.
    Date: 2017–03
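
The core mechanism this abstract describes, firms reacting to out-of-equilibrium shocks with innovation succeeding only under sufficient knowledge connectivity, can be illustrated with a toy agent-based loop. This is a minimal sketch, not the authors' model: the shock probability, innovation step, and the use of connectivity as a direct success probability are all invented for illustration.

```python
import random

def simulate(n_firms=100, periods=200, connectivity=0.6, seed=42):
    """Toy ABM sketch: firms pushed out of equilibrium try to innovate,
    and success is more likely when knowledge connectivity (a crude
    proxy for pecuniary knowledge externalities) is high."""
    random.seed(seed)
    productivity = [1.0] * n_firms
    output_path = []
    for _ in range(periods):
        for i in range(n_firms):
            # A random shock puts some firms in out-of-equilibrium conditions.
            if random.random() < 0.3:
                # The firm's creative response succeeds only with
                # sufficient knowledge externalities.
                if random.random() < connectivity:
                    productivity[i] *= 1.02  # productivity-enhancing innovation
        output_path.append(sum(productivity))
    return output_path

high = simulate(connectivity=0.8)
low = simulate(connectivity=0.2)
print(high[-1] > low[-1])  # higher connectivity yields higher long-run output
```

Even this stripped-down version reproduces the qualitative result: aggregate output growth depends sharply on the connectivity parameter.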
  2. By: Pierre-Richard Agénor; Luiz A. Pereira da Silva
    Abstract: The effects of capital requirements on risk-taking and welfare are studied in a stochastic overlapping generations model of endogenous growth with banking, limited liability, and government guarantees. Capital producers face a choice between a safe technology and a risky (but socially inefficient) technology, and bank risk-taking is endogenous. Setting the capital adequacy ratio above a structural threshold can eliminate the equilibrium with risky loans (and thus inefficient risk-taking), but numerical simulations show that this may entail a welfare loss. In addition, the optimal ratio may be too high in practice and may concomitantly require a broadening of the perimeter of regulation and a strengthening of financial supervision to prevent disintermediation and distortions in financial markets.
    Keywords: Capital Requirements, Bank risk-taking, Investment, Financial Stability, Economic Growth, Capital Goods, Financial Regulation, Financial Intermediaries, Financial Markets, risky investments, financial regulation, financial stability
    JEL: O41 G28 E44
    Date: 2017–03
  3. By: Riccardo Magnani (CEPN - Centre d'Economie de l'Université Paris Nord - Université Paris 13 - USPC - Université Sorbonne Paris Cité - CNRS - Centre National de la Recherche Scientifique); Luca Piccoli (Université des Iles Baléares - Arqueobalear); Martine Carré (LEDa - Laboratoire d'Economie de Dauphine - Université Paris-Dauphine); Amedeo Spadaro (PSE - Paris-Jourdan Sciences Economiques - ENS Paris - École normale supérieure - Paris - EHESS - École des hautes études en sciences sociales - ENPC - École des Ponts ParisTech - CNRS - Centre National de la Recherche Scientifique)
    Abstract: In this paper, we use a Micro-Macro model to evaluate the effects of a euro's depreciation on the French economy, both at the macro and micro level. Our Micro-Macro model consists of a Microsimulation model, which includes an arithmetical model of the French fiscal system and two behavioral models used to simulate the effects on consumption behavior and labor supply, and a multisectoral CGE model, which simulates the macroeconomic effects of a reform or a shock. The two models are integrated using an iterative (or sequential) approach. We find that a 10% euro's depreciation stimulates aggregate demand by increasing exports and reducing imports, which increases production and reduces the unemployment rate in the economy. At the individual level, we find that the macroeconomic shock reduces poverty and, to a lesser extent, income inequality. In particular, the decrease in the equilibrium wage, determined in the macro model, slightly reduces the available income of people who already have a job, while the reduction in the level of unemployment allows some individuals to find a job, substantially increasing their income and, in many cases, bringing them out of poverty.
    Keywords: Exchange rates,Microsimulation,CGE models
    Date: 2017–04–28
  4. By: Swetava Ganguli; Jared Dunnmon
    Abstract: Bond prices are a reflection of extremely complex market interactions and policies, making prediction of future prices difficult. This task becomes even more challenging due to the dearth of relevant information, and accuracy is not the only consideration--in trading situations, time is of the essence. Thus, machine learning in the context of bond price predictions should be both fast and accurate. In this course project, we use a dataset describing the previous 10 trades of a large number of bonds among other relevant descriptive metrics to predict future bond prices. Each of 762,678 bonds in the dataset is described by a total of 61 attributes, including a ground truth trade price. We evaluate the performance of various supervised learning algorithms for regression followed by ensemble methods, with feature and model selection considerations being treated in detail. We further evaluate all methods on both accuracy and speed. Finally, we propose a novel hybrid time-series aided machine learning method that could be applied to such datasets in future work.
    Date: 2017–03
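
The paper's comparison of supervised regression algorithms can be sketched on synthetic data (the actual dataset of 762,678 bonds and 61 attributes is not reproduced here; the data-generating process below is invented). A linear baseline against a tree ensemble illustrates why ensembles tend to win when the price depends nonlinearly on features:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the bond data: "price" depends nonlinearly
# on a couple of the ten features.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = 100 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + rng.normal(scale=0.5, size=2000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

maes = {}
for model in (Ridge(), GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    maes[type(model).__name__] = mean_absolute_error(y_te, model.predict(X_te))
print(maes)
```

On this toy problem the linear model cannot capture the quadratic term, so the boosted ensemble attains a clearly lower mean absolute error, mirroring the kind of comparison the paper carries out (which also weighs prediction speed).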
  5. By: Hector M. Zarate-Solano (Banco de la República de Colombia); Daniel R. Zapata-Sanabria (Banco de la República de Colombia)
    Abstract: This paper examines inflation expectations from the World Economic Survey for ten inflation targeting countries. First, using a Self-Organizing Maps methodology, we cluster the trajectories of agents' inflation expectations, taking the beginning of the oil price shock of June 2014 as a benchmark in order to discriminate between countries that anticipated the shock smoothly and those with brisk changes in expectations. The expectations are then modeled with artificial neural network forecasting models. Second, for each country we investigate the information content of the quantitative survey forecast by comparing it to average annual inflation based on national consumer price indices. The results indicate heterogeneity among countries in anticipating inflation under the oil shock, and different patterns of accuracy in predicting average annual inflation were found depending on the observed inflation trend.
    JEL: C02 C22 C45 C63 E27
    Keywords: Inflation expectations, machine learning, self-organizing maps, nonlinear autoregressive neural network, expectation surveys
    Date: 2017–05
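
A self-organizing map of the kind used here can be sketched in a few lines. The code below is an illustrative minimal 1-D SOM on stylized expectation trajectories, not the authors' implementation: the two trajectory shapes (smooth vs brisk adjustment), node count, and learning schedule are all invented.

```python
import numpy as np

def train_som(data, n_nodes=4, epochs=50, lr0=0.5, seed=0):
    """Minimal 1-D self-organizing map: each node holds a prototype
    trajectory, and similar series map to the same or nearby nodes."""
    rng = np.random.default_rng(seed)
    # Initialize prototypes with samples spread evenly across the data.
    idx = np.round(np.linspace(0, len(data) - 1, n_nodes)).astype(int)
    nodes = data[idx].astype(float).copy()
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)               # decaying learning rate
        sigma = max(1.0 * (1 - epoch / epochs), 0.1)  # shrinking neighborhood
        for x in data[rng.permutation(len(data))]:
            bmu = int(np.argmin(np.linalg.norm(nodes - x, axis=1)))
            grid_dist = np.abs(np.arange(n_nodes) - bmu)
            h = np.exp(-grid_dist ** 2 / (2 * sigma ** 2))
            # Pull the best-matching unit and its neighbors toward the sample.
            nodes += lr * h[:, None] * (x - nodes)
    return nodes

# Two stylized groups of 12-month "expectation trajectories":
# a smooth downward adjustment vs a brisk mid-year drop.
smooth = np.cumsum(np.full((10, 12), -0.1), axis=1)
brisk = np.hstack([np.zeros((10, 6)), np.full((10, 6), -1.5)]).cumsum(axis=1)
data = np.vstack([smooth, brisk])

nodes = train_som(data, n_nodes=2)
labels = [int(np.argmin(np.linalg.norm(nodes - x, axis=1))) for x in data]
print(labels)
```

With two well-separated trajectory shapes, the map assigns each group to its own node, which is the discrimination between smooth and brisk adjusters that the paper exploits.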
  6. By: Dosi, G.; Pereira, M. C.; Roventini, A.; Virgillito, M. E.
    Abstract: In this work we develop an agent-based model where hysteresis in major macroeconomic variables (e.g. GDP, productivity, unemployment) emerges out of the decentralized interactions of heterogeneous firms and workers. Building upon the model in Dosi et al. (2016, 2017), we specify an endogenous process of accumulation of workers' skills and a state-dependent process of entry, studying their hysteretic impacts. Indeed, hysteresis is ubiquitous. However, this is due not to market imperfections, but rather to the very functioning of decentralised economies characterised by coordination externalities and dynamic increasing returns. Thus, contrary to the insider-outsider hypothesis (Blanchard and Summers, 1986), the model does not support the view that rigid industrial relations foster hysteretic behaviour in aggregate unemployment. On the contrary, in line with the recent discussion in Ball et al. (2014), this contribution provides evidence that during severe downturns, and thus declining aggregate demand, phenomena like lower investment and innovation rates, skills deterioration, and declining entry dynamics are better candidates to explain long-run unemployment spells and lower output growth. In that, more rigid labour markets dampen hysteretic dynamics by supporting aggregate demand, thus making the economy more resilient.
    Keywords: Hysteresis,Aggregate Demand,Multiple Equilibria,Skills Deterioration,Market Entry,Agent-Based Model
    JEL: C63 E02 E24
    Date: 2017
  7. By: Philippe Deprez; Pavel V. Shevchenko; Mario V. Wüthrich
    Abstract: Various stochastic models have been proposed to estimate mortality rates. In this paper we illustrate how machine learning techniques allow us to analyze the quality of such mortality models. In addition, we present how these techniques can be used for differentiating the different causes of death in mortality modeling.
    Date: 2017–05
  8. By: Pierre-Alexandre Balland
    Abstract: The R statistical software is increasingly used to analyze the spatial distribution of economic activities. It contains state-of-the-art statistical and graphical routines not yet available in other software such as SAS, Stata, or SPSS. R is also free and open-source. Many graduate students and researchers, however, either find programming in R too challenging or end up spending a lot of their precious time solving trivial programming tasks. This paper is a simple introduction to doing economic geography in R using the EconGeo package (Balland, 2017). Users do not need extensive programming skills to use it. EconGeo makes it easy to compute a series of indices commonly used in the fields of economic geography, economic complexity, and evolutionary economics to describe the location, distribution, spatial organization, structure, and complexity of economic activities. Functions include basic spatial indicators such as the location quotient, the Krugman specialization index, and the Herfindahl and Shannon entropy indices, but also more advanced functions to compute different forms of normalized relatedness between economic activities or network-based measures of economic complexity. By opening and sharing the code used to compute popular indicators of the spatial distribution of economic activities, one of the goals of this package is to make peer-reviewed empirical studies more reproducible by a large community of researchers.
    JEL: R
    Date: 2017–05
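
The location quotient, the most basic of the indices the abstract lists, has a one-line definition that is easy to sketch outside R as well (EconGeo itself is an R package; the region-by-industry employment matrix below is a toy example, not data from the paper):

```python
import numpy as np

# Toy region-by-industry employment matrix
# (rows: 2 regions, columns: 2 industries).
emp = np.array([[100.0, 50.0],
                [ 20.0, 80.0]])

def location_quotient(m):
    """LQ[r, i] = (industry i's share of region r's employment)
                / (industry i's share of total employment).
    LQ > 1 means region r is relatively specialized in industry i."""
    region_share = m / m.sum(axis=1, keepdims=True)
    national_share = m.sum(axis=0) / m.sum()
    return region_share / national_share

lq = location_quotient(emp)
print(np.round(lq, 2))
```

Here region 1 is specialized in industry 1 (LQ above one) and region 2 in industry 2, which is exactly the reading EconGeo's output supports for real regional data.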
  9. By: Schünemann, Johannes; Strulik, Holger; Trimborn, Timo
    Abstract: Married people live longer than singles, but how much of the longevity differential is causal, and what the particular mechanisms are, is not fully understood. In this paper we propose a new approach, based on counterfactual computational experiments, in order to assess how much of the marriage gap can be explained by public-goods sharing and collective bargaining of partners with different preferences and biology. For that purpose we integrate cooperative decision making of a couple into a biologically-founded life-cycle model of health deficit accumulation and endogenous longevity. We calibrate the model with U.S. data and perform the counterfactual experiment of preventing the partnership. We elaborate three economic channels and find that, as singles, men live 8.5 months shorter and women 6 months longer. We conclude that about 30% of the marriage gain in longevity of men can be motivated by economic calculus, while the marriage gain for women observed in the data is attributed to selection or other (non-standard economic) motives.
    Keywords: health,aging,longevity,marriage-gap,gender-specific preferences,unhealthy behavior
    JEL: D91 J17 J26 I12
    Date: 2017
  10. By: Victor Chernozhukov (Institute for Fiscal Studies and MIT); Denis Chetverikov (Institute for Fiscal Studies and UCLA); Mert Demirer (Institute for Fiscal Studies); Esther Duflo (Institute for Fiscal Studies); Christian Hansen (Institute for Fiscal Studies and Chicago GSB); Whitney K. Newey (Institute for Fiscal Studies and MIT)
    Abstract: Most modern supervised statistical/machine learning (ML) methods are explicitly designed to solve prediction problems very well. Achieving this goal does not imply that these methods automatically deliver good estimators of causal parameters. Examples of such parameters include individual regression coefficients, average treatment effects, average lifts, and demand or supply elasticities. In fact, estimators of such causal parameters obtained via naively plugging ML estimators into estimating equations for such parameters can behave very poorly. For example, the resulting estimators may formally have inferior rates of convergence with respect to the sample size n caused by regularization bias. Fortunately, this regularization bias can be removed by solving auxiliary prediction problems via ML tools. Specifically, we can form an efficient score for the target low-dimensional parameter by combining auxiliary and main ML predictions. The efficient score may then be used to build an efficient estimator of the target parameter which typically will converge at the fastest possible 1/√n rate and be approximately unbiased and normal, allowing simple construction of valid confidence intervals for parameters of interest. The resulting method thus could be called a "double ML" method because it relies on estimating primary and auxiliary predictive models. Such double ML estimators achieve the fastest rates of convergence and exhibit robust good behavior with respect to a broader class of probability distributions than naive "single" ML estimators. In order to avoid overfitting, following [3], our construction also makes use of K-fold sample splitting, which we call cross-fitting. The use of sample splitting allows us to use a very broad set of ML predictive methods in solving the auxiliary and main prediction problems, such as random forests, lasso, ridge, deep neural nets, boosted trees, as well as various hybrids and aggregates of these methods (e.g. a hybrid of a random forest and lasso). We illustrate the application of the general theory through application to the leading cases of estimation and inference on the main parameter in a partially linear regression model and estimation and inference on average treatment effects and average treatment effects on the treated under conditional random assignment of the treatment. These applications cover randomized control trials as a special case. We then use the methods in an empirical application which estimates the effect of 401(k) eligibility on accumulated financial assets.
    Keywords: Neyman, orthogonalization, cross-fit, double machine learning, debiased machine learning, orthogonal score, efficient score, post-machine-learning and post-regularization inference, random forest, lasso, deep learning, neural nets, boosted trees, efficiency, optimality.
    Date: 2016–09–27
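
The cross-fitting recipe described in the abstract can be sketched on simulated data for the partially linear model Y = θD + g(X) + ε. This is an illustrative simplification of the paper's procedure: the data-generating process, true θ = 1.5, two-fold split, and random-forest nuisance learners are all arbitrary choices made for the sketch.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import KFold

# Simulated partially linear model with confounding:
# D depends on X, and Y depends on both D and g(X).
rng = np.random.default_rng(0)
n = 2000
X = rng.normal(size=(n, 5))
g = np.sin(X[:, 0]) + X[:, 1] ** 2
D = 0.8 * X[:, 0] + rng.normal(size=n)
Y = 1.5 * D + g + rng.normal(size=n)

# Cross-fitting: the nuisance functions E[Y|X] and E[D|X] are fit on
# one fold and predicted on the held-out fold, which removes the
# own-observation regularization bias.
res_Y, res_D = np.zeros(n), np.zeros(n)
for train, test in KFold(n_splits=2, shuffle=True, random_state=0).split(X):
    m_y = RandomForestRegressor(n_estimators=100, random_state=0)
    res_Y[test] = Y[test] - m_y.fit(X[train], Y[train]).predict(X[test])
    m_d = RandomForestRegressor(n_estimators=100, random_state=0)
    res_D[test] = D[test] - m_d.fit(X[train], D[train]).predict(X[test])

# Final stage: regress residualized Y on residualized D
# (the Neyman-orthogonal score for the partially linear model).
theta_hat = (res_D @ res_Y) / (res_D @ res_D)
print(round(theta_hat, 2))
```

A naive plug-in regression of Y on D and raw ML predictions would carry the regularization bias the abstract warns about; the residual-on-residual step recovers θ close to its true value of 1.5.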
  11. By: Swisher IV, S. N.
    Abstract: Motivated by the seminal work of Robert Fogel on U.S. railroads, I reformulate Fogel’s original counterfactual history question on 19th century U.S. economic growth without railroads by treating the transport network as an endogenous equilibrium object. I quantify the effect of the railroad on U.S. growth from its introduction in 1830 to 1861. Specifically, I estimate the output loss in a counterfactual world without the technology to build railroads, but retaining the ability to construct the next-best alternative of canals. My main contribution is to endogenize the counterfactual canal network through a decentralized network formation game played by profit-maximizing transport firms. I perform a similar exercise in a world without canals. My counterfactual differs from Fogel’s in three main ways: I develop a structural model of transport link costs that takes heterogeneity in geography into account to determine the cost of unobserved links, the output distribution is determined in the model as a function of transport costs, and the transport network is endogenized as a stable result of a particular network formation game. I find that railroads and canals are strategic complements, not strategic substitutes. Therefore, the output loss can be quite acute when one or the other is missing from the economy. In the set of Nash stable networks, relative to the factual world, the median value of output is 45% lower in the canals-only counterfactual and 49% lower in the railroads-only counterfactual. With only one of the transportation technologies available, inequality in output across cities would have been lower in variance terms but sharply higher in terms of the maximum-minimum gap. Such a stark output loss is due to two main mechanisms: inefficiency of the decentralized equilibrium due to network externalities, and complementarities due to spatial heterogeneity in costs across the two transport modes.
    Keywords: Economic growth, transport infrastructure, network formation games, strategic complements, railroads, counterfactual history, multiple equilibria, computation, simulation
    JEL: E22 O11 N71 L92 R42
    Date: 2017–04–28

This nep-cmp issue is ©2017 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.