nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒06‒25
ten papers chosen by
Stan Miles
Thompson Rivers University

  1. Computational evidence on the distributive properties of monetary policy By Chen, Siyan; Desiderio, Saul
  2. Exploring Long Run Structural Change with a Dynamic General Equilibrium Model By Wolfgang Britz; Roberto Roson
  3. Economic Impact of Tariff Hikes - A CGE model analysis - By Kenichi Kawasaki
  4. Data Science for Institutional and Organizational Economics By Prüfer, Jens; Prüfer, Patricia
  5. Neural networks for stock price prediction By Yue-Gang Song; Yu-Long Zhou; Ren-Jie Han
  6. Too Fast, Too Furious? Algorithmic Trading and Financial Instability By Lise Arena; Nathalie Oriol; Iryna Veryzhenko
  7. Regulating speculative housing markets via public housing construction programs: Insights from a heterogeneous agent model By Martin, Carolin; Westerhoff, Frank
  8. Business Cycle Uncertainty and Economic Welfare Revisited By Christopher Heiberger; Alfred Maussner
  9. Changing levels of self-organization : how a capitalist economy differs from other complex systems By Leonardo Costa Ribeiro; Leonardo Gomes de Deus; Pedro Mendes Loureiro; Eduardo da Motta e Albuquerque
  10. Lifting the Curtain: Backstage Cognition, Frontstage Behavior, and the Interpersonal Transmission of Culture By Lu, Richard; Chatman, Jennifer A.; Goldberg, Amir; Srivastava, Sameer B.

  1. By: Chen, Siyan; Desiderio, Saul
    Abstract: Empirical studies have pointed out that monetary policy may significantly affect income and wealth inequality. To investigate the distributive properties of monetary policy the authors resort to an agent-based macroeconomic model where firms, households and one bank interact on the basis of limited information and adaptive rules of thumb. Simulations show that the model can replicate fairly well a number of stylized facts, especially those related to the business cycle. The authors address the issue using three types of computational experiments, including a global sensitivity analysis carried out through a novel methodology which greatly reduces the computational burden of simulations. The result that emerges is that a more restrictive monetary policy increases inequality, even though this effect may differ across groups of households. This may call into question the principle of the independence of central banks. In addition, this effect appears to be attenuated if the bank's willingness to lend is lower.
    Keywords: economic inequality,monetary policy,agent-based models,NK-DSGE models,stock-flow consistency,global sensitivity analysis
    JEL: C63 D31 D50 E52
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201838&r=cmp
  2. By: Wolfgang Britz (Institute for Food and Resource Economics, University of Bonn); Roberto Roson (Department of Economics, University Of Venice Cà Foscari; IEFE Bocconi University)
    Abstract: In this paper we present a computable general equilibrium model (G-RDEM), specifically designed for the generation of long run scenarios of economic development, featuring a non-homothetic demand system, endogenous saving rates, differentiated industrial productivity growth, interest payments on foreign debt and time-varying input-output coefficients. To the best of our knowledge, this is the first model of this kind. We illustrate how parameters of the five modules of structural change have been estimated, and we test the model by comparing its results with those obtained by a more conventional recursive dynamic CGE model. Both models are driven by the same GDP and population data, exogenously provided by the IPCC Shared Socio-economic Pathway 3. GDP levels determine the endogenous productivity parameters. Population affects the definition of per capita income, which in turn affects the household demand system and the variation of input-output coefficients. Information on the demographic structure is also employed to modify the aggregate saving rate parameters. It is found that the two models do produce different findings, both globally and at the regional and industrial level. Understanding the origins of such differences sheds some light on how mechanisms of structural change operate in the long run.
    Keywords: Computable General Equilibrium models; Long-run economic scenarios; Structural change
    JEL: C68 C82 C88 D58 E17 F43 O11 O40
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:ven:wpaper:2018:12&r=cmp
  3. By: Kenichi Kawasaki (National Graduate Institute for Policy Studies, Tokyo, Japan)
    Abstract: The US imposed tariffs on steel and aluminum imports in March 2018. An estimate of the economic impact of tariff hikes made using a Computable General Equilibrium (CGE) model of global trade, incorporating a dynamic capital formation mechanism, indicates that US import tariffs could protect the relevant US sectors but would have a negative impact on the US economy at the macro level. This key policy finding could not be attributed to the conventional framework of fixed labor in a CGE model. Also, a sensitivity analysis using a CGE model indicates that international capital movements would differentiate the impact of tariff hikes among countries. Trade deficits themselves would not necessarily be of much concern given the somewhat compensatory benefits of international capital inflows. On the other hand, possible capital outflows could exaggerate the adverse effects of tariff hikes. It is estimated here that for an import tariff hike of one percentage point worldwide, global trade would decrease by around 1.7 per cent and global GDP would decrease by around 0.2 per cent. It is of concern that emergent protectionism would reduce the growth of both global trade and the global economy.
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:ngi:dpaper:18-05&r=cmp
  4. By: Prüfer, Jens (Tilburg University, TILEC); Prüfer, Patricia (Tilburg University, TILEC)
    Abstract: To what extent can data science methods – such as machine learning, text analysis, or sentiment analysis – push the research frontier in the social sciences? This essay briefly describes the most prominent data science techniques that lend themselves to analyses of institutional and organizational governance structures. We elaborate on several examples applying data science to analyze legal, political, and social institutions and sketch how specific data science techniques can be used to study important research questions that could not (to the same extent) be studied without these techniques. We conclude by comparing the main strengths and limitations of computational social science with those of traditional empirical research methods, and by discussing its relation to theory.
    Keywords: data science; machine learning; institutions; text analysis
    JEL: C50 C53 C87 D02 K0
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutil:4392ac65-4fb6-4e9a-a92d-5da46339c7a9&r=cmp
  5. By: Yue-Gang Song; Yu-Long Zhou; Ren-Jie Han
    Abstract: Due to the extremely volatile nature of financial markets, it is commonly accepted that stock price prediction is a challenging task. However, in order to make profits or to understand the essence of the equity market, numerous market participants and researchers try to forecast stock prices using various statistical, econometric or even neural network models. In this work, we survey and compare the predictive power of five neural network models, namely, the back propagation (BP) neural network, the radial basis function (RBF) neural network, the general regression neural network (GRNN), support vector machine regression (SVMR), and least squares support vector machine regression (LS-SVMR). We apply the five models to predict the prices of three individual stocks, namely, Bank of China, Vanke A and Kweichow Moutai. Adopting mean square error and average absolute percentage error as criteria, we find that the BP neural network consistently and robustly outperforms the other four models.
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1805.11317&r=cmp
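  As a concrete illustration of the two evaluation criteria named in the abstract above (this is a minimal sketch, not the authors' code), mean square error (MSE) and average absolute percentage error (MAPE) can be computed for a toy price series as follows; the price values are invented for illustration:

  ```python
  # Minimal sketch of the two criteria from the abstract: mean square
  # error (MSE) and average absolute percentage error (MAPE).

  def mse(actual, predicted):
      """Mean square error between two equal-length price series."""
      return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

  def mape(actual, predicted):
      """Average absolute percentage error, in percent."""
      return 100 * sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

  # Toy example: actual closing prices vs. one model's predictions.
  actual = [10.0, 10.5, 11.0, 10.8]
  predicted = [10.2, 10.4, 11.3, 10.6]

  print(mse(actual, predicted))   # 0.045
  print(round(mape(actual, predicted), 2))
  ```

  A model with lower MSE and MAPE across held-out data would be preferred under these criteria; the paper reports that the BP neural network wins on both.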
  6. By: Lise Arena (GREDEG - Groupe de Recherche en Droit, Economie et Gestion - UNS - Université Nice Sophia Antipolis - UCA - Université Côte d'Azur - CNRS - Centre National de la Recherche Scientifique); Nathalie Oriol (LIRSA - Laboratoire Interdisciplinaire de Recherche en Sciences de l'Action - CNAM - Conservatoire National des Arts et Métiers [CNAM]); Iryna Veryzhenko (LIFL - Laboratoire d'Informatique Fondamentale de Lille - Université de Lille, Sciences et Technologies - Inria - Institut National de Recherche en Informatique et en Automatique - Université de Lille, Sciences Humaines et Sociales - CNRS - Centre National de la Recherche Scientifique)
    Abstract: To what extent can algorithmic trading-based strategies explain the propagation of flash crashes on financial markets? This question has to be discussed at the intersection of two disciplinary fields: management of information systems and finance. An agent-based approach is presented, built on realistic assumptions about traders' strategies and their use of algorithmic information systems, and considering the role of transaction systems at the market level. Final results show that speed-oriented trading strategies and the increasing use of new trading technologies can harm markets' stability and resiliency in the face of intraday operational shocks. The article also shows the central role played by transaction systems in the propagation of flash crashes when a new regulation based on the principle of decimalization is introduced.
    Keywords: High frequency trading strategies,Flash crash,Information technologies,Agent-based approach
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:hal:journl:halshs-01789636&r=cmp
  7. By: Martin, Carolin; Westerhoff, Frank
    Abstract: Since the instability of housing markets may be quite harmful to the real economy, we explore whether public housing construction programs may tame housing market fluctuations. As a workhorse, we use a behavioral stock-flow housing market model in which the complex interplay between speculative and real forces triggers realistic housing market dynamics. Simulations reveal that plausible and well-intended policy measures may turn out to be a mixed blessing. While public housing construction programs may reduce house prices, they seem to be incapable of bringing house prices much closer to their fundamental values. In addition, these programs tend to drive out private housing construction.
    Keywords: housing markets,boom-bust dynamics,extrapolative and regressive expectations,heterogeneous agent model,policy experiments,public housing construction programs
    JEL: D84 R21 R31
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:bamber:135&r=cmp
  8. By: Christopher Heiberger (University of Augsburg, Department of Economics); Alfred Maussner (University of Augsburg, Department of Economics)
    Abstract: Cho, Cooley, and Kim (RED, 2015) (CCK) consider the welfare effects of removing multiplicative productivity shocks from real business cycle models. In a model that admits an analytical solution they argue convincingly that the positive welfare effect of removing uncertainty can be dominated by a negative mean effect arising from the optimal response of household labor supply. While the presentation of this model is quite elaborate, the details of their subsequent quantitative analysis of several versions of the standard real business cycle model remain vague. We lay out the general procedure of computing second-order accurate approximations of welfare gains or losses in the canonical dynamic stochastic general equilibrium model. In order to be able to consider mean-preserving increases in the size of shocks, we extend the computation of second-order approximations of the policy functions pioneered by Schmitt-Grohé and Uribe (JEDC, 2004). Our computations show that, in contrast to the results reported in CCK, the mean effect never dominates the fluctuations effect. Welfare measures computed from weighted residuals methods confirm the logic behind our perturbation approach and verify the accuracy of our estimates.
    Keywords: business cycles, mean effect, second order solution, risk aversion, welfare costs
    JEL: C63 D60 E32
    Date: 2018–06
    URL: http://d.repec.org/n?u=RePEc:aug:augsbe:0335&r=cmp
  9. By: Leonardo Costa Ribeiro (Inmetro-RJ); Leonardo Gomes de Deus (Cedeplar-UFMG); Pedro Mendes Loureiro (SOAS-UK); Eduardo da Motta e Albuquerque (Cedeplar-UFMG)
    Abstract: This paper investigates the specificity of a modern capitalist economy as a complex system. This investigation is based on a review of the literature on complexity in the physical and biological worlds, undertaken to learn the bases of complexity and self-organization and to locate the tools used to identify and measure them. An economy has different layers and levels of organization, based upon human beings: agents who think, have intentions and change all the time. This paper presents a new model that replicates the workings of a capitalist economy, endogenizing the introduction of innovations and institutional change, and runs a simulation of it. The analysis of the results indicates that a modern capitalist economy is a complex system, and that this complex system changes its level of self-organization over time. This finding highlights the peculiarity of a capitalist economy vis-à-vis other complex systems.
    Keywords: rate of profit; technological revolutions; Marx; complex systems; metamorphoses of capitalism; simulation models.
    JEL: P16 O33 B51
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:cdp:texdis:td581&r=cmp
  10. By: Lu, Richard (University of California, Berkeley); Chatman, Jennifer A. (University of California, Berkeley); Goldberg, Amir (Stanford University); Srivastava, Sameer B. (University of California, Berkeley)
    Abstract: From the schoolyard to the boardroom, the pressures of cultural assimilation pervade all walks of social life. Yet people vary in the capacity to fit in culturally, and their fit can wax and wane over time. We examine how individual cognition and social influence produce variation and change in cultural fit. We do so by lifting the curtain between the backstage (cognition) and frontstage (behavior) of cultural fit. We theorize that the backstage comprises two analytically distinct dimensions--perceptual accuracy and value congruence--and that the former matters for normative compliance on the frontstage, whereas the latter does not. We further propose that a person's behavior and perceptual accuracy are both influenced by observations of others' behavior, whereas value congruence is less susceptible to peer influence. Drawing on email and survey data from a mid-sized technology firm, we use the tools of computational linguistics and machine learning to develop longitudinal measures of frontstage and backstage cultural fit. We also take advantage of a reorganization that produced quasi-exogenous shifts in employees' peer groups to identify the causal impact of social influence.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:ecl:stabus:repec:ecl:stabus:3603&r=cmp

This nep-cmp issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.