nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒11‒26
fourteen papers chosen by
Stan Miles
Thompson Rivers University

  1. An Algorithmic Crystal Ball: Forecasts-based on Machine Learning By Jin-Kyu Jung; Manasa Patnam; Anna Ter-Martirosyan
  2. Predicting and Understanding Initial Play By Drew Fudenberg; Annie Liang
  3. China’s Response to Nuclear Safety Post-Fukushima: Genuine or Rhetoric? By Lam, J.; Cheung, L.; Han, Y.; Wang, S.
  4. The impact of commuting time over educational achievement: A machine learning approach By Dante Contreras; Daniel Hojman; Manuel Matas; Patricio Rodríguez; Nicolás Suárez
  5. A Forward Calibration Method for New Quantitative Trade Models By Pothen, Frank; Hübler, Michael
  6. Robust risk aggregation with neural networks By Stephan Eckstein; Michael Kupper; Mathias Pohl
  7. Predicting Adverse Media Risk using a Heterogeneous Information Network By Ryohei Hisano; Didier Sornette; Takayuki Mizuno
  8. Minimum Wage Policy and Rural Household Welfare in Nigeria By Idiaye, C.; Kuhn, A.; Okoruwa, V.
  9. The evaluation of fiscal consolidation strategies By Matus Senaj; Zuzana Siebertova; Norbert Svarda; Jana Valachyova
  10. Forecasting the implications of foreign exchange reserve accumulation with an agent-based model By Ramis Khabibullin; Alexey Ponomarenko; Sergei Seleznev
  11. Making sense of Piketty's 'Fundamental Laws' in a Post-Keynesian Framework: The transitional dynamics of wealth inequality By Stefan Ederer; Miriam Rehm
  12. The Analysis of Big Data on Cities and Regions - Some Computational and Statistical Challenges By Schintler, Laurie A.; Fischer, Manfred M.
  13. Matlab, Python, Julia: What to Choose in Economics? By Coleman, Chase; Lyon, Spencer; Maliar, Lilia; Maliar, Serguei
  14. Waters run deep: A coupled Revealed Preference and CGE model to assess the economy-wide impacts of agricultural water buyback By Perez Blanco, C.D.

  1. By: Jin-Kyu Jung; Manasa Patnam; Anna Ter-Martirosyan
    Abstract: Forecasting macroeconomic variables is key to developing a view of a country's economic outlook. Most traditional forecasting models rely on fitting data to a pre-specified relationship between input and output variables, thereby assuming a specific functional form and stochastic process. We pursue a new approach to forecasting by employing a number of machine learning algorithms, an approach that is data driven and imposes limited restrictions on the nature of the true relationship between input and output variables. We apply the Elastic Net, SuperLearner, and Recurrent Neural Network algorithms to macro data from seven broadly representative advanced and emerging economies and find that these algorithms can outperform traditional statistical models, thereby offering a relevant addition to the field of economic forecasting.
    Date: 2018–11–01
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:18/230&r=cmp
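To make the Elastic Net step concrete, here is a minimal sketch of a regularized forecast on synthetic autoregressive data, not the paper's macro data; the hyperparameters and lag length are arbitrary choices for illustration, using scikit-learn's `ElasticNet`:

```python
import numpy as np
from sklearn.linear_model import ElasticNet

rng = np.random.default_rng(0)

# Synthetic quarterly growth series with a simple AR(1) structure.
T = 120
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + rng.normal(scale=0.5)

# Lagged feature matrix: predict y[t] from the previous 4 quarters.
lags = 4
X = np.column_stack([y[lags - k - 1 : T - k - 1] for k in range(lags)])
target = y[lags:]

# Train on the first 96 observations, forecast the remaining 20.
model = ElasticNet(alpha=0.1, l1_ratio=0.5)
model.fit(X[:96], target[:96])
forecast = model.predict(X[96:])
rmse = np.sqrt(np.mean((forecast - target[96:]) ** 2))
print(f"out-of-sample RMSE: {rmse:.3f}")
```

The `l1_ratio` parameter mixes ridge and lasso penalties, which is what lets the Elastic Net shrink and select among many candidate predictors at once.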
  2. By: Drew Fudenberg (Department of Economics, MIT); Annie Liang (Department of Economics, University of Pennsylvania)
    Abstract: We take a machine learning approach to the problem of predicting initial play in strategic-form games, with the goal of uncovering new regularities in play and improving the predictions of existing theories. The analysis is implemented on data from previous laboratory experiments, and also a new data set of 200 games played on Mechanical Turk. We first use machine learning algorithms to train prediction rules based on a large set of game features. Examination of the games where our algorithm predicts play correctly, but the existing models do not, leads us to introduce a risk aversion parameter that we find significantly improves predictive accuracy. Second, we augment existing empirical models by using play in a set of training games to predict how the models’ parameters vary across new games. This modified approach generates better out-of-sample predictions, and provides insight into how and why the parameters vary. These methodologies are not special to the problem of predicting play in games, and may be useful in other contexts.
    Date: 2017–11–14
    URL: http://d.repec.org/n?u=RePEc:pen:papers:18-009&r=cmp
  3. By: Lam, J.; Cheung, L.; Han, Y.; Wang, S.
    Abstract: The Fukushima crisis has brought the nuclear safety problem to the world’s attention. China is the most ambitious country in the world in nuclear power development. How China perceives and responds to nuclear safety issues carries significant implications for its citizens’ safety and security. This paper examines the Chinese government’s promised and actual response to nuclear safety following the Fukushima crisis, based on (1) statistical analysis of newspaper coverage of nuclear energy, and (2) a review of nuclear safety performance and safety governance. Our analysis shows that (i) the Chinese government’s concern over nuclear accidents and safety surged significantly after Fukushima, (ii) China has displayed strengths in reactor technology design and safety operation, and (iii) China’s safety governance has been continuously challenged by institutional fragmentation, inadequate transparency, a shortage of safety professionals, a weak safety culture, and the ambition to triple nuclear capacity by 2050. We suggest that China should improve its nuclear safety standards, safety management and monitoring; reform institutional arrangements to reduce fragmentation; improve information transparency, public trust and participation; strengthen the safety culture; introduce process-based safety regulations; and promote international collaboration, so that its response to nuclear safety can be fully implemented in practice.
    Keywords: nuclear safety, media focus, computational text analysis, regulatory governance, safety management
    JEL: C89 Q42 Q48
    Date: 2018–11–05
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1866&r=cmp
  4. By: Dante Contreras; Daniel Hojman; Manuel Matas; Patricio Rodríguez; Nicolás Suárez
    Abstract: Taking advantage of georeferenced data on Chilean students, we estimate the impact of commuting time on academic achievement. As commuting time is an endogenous variable, we use instrumental variables and school-level fixed effects to overcome this problem. In addition, since we do not observe which mode of transport the students use, we complement our analysis with machine learning methods to predict the transportation mode. Our findings suggest that commuting time has a negative effect on academic performance, but this effect is not always significant.
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:udc:wpaper:wp472&r=cmp
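The transport-mode prediction step can be sketched with a simple classifier. The features (distance, commute time), the labeling rule, and the data below are entirely hypothetical, invented for illustration; the paper's actual features and data are not reproduced here:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 1000

# Hypothetical features: home-school distance (km) and commute time (min).
distance = rng.uniform(0.2, 20.0, n)
time = distance * rng.uniform(2.0, 6.0, n)  # noisy minutes-per-km speed

# Hypothetical labeling rule: short trips are walked; fast trips use a car.
mode = np.where(distance < 1.5, "walk",
                np.where(time / distance < 3.0, "car", "bus"))

X = np.column_stack([distance, time])
X_tr, X_te, y_tr, y_te = train_test_split(X, mode, test_size=0.25,
                                          random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)
print(f"held-out accuracy: {accuracy:.2f}")
```

The predicted mode would then enter the achievement regression as a generated covariate, which is the sense in which the abstract "complements" the econometric analysis with machine learning.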
  5. By: Pothen, Frank; Hübler, Michael
    Abstract: This article introduces an innovative and flexible dynamic forward calibration method for disaggregated new quantitative trade models, particularly the Eaton and Kortum model, within a computable general equilibrium framework. The model is parameterized based on distinct, consistent future development scenario assumptions about EU climate policy, economic growth, energy efficiency, the electricity mix and structural change (sectoral shifts) derived through a complex scenario-creation process. The model equations and the scenario assumptions are implemented as side constraints of an optimization problem minimizing the difference between historical and targeted technology levels (sectoral productivities). This method is combined with input-output data disaggregation methods to separate Northwest Germany from the rest of Germany and to represent different power generation technologies. This setup enables the comparison of alternative regional sustainability-oriented long-term policy pathways. Despite their importance to society, the policy pathways envisaged by Northwest Germany's governments have limited macroeconomic effects in the simulations. In contrast, the future development scenario assumptions significantly affect European economies, particularly via EU climate policy costs, which increase drastically towards 2050. If Northwest Germany's energy transition fails, its climate policy costs will increase extraordinarily.
    Keywords: EU climate policy; forward calibration; regional model; structural estimation; new quantitative trade theory
    JEL: C68 F17 L16 O40
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-643&r=cmp
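The calibration idea, minimizing the distance between historical and targeted technology levels with the model equations as side constraints, can be illustrated with a deliberately tiny two-sector toy problem. All functional forms and numbers here are invented for illustration and bear no relation to the paper's actual model:

```python
import numpy as np
from scipy.optimize import minimize

alpha = 0.7
a_hist = np.array([1.0, 1.2])      # historical sectoral productivities
q_target = np.array([0.55, 0.75])  # scenario output targets

def objective(x):
    a = x[:2]
    return np.sum((a - a_hist) ** 2)  # stay close to historical technology

def output_gap(x):
    a, L = x[:2], x[2:]
    return a * L ** alpha - q_target  # toy model equations: q_i = a_i L_i^alpha

constraints = [
    {"type": "eq", "fun": output_gap},                    # hit output targets
    {"type": "eq", "fun": lambda x: x[2] + x[3] - 1.0},   # labor market clears
]
res = minimize(objective, x0=[1.0, 1.0, 0.5, 0.5],
               bounds=[(0.1, None)] * 4, constraints=constraints,
               method="SLSQP")
a_calibrated = res.x[:2]
print("calibrated productivities:", a_calibrated.round(3))
```

The paper's actual problem has the same shape, an objective over technology levels with equilibrium equations and scenario assumptions as constraints, just at vastly larger scale.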
  6. By: Stephan Eckstein; Michael Kupper; Mathias Pohl
    Abstract: We consider settings in which the distribution of a multivariate random variable is partly ambiguous. We assume the ambiguity lies on the level of dependence structure, and that the marginal distributions are known. Furthermore, a current best guess for the distribution, called reference measure, is available. We work with the set of distributions that are both close to the given reference measure in a transportation distance (e.g. the Wasserstein distance), and additionally have the correct marginal structure. The goal is to find upper and lower bounds for integrals of interest with respect to distributions in this set. The described problem appears naturally in the context of risk aggregation. When aggregating different risks, the marginal distributions of these risks are known and the task is to quantify their joint effect on a given system. This is typically done by applying a meaningful risk measure to the sum of the individual risks. For this purpose, the stochastic interdependencies between the risks need to be specified. In practice, however, models of this dependence structure are subject to relatively high model ambiguity. The contribution of this paper is twofold: Firstly, we derive a dual representation of the considered problem and prove that strong duality holds. Secondly, we propose a generally applicable and computationally feasible method, which relies on neural networks, in order to numerically solve the derived dual problem. The latter method is tested on a number of toy examples, before it is finally applied to perform robust risk aggregation in a real world instance.
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1811.00304&r=cmp
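A minimal sketch of why dependence ambiguity matters for risk aggregation, assuming two standard lognormal marginals: with the marginals fixed, the Value-at-Risk of the sum still varies substantially with the coupling. This illustrates only the underlying problem, not the paper's neural-network dual method:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Two risks with identical known lognormal marginals;
# only the dependence structure is ambiguous.
z1, z2 = rng.normal(size=n), rng.normal(size=n)
x = np.exp(z1)          # risk 1
y_indep = np.exp(z2)    # risk 2, drawn independently of risk 1
y_comon = np.exp(z1)    # risk 2 rebuilt comonotonically with risk 1

def var(sample, level=0.99):
    # Value-at-Risk: the level-quantile of the aggregate loss.
    return np.quantile(sample, level)

print(f"99% VaR, independent coupling:  {var(x + y_indep):.2f}")
print(f"99% VaR, comonotonic coupling:  {var(x + y_comon):.2f}")
```

Both aggregates have the same marginals for each component, yet the comonotonic coupling produces a markedly larger 99% VaR; bounding such quantities over all admissible couplings near a reference measure is exactly the optimization the paper solves.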
  7. By: Ryohei Hisano (Social ICT Research Center, Graduate School of Information Science and Technology, The University of Tokyo); Didier Sornette (ETH Zurich, Department of Management Technology and Economics); Takayuki Mizuno (National Institute of Informatics)
    Abstract: The media plays a central role in monitoring powerful institutions and identifying activities harmful to the public interest. In the investing sphere, constituted of 46,583 officially listed domestic firms on stock exchanges worldwide, there is a growing interest in “doing the right thing”, i.e., putting pressure on companies to improve their environmental, social and governance (ESG) practices. This raises two questions: how to overcome the sparsity of ESG data from non-reporting firms, and how to identify the relevant information in the annual reports of this large universe. Here, we construct a vast heterogeneous information network that covers the necessary information surrounding each firm, assembled from seven professionally curated datasets and two open datasets, resulting in about 50 million nodes and 400 million edges in total. Exploiting this heterogeneous information network, we propose a model that can learn from past adverse media coverage patterns and predict the occurrence of future adverse media coverage events across the whole universe of firms. Our approach is tested using the adverse media coverage data of more than 35,000 firms worldwide from January 2012 to May 2018. Comparing against state-of-the-art methods with and without the network, we show that predictive accuracy is substantially improved when using the heterogeneous information network. This work suggests new ways to consolidate the diffuse information contained in big data in order to monitor dominant institutions on a global scale for more socially responsible investment, better risk management, and the surveillance of powerful institutions.
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:upd:utmpwp:004&r=cmp
  8. By: Idiaye, C.; Kuhn, A.; Okoruwa, V.
    Abstract: Using the Computable General Equilibrium (CGE) framework, this paper shows how urban sector wage rigidities such as the minimum wage policy can affect the rural economy and the welfare of households. A static CGE model of the Nigerian economy was developed to examine the effects of 12%, 30% and 68% minimum wage increases in Nigeria on the economy and the welfare of households, especially informal sector rural households. CGE simulations revealed that with a 12% increase in the minimum wage, domestic output declined in all sectors except the crude oil and mining sector. Similar impacts were observed with 30% and 68% increases, but with greater changes. There was also a general decline in labour employment due to its higher price. Most macroeconomic aggregates fell, including GDP and real GDP. Household savings, however, increased in all cases, but there were large inflationary pressures, reflected in increases in the price index in all three scenarios. Investment also fell, while household utility declined in all three scenarios, indicating that, in the long run, minimum wage policies do not improve household welfare but instead leave households worse off. Acknowledgement: I would like to acknowledge the support of the Biomass Web Project of the University of Bonn, Germany, in collaboration with the German Federal Ministry of Education, in completing this research work.
    Keywords: Consumer/Household Economics
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:ags:iaae18:277114&r=cmp
  9. By: Matus Senaj (Council for Budget Responsibility); Zuzana Siebertova (Council for Budget Responsibility); Norbert Svarda (Council for Budget Responsibility); Jana Valachyova (Council for Budget Responsibility)
    Abstract: In this paper, we present a framework and perform an assessment of different fiscal consolidation strategies, on both the revenue and the expenditure sides of the budget, in the context of Slovakia. The model we use for simulations is a behavioural general-equilibrium what-if model. We analyse the simulated impacts of consolidation strategies on growth and on the fiscal balance (both in the short and the long term). The microsimulation approach also allows us to evaluate the distributional impacts. In addition, the approach permits a comparison of the statutory and the resulting tax incidence in the long run. We simulate strategies based on taxing labour income, taxing consumption, and cutting expenditures on social transfers. We document that corporate and labour taxes are more unfavourable to output growth, while consumption taxes are among the less damaging instruments for consolidation. We show that spending cuts may promote employment and are not detrimental to output growth.
    Keywords: microsimulation, general equilibrium, tax and transfer policy, Slovakia
    JEL: C63 H22 I38
    Date: 2018–09
    URL: http://d.repec.org/n?u=RePEc:cbe:wpaper:201802&r=cmp
  10. By: Ramis Khabibullin (Bank of Russia, Russian Federation); Alexey Ponomarenko (Bank of Russia, Russian Federation); Sergei Seleznev (Bank of Russia, Russian Federation)
    Abstract: We develop a stock-flow-consistent agent-based model that comprises a realistic mechanism of money creation and parametrize it to fit actual data. The model is used to make out-of-sample projections of broad money and credit developments under the commencement/termination of foreign reserve accumulation by the Bank of Russia. We use direct forecasts from the agent-based model as well as the two-step approach, which implies the use of artificial data to pre-train the Bayesian vector autoregression model. We conclude that the suggested approach is competitive in forecasting and yields promising results.
    Keywords: Money supply, foreign exchange reserves, forecasting, agent-based model, Russia.
    JEL: C53 C63 E51 E58 F31 G21
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:bkr:wpaper:wps37&r=cmp
  11. By: Stefan Ederer; Miriam Rehm
    Abstract: If Piketty's main theoretical prediction (r>g leads to rising wealth inequality) is taken to its radical conclusion, then a small elite will own all wealth if capitalism is left to its own devices. We formulate and calibrate a Post-Keynesian model with an endogenous distribution of wealth between workers and capitalists which permits such a corner solution of all wealth held by capitalists. However, it also shows interior solutions with a stable, non-zero wealth share of workers, a stable wealth-to-income ratio, and a stable and positive gap between the profit and the growth rate determined by the Cambridge equation. More importantly, simulations show that the model conforms to Piketty's empirical findings during a transitional phase of increasing wealth inequality, which characterizes the current state of high-income countries: The wealth share of capitalists rises to over 60%, the wealth-to-income ratio increases, and income inequality rises. Finally, we show that the introduction of a wealth tax as suggested by Piketty could neutralize this rise in wealth concentration predicted by our model.
    Keywords: Post-Keynesian, model, wealth, saving, inequality, Piketty, simulation
    JEL: C63 D31 E12 E21
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:imk:fmmpap:35-2018&r=cmp
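The radical r > g dynamic in the abstract's first sentence can be sketched with a stylized two-class accumulation loop. This is not the paper's calibrated Post-Keynesian model; the parameters, saving assumptions, and functional form are all invented for illustration of the corner solution only:

```python
# Stylized two-class wealth dynamics: capitalists reinvest a share s_c of
# capital income; workers save a share s_w of wages and capital income.
# Wealth-to-income ratios are deflated each period by income growth g.
r, g = 0.05, 0.03          # profit rate and income growth, r > g
s_c, s_w = 0.9, 0.3        # saving rates of capitalists and workers
wage_share = 0.7           # workers' wage income as a share of output

beta_c, beta_w = 1.0, 1.0  # wealth-to-income ratios of the two classes
shares = []
for t in range(500):
    shares.append(beta_c / (beta_c + beta_w))
    beta_c = beta_c * (1 + s_c * r) / (1 + g)
    beta_w = (beta_w * (1 + s_w * r) + s_w * wage_share) / (1 + g)

print(f"capitalists' wealth share: t=0 -> {shares[0]:.2f}, "
      f"t=500 -> {shares[-1]:.2f}")
```

Because s_c·r exceeds g here, the capitalists' wealth share drifts toward one, the corner solution the abstract describes; the paper's contribution is showing when interior solutions with a stable workers' wealth share exist instead.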
  12. By: Schintler, Laurie A.; Fischer, Manfred M.
    Abstract: Big Data on cities and regions bring new opportunities and challenges to data analysts and city planners. On the one hand, they hold great promise for combining increasingly detailed data on each citizen with critical infrastructures to plan, govern and manage cities and regions, improve their sustainability, optimize processes and maximize the provision of public and private services. On the other hand, the massive sample size and high dimensionality of Big Data and their geo-temporal character introduce unique computational and statistical challenges. This chapter provides an overview of the salient characteristics of Big Data and of how these features drive a paradigm change in data management and analysis, as well as in the computing environment.
    Keywords: massive sample size, high-dimensional data, heterogeneity and incompleteness, data storage, scalability, parallel data processing, visualization, statistical methods
    Date: 2018–10–28
    URL: http://d.repec.org/n?u=RePEc:wiw:wus046:6637&r=cmp
  13. By: Coleman, Chase; Lyon, Spencer; Maliar, Lilia; Maliar, Serguei
    Abstract: We compare Matlab, Python and Julia as programming languages for implementing global nonlinear solution techniques. We consider two popular applications: a neoclassical growth model and a new Keynesian model. The goal of our analysis is twofold: first, to help researchers in economics choose the programming language best suited to their applications and, if needed, to help them transition from one programming language to another. Second, our collection of routines can be viewed as a toolbox with a special emphasis on techniques for dealing with high-dimensional economic problems. We provide routines in the three languages for constructing random and quasi-random grids, low-cost monomial integration, various global solution methods, checking the accuracy of the solutions, etc. Our global solution methods are not only accurate but also fast. Solving a new Keynesian model with eight state variables takes only a few seconds, even in the presence of an active zero lower bound on nominal interest rates. This speed is important because it allows the model to be solved repeatedly, as required for estimation.
    Keywords: Dynamic programming; Global solution; High dimensionality; Julia; Large scale; Matlab; Nonlinear; Python; Toolkit; Value function iteration
    JEL: C6 C61 C63 C68 E31 E52
    Date: 2018–09
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:13210&r=cmp
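As a small taste of the kind of routine the paper benchmarks, here is a plain-Python value function iteration for a deterministic neoclassical growth model with log utility; this is a standard textbook exercise, not code from the paper's toolbox, and the parameter values are arbitrary:

```python
import numpy as np

alpha, beta, delta = 0.36, 0.95, 0.1
n = 200
# Steady-state capital from beta * (alpha * k^(alpha-1) + 1 - delta) = 1.
k_ss = ((1 / beta - 1 + delta) / alpha) ** (1 / (alpha - 1))
grid = np.linspace(0.2 * k_ss, 1.8 * k_ss, n)

# Consumption for every (k, k') pair; infeasible choices get -inf utility.
resources = grid[:, None] ** alpha + (1 - delta) * grid[:, None]
c = resources - grid[None, :]
u = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

# Bellman iteration: V(k) = max_{k'} u(c) + beta * V(k').
V = np.zeros(n)
for _ in range(1000):
    V_new = np.max(u + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = grid[np.argmax(u + beta * V[None, :], axis=1)]
print(f"policy at steady-state capital: {policy[n // 2]:.3f} "
      f"(k_ss = {k_ss:.3f})")
```

The vectorized Bellman update above is the operation whose speed differs across Matlab, Python and Julia; the paper's toolbox replaces the brute-force grid maximization with grid, integration and interpolation techniques that scale to high-dimensional state spaces.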
  14. By: Perez Blanco, C.D.
    Abstract: Little is known about the economy-wide repercussions of water buyback, which may include relevant feedbacks on the output of economic sectors at regional and supra-regional scales. The few applied studies available rely on stand-alone Computable General Equilibrium (CGE) models that represent competition for water explicitly, but this approach presents significant data and methodological challenges in areas where mature water markets are not in place, as is the case in most regions worldwide. To bridge this gap, this paper couples a bottom-up Revealed Preference Model, which elicits the value and price share of water, with a top-down, regionally-calibrated CGE model for Spain. The methods are illustrated with a case study in the Murcia Region in southeastern Spain. Economy-wide feedbacks amplify income losses in Murcia's agriculture from -20.5% in the bottom-up model to -33% in the coupled model. Compensations paid to irrigators enhance demand in the region, but supply contraction in agriculture and related sectors leads to a GDP loss (up to -2.1%) in most scenarios. The supply gap is partially filled by other Spanish regions, which experience a GDP gain through a substitution effect (up to +0.034%). In all scenarios, aggregate GDP for Spain decreases (up to -0.023%). Acknowledgement: This research has received funding from the AXA Research Fund through the Post-Doctoral Fellowships Campaign 2015, and from Climate-KIC Europe through the Climate Smart Agriculture Booster project AGRO ADAPT (Service for local and economy-wide assessment of adaptation actions in agriculture).
    Keywords: Marketing
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:ags:iaae18:277028&r=cmp

This nep-cmp issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.