nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒08‒20
twelve papers chosen by

  1. Agent-Based Model Calibration using Machine Learning Surrogates By Francesco Lamperti; Andrea Roventini; Amir Sani
  2. Mobility in Cities: Distributional Impact Analysis of Transportation Improvement in São Paulo Metropolitan Region By Eduardo A. Haddad; Nancy Lozano-Gracia; Eduardo Germani; Renato S. Vieira; Shobei Nakaguma; Emmanuel Skoufias; Bianca Bianchi Alves
  3. Modelling Electric Vehicles as an Abatement Technology in a Hybrid CGE Model By Stefan Schmelzer; Michael Miess; Vedunka Kopecna; Milan Scasny
  4. Deep Learning-Based BSDE Solver for Libor Market Model with Application to Bermudan Swaption Pricing and Hedging By Haojie Wang; Han Chen; Agus Sudjianto; Richard Liu; Qi Shen
  5. Artificial Intelligence, Economics, and Industrial Organization By Hal Varian
  6. What drives markups? Evolutionary pricing in an agent-based stock-flow consistent macroeconomic model By Pascal Seppecher; Isabelle Salle; Marc Lavoie
  8. Generic Machine Learning Inference on Heterogenous Treatment Effects in Randomized Experiments By Victor Chernozhukov; Mert Demirer; Esther Duflo; Iván Fernández-Val
  9. Lattice Studies of Gerrymandering Strategies By Kyle Gatesman; James Unwin
  10. Stock Price Correlation Coefficient Prediction with ARIMA-LSTM Hybrid Model By Hyeong Kyu Choi
  11. Take a Look Around: Using Street View and Satellite Images to Estimate House Prices By Stephen Law; Brooks Paige; Chris Russell
  12. Hamiltonian Flow Simulation of Rare Events By Raphaël Douady; Shohruh Miryusupov

  1. By: Francesco Lamperti (Laboratory of Economics and Management (LEM) - Scuola Superiore Sant'Anna [Pisa]); Andrea Roventini (OFCE - Observatoire Français des Conjonctures économiques - Institut d'Études Politiques [IEP] - Paris - Fondation Nationale des Sciences Politiques [FNSP], Laboratory of Economics and Management (LEM) - Scuola Superiore Sant'Anna [Pisa]); Amir Sani (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique)
    Abstract: Taking agent-based models (ABM) closer to the data is an open challenge. This paper explicitly tackles parameter-space exploration and calibration of ABMs by combining supervised machine learning and intelligent sampling to build a surrogate meta-model. The proposed approach provides a fast and accurate approximation of model behaviour, dramatically reducing computation time. In doing so, our machine-learning surrogate facilitates large-scale exploration of the parameter space, while providing a powerful filter to gain insights into the complex functioning of agent-based models. The algorithm introduced in this paper merges model simulation and output analysis into a surrogate meta-model, which substantially eases ABM calibration. We successfully apply our approach to the Brock and Hommes (1998) asset pricing model and to the "Island" endogenous growth model (Fagiolo and Dosi, 2003). Performance is evaluated against a relatively large out-of-sample set of parameter combinations, while employing different user-defined statistical tests for output analysis. The results demonstrate the capacity of machine-learning surrogates to facilitate fast and precise exploration of agent-based models' behaviour over their often rugged parameter spaces.
    Keywords: meta-model, agent-based model, surrogate, calibration, machine learning
    Date: 2017–04–03
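The surrogate idea can be illustrated with a deliberately simple sketch. This is not the authors' algorithm (which uses boosted surrogates and intelligent sampling): here a cheap analytic function stands in for an expensive ABM run, a k-nearest-neighbour average stands in for the machine-learning surrogate, and all numbers are illustrative.

```python
import random, math

def simulate(theta):
    # Stand-in for an expensive ABM run: returns one summary statistic.
    return math.sin(3 * theta) + 0.5 * theta

def fit_knn_surrogate(samples, k=3):
    # samples: list of (theta, output) pairs from actual simulations.
    def predict(theta):
        nearest = sorted(samples, key=lambda s: abs(s[0] - theta))[:k]
        return sum(y for _, y in nearest) / k
    return predict

random.seed(0)
# Limited budget of "true" simulations over the parameter range [0, 2].
design = [random.uniform(0, 2) for _ in range(200)]
samples = [(t, simulate(t)) for t in design]
surrogate = fit_knn_surrogate(samples)

# Screen a much larger candidate grid cheaply with the surrogate;
# only the most promising points would be re-run with the real model.
grid = [i * 0.002 for i in range(1000)]
best = max(grid, key=surrogate)
```

The computational savings come from the last step: the surrogate filters thousands of parameter values at negligible cost, so the expensive simulator is only invoked on the handful of candidates the surrogate flags.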
  2. By: Eduardo A. Haddad; Nancy Lozano-Gracia; Eduardo Germani; Renato S. Vieira; Shobei Nakaguma; Emmanuel Skoufias; Bianca Bianchi Alves
    Abstract: This paper evaluates the impacts of transportation investments and policies using a spatial computable general equilibrium (SCGE) model integrated with a travel demand model. To enhance our understanding of the distributional impacts of transportation improvements in Brazilian cities, we simulate the impact of different types of mobility investments in the São Paulo Metropolitan Region (SPMR). To explore further the income effects of infrastructure investments, we also conduct microsimulation exercises integrated with the SCGE results. We look at 10 different scenarios, ranging from a series of infrastructure-related interventions – considering the expansion of the mass-transit public transportation network – to policies that focus on monetary disincentives to car use. The simulation results suggest trade-offs between efficiency and equity.
    Keywords: General equilibrium; urban mobility; accessibility; productivity; transportation infrastructure.
    JEL: C63 C68 R13 R42
    Date: 2018–07–10
  3. By: Stefan Schmelzer (Institute for Advanced Studies, Vienna; Institute for Ecological Economics, WU - Vienna University of Economics and Business); Michael Miess (Institute for Advanced Studies, Vienna; Institute for Ecological Economics, WU - Vienna University of Economics and Business; Complexity Science Hub Vienna); Vedunka Kopecna (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic); Milan Scasny (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nabrezi 6, 111 01 Prague 1, Czech Republic)
    Abstract: We present a novel methodology to quantify the social costs and benefits (net social costs) of electric vehicles as an endogenous, demand-driven abatement technology in a general equilibrium framework. This new costing approach relates general equilibrium effects resulting from an increased market penetration of electric vehicles to the external environmental and health effects of the corresponding change in emissions. To this end, we develop a hybrid model combining a computable general equilibrium (CGE) model with a discrete choice (DC) model that is capable of depicting an endogenous, demand-driven uptake of alternative fuel vehicles. The discrete choice model of the consumer purchase decision between conventional, hybrid, plug-in hybrid, and electric vehicles is directly integrated into the CGE model. This hybrid CGE-DC model features a detailed accounting of vehicle fleet development, including yearly numbers of vehicle purchases and cohort depreciation. It depicts nine households differentiated by degree of urbanization and education, and accounts for detailed consumer preferences in the purchase of a passenger vehicle and in mode choice decisions. The hybrid CGE-DC model is additionally hard-linked to a bottom-up module for electricity production by several technologies, which provides input for an established impact pathway analysis to quantify the external costs relating to the changed composition of the vehicle fleet and of the technologies used to generate electricity. We apply this methodology to Austria as an empirical example, considering current measures and trends for the uptake of electric vehicles into the vehicle fleet. In particular, we quantify the net social costs of additional measures to foster the introduction of electromobility that are part of the current policy discussion in Austria, and thus provide a blueprint for further application in different national contexts.
    Keywords: hybrid CGE model; discrete choice; electric vehicles; environmental benefits
    JEL: C68 D12 D58 H22 H23 Q43 Q52 R42
    Date: 2018–08
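The discrete-choice stage can be sketched as a multinomial logit over powertrains. This is only a stylized stand-in for the paper's estimated purchase-decision model: the utilities, prices, and price coefficient below are hypothetical, and the real model is embedded in the CGE framework rather than run stand-alone.

```python
import math

def logit_shares(utilities):
    # Multinomial logit choice probabilities: P_j = exp(V_j) / sum_k exp(V_k).
    m = max(utilities.values())  # subtract the max for numerical stability
    expu = {j: math.exp(v - m) for j, v in utilities.items()}
    total = sum(expu.values())
    return {j: e / total for j, e in expu.items()}

# Hypothetical indirect utilities (prices in 1000 EUR, beta_price < 0).
beta_price = -0.05
prices = {"conventional": 25, "hybrid": 30, "plug-in": 35, "electric": 40}
base_utility = {"conventional": 1.0, "hybrid": 0.6, "plug-in": 0.4, "electric": 0.3}

def shares_with_subsidy(ev_subsidy):
    # A purchase subsidy lowers the effective price of the electric option only.
    u = {j: base_utility[j]
            + beta_price * (prices[j] - (ev_subsidy if j == "electric" else 0))
         for j in prices}
    return logit_shares(u)

no_sub = shares_with_subsidy(0)
with_sub = shares_with_subsidy(5)   # a hypothetical 5 000 EUR EV subsidy
```

In the hybrid model these shares would feed back into the CGE block as demand for each vehicle type; here the sketch only shows how the purchase mix shifts toward electric vehicles when a subsidy is applied.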
  4. By: Haojie Wang; Han Chen; Agus Sudjianto; Richard Liu; Qi Shen
    Abstract: The Libor market model is a mainstay term-structure model of interest rates for derivatives pricing, especially for Bermudan swaptions and other exotic Libor callable derivatives. In numerical implementations, the pricing of derivatives with Libor market models is mainly carried out with Monte Carlo simulation; the PDE grid approach is not feasible due to the curse of dimensionality. The standard Monte Carlo method for American/Bermudan swaption pricing uses regression to estimate the expected value as a linear combination of basis functions (Longstaff and Schwartz). However, the Monte Carlo method provides only a lower bound for the American option price. Another complexity is the computation of the sensitivities of the option, the so-called Greeks, which are fundamental for a trader's hedging activity. Recently, an alternative numerical method based on deep learning and backward stochastic differential equations (BSDEs) has appeared in a number of studies. For European-style options, feedforward deep neural networks (DNNs) have shown not only the feasibility but also the efficiency of obtaining both prices and numerical Greeks. In this paper, a new backward DNN solver is proposed for Bermudan swaptions. Our approach represents financial pricing problems in the form of high-dimensional stochastic optimal control problems, FBSDEs, or equivalent PDEs. We demonstrate that with a backward DNN the high-dimensional Bermudan swaption pricing and hedging problem can be solved effectively and efficiently. A comparison between Monte Carlo simulation and the new method for pricing vanilla interest rate options demonstrates the superior performance of the new method. We then use this method to calculate prices and Greeks of Bermudan swaptions as a prelude to other Libor callable derivatives.
    Date: 2018–07
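The Longstaff–Schwartz baseline mentioned in the abstract can be sketched for the simplest possible case: a Bermudan put on a single lognormal asset (not the multi-factor Libor market model of the paper), with a quadratic polynomial basis for the continuation-value regression. All parameters are illustrative, and the result is a lower-bound estimate, as the abstract notes.

```python
import math, random

def ols(X, y):
    # Least squares via the normal equations (small basis, Gaussian elimination).
    k = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]; b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

def lsmc_bermudan_put(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0,
                      n_steps=4, n_paths=20000, seed=0):
    """Lower-bound Bermudan put price via Longstaff-Schwartz regression."""
    random.seed(seed)
    dt, disc = T / n_steps, math.exp(-r * T / n_steps)
    paths = []
    for _ in range(n_paths):
        s, path = S0, []
        for _ in range(n_steps):
            s *= math.exp((r - 0.5 * sigma**2) * dt
                          + sigma * math.sqrt(dt) * random.gauss(0, 1))
            path.append(s)
        paths.append(path)
    def basis(s):                       # quadratic basis in moneyness
        return [1.0, s / K, (s / K) ** 2]
    value = [max(K - p[-1], 0.0) for p in paths]   # payoff at maturity
    for t in range(n_steps - 2, -1, -1):
        value = [v * disc for v in value]          # discount one period
        itm = [i for i in range(n_paths) if K - paths[i][t] > 0]
        if itm:
            # Regress continuation value on the basis over in-the-money paths.
            coef = ols([basis(paths[i][t]) for i in itm], [value[i] for i in itm])
            for i in itm:
                cont = sum(c * x for c, x in zip(coef, basis(paths[i][t])))
                if K - paths[i][t] > cont:
                    value[i] = K - paths[i][t]     # exercise now
    return disc * sum(value) / n_paths             # discount to today
```

For these parameters the estimate should land in the neighbourhood of the European Black–Scholes put value (about 5.57) and the American value (about 6.1), reflecting the limited set of exercise dates.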
  5. By: Hal Varian
    Abstract: Machine learning (ML) and artificial intelligence (AI) have been around for many years. However, in the last 5 years, remarkable progress has been made using multilayered neural networks in diverse areas such as image recognition, speech recognition, and machine translation. AI is a general purpose technology that is likely to impact many industries. In this chapter I consider how machine learning availability might affect the industrial organization of both firms that provide AI services and industries that adopt AI technology. My intent is not to provide an extensive overview of this rapidly-evolving area, but instead to provide a short summary of some of the forces at work and to describe some possible areas for future research.
    JEL: L0
    Date: 2018–07
  6. By: Pascal Seppecher (CEPN - Centre d'Economie de l'Université Paris Nord - UP13 - Université Paris 13 - USPC - Université Sorbonne Paris Cité - CNRS - Centre National de la Recherche Scientifique); Isabelle Salle (Utrecht School of Economics - Utrecht University [Utrecht]); Marc Lavoie (CEPN - Centre d'Economie de l'Université Paris Nord - UP13 - Université Paris 13 - USPC - Université Sorbonne Paris Cité - CNRS - Centre National de la Recherche Scientifique)
    Abstract: This paper studies coordination between firms in a multi-sectoral macroeconomic model with endogenous business cycles. Firms are both in competition and interdependent, and set their prices with a markup over unit costs. Markups are heterogeneous and evolve under market pressure. We observe systematic coordination among firms within each sector, and between sectors. The resulting pattern of relative prices is consistent with the labor theory of value. These emerging features are robust to technology shocks.
    Keywords: General interdependence, Pricing, Agent-based modeling
    Date: 2017–03–10
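The evolutionary pricing mechanism can be caricatured in a few lines. The paper's model is a full stock-flow consistent macroeconomy; this sketch keeps only the imitation-with-mutation dynamics on markups, with a made-up demand schedule that penalizes pricing above the market average.

```python
import random

def evolve_markups(n_firms=50, periods=200, unit_cost=1.0, seed=1):
    random.seed(seed)
    markups = [random.uniform(0.05, 0.8) for _ in range(n_firms)]
    for _ in range(periods):
        prices = [unit_cost * (1 + m) for m in markups]
        avg_p = sum(prices) / n_firms
        # Illustrative demand: sales shrink linearly when a firm prices
        # above the market average, so profit = margin * sales.
        profits = [(p - unit_cost) * max(0.0, 1.0 - 2.0 * (p - avg_p))
                   for p in prices]
        # Tournament imitation: each firm compares itself with a random
        # rival and adopts the more profitable markup, with a small mutation.
        new_markups = []
        for i in range(n_firms):
            j = random.randrange(n_firms)
            winner = markups[i] if profits[i] >= profits[j] else markups[j]
            new_markups.append(max(0.0, winner + random.gauss(0.0, 0.01)))
        markups = new_markups
    return markups

final = evolve_markups()
```

Under this particular demand schedule the profit-maximizing markup given the population mean m̄ is (1 + 2m̄)/4, so selection drives the population toward its fixed point of 0.5; market pressure disciplines markups without any firm solving an optimization problem.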
  7. By: Daniela Marella
    Abstract: The PC algorithm is one of the best-known procedures for Bayesian network structural learning. The structure is inferred by carrying out several independence tests on a database and building a Bayesian network in agreement with the test results. The PC algorithm is based on the assumption of independent and identically distributed observations. In practice, sample selection in surveys involves more complex sampling designs, so the standard test procedure is not valid even asymptotically. In order to avoid misleading results about the true causal structure, the sample selection process must be taken into account in the structural learning process. In this paper, a modified version of the PC algorithm is proposed for inferring causal structure from complex survey data. It is based on resampling techniques for finite populations. A simulation experiment is carried out, showing the robustness of the proposed algorithm with respect to departures from the assumptions and its good performance.
    Keywords: Bayesian network; complex survey data; pseudo-population; structural learning.
    JEL: C10 C12 C18 C83
    Date: 2018–07
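The building block of the PC algorithm is a battery of (conditional) independence tests, and the paper's contribution is to replace the i.i.d.-based null distribution with one that respects the sampling design via pseudo-population resampling. The sketch below shows only the resampling idea on a marginal independence test (a simple permutation test on the correlation), which is simpler than the finite-population scheme the paper proposes.

```python
import random

def resample_pvalue(x, y, n_resamples=500, seed=0):
    # Permutation test of marginal independence between x and y: the null
    # distribution of the test statistic is generated by resampling rather
    # than taken from an i.i.d.-based asymptotic formula.
    rng = random.Random(seed)
    def corr(a, b):
        ma, mb = sum(a) / len(a), sum(b) / len(b)
        num = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
        den = (sum((ai - ma) ** 2 for ai in a)
               * sum((bi - mb) ** 2 for bi in b)) ** 0.5
        return num / den
    obs = abs(corr(x, y))
    y_perm = list(y)
    extreme = 0
    for _ in range(n_resamples):
        rng.shuffle(y_perm)
        if abs(corr(x, y_perm)) >= obs:
            extreme += 1
    return (extreme + 1) / (n_resamples + 1)
```

In a PC-style procedure, a p-value like this (computed conditionally on separating sets) decides whether an edge between two variables is removed from the graph skeleton.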
  8. By: Victor Chernozhukov; Mert Demirer; Esther Duflo; Iván Fernández-Val
    Abstract: We propose strategies to estimate and make inference on key features of heterogeneous effects in randomized experiments. These key features include best linear predictors of the effects using machine learning proxies, average effects sorted by impact groups, and average characteristics of the most and least impacted units. The approach is valid in high-dimensional settings, where the effects are proxied by machine learning methods. We post-process these proxies into estimates of the key features. Our approach is generic: it can be used in conjunction with penalized methods, deep and shallow neural networks, canonical and new random forests, boosted trees, and ensemble methods. It does not rely on strong assumptions; in particular, we do not require conditions for consistency of the machine learning methods. Estimation and inference rely on repeated data splitting to avoid overfitting and achieve validity. For inference, we take medians of p-values and medians of confidence intervals, resulting from many different data splits, and then adjust their nominal level to guarantee uniform validity. This variational inference method is shown to be uniformly valid and quantifies the uncertainty coming from both parameter estimation and data splitting. An empirical application to the impact of micro-credit on economic development illustrates the use of the approach in randomized experiments.
    JEL: C18 C21 D14 G21 O16
    Date: 2018–06
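The split-robust inference idea can be sketched in a minimal form. This is not the authors' full procedure for heterogeneous effects: here the per-split statistic is just a two-sample z-test on a random half of a simulated experiment. The point is the aggregation: one p-value per random split, the median across splits, and a comparison against α/2 rather than α, the kind of level adjustment that makes median aggregation valid.

```python
import math, random, statistics

def z_test_pvalue(x, y):
    # Two-sided two-sample z-test (normal approximation).
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    z = (mx - my) / math.sqrt(vx / nx + vy / ny)
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def median_split_test(outcomes, treatment, n_splits=21, alpha=0.05, seed=0):
    rng = random.Random(seed)
    idx = list(range(len(outcomes)))
    pvals = []
    for _ in range(n_splits):
        rng.shuffle(idx)
        half = idx[: len(idx) // 2]      # the inference half of one split
        x = [outcomes[i] for i in half if treatment[i] == 1]
        y = [outcomes[i] for i in half if treatment[i] == 0]
        pvals.append(z_test_pvalue(x, y))
    p_med = statistics.median(pvals)
    # Compare the median p-value against alpha/2, not alpha.
    return p_med, p_med <= alpha / 2

# Simulated randomized experiment with a true treatment effect of 0.5.
rng = random.Random(1)
treatment = [rng.randrange(2) for _ in range(2000)]
outcomes = [0.5 * t + rng.gauss(0, 1) for t in treatment]
p_med, reject = median_split_test(outcomes, treatment)
```

The median makes the conclusion insensitive to any single unlucky split, which is the uncertainty-from-splitting point the abstract emphasizes.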
  9. By: Kyle Gatesman; James Unwin
    Abstract: We propose three novel gerrymandering algorithms which incorporate the spatial distribution of voters with the aim of constructing gerrymandered, equal-population, connected districts. Moreover, we develop lattice models of voter distributions, based on analogies to electrostatic potentials, in order to compare different gerrymandering strategies. Due to the probabilistic population fluctuations inherent to our voter models, Monte Carlo methods can be applied to the districts constructed via our gerrymandering algorithms. Through Monte Carlo studies we quantify the effectiveness of each of our gerrymandering algorithms and we also argue that gerrymandering strategies which do not include spatial data lead to (legally prohibited) highly disconnected districts. Of the three algorithms we propose, two are based on different strategies for packing opposition voters, and the third is a new approach to algorithmic gerrymandering based on genetic algorithms, which automatically guarantees that all districts are connected. Furthermore, we use our lattice voter model to examine the effectiveness of isoperimetric quotient tests and our results provide further quantitative support for implementing compactness tests in real-world political redistricting.
    Date: 2018–08
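Connectivity, which the paper's genetic-algorithm approach guarantees by construction, is straightforward to verify after the fact on a lattice: a district is connected if and only if a breadth-first search from any of its cells reaches all of them. The toy district plan below is illustrative.

```python
from collections import deque

def district_connected(assignment, district):
    # assignment: dict {(row, col): district_id}. BFS over 4-neighbour
    # adjacency tests whether the district's cells form one connected piece.
    cells = {c for c, d in assignment.items() if d == district}
    if not cells:
        return True
    start = next(iter(cells))
    seen, queue = {start}, deque([start])
    while queue:
        r, c = queue.popleft()
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in cells and nb not in seen:
                seen.add(nb)
                queue.append(nb)
    return seen == cells

# A 2x4 lattice split into two districts: 'A' is connected, 'B' is not
# (cell (1,0) is cut off from the other 'B' cells).
plan = {(0, 0): 'A', (0, 1): 'A', (0, 2): 'B', (0, 3): 'B',
        (1, 0): 'B', (1, 1): 'A', (1, 2): 'A', (1, 3): 'B'}
```

A check like this is what rules out the legally prohibited disconnected districts that, per the abstract, non-spatial gerrymandering strategies tend to produce.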
  10. By: Hyeong Kyu Choi
    Abstract: Predicting the price correlation of two assets for future time periods is important in portfolio optimization. We apply LSTM recurrent neural networks (RNNs) to predicting the stock price correlation coefficient of two individual stocks. RNNs are competent in understanding temporal dependencies, and the use of LSTM cells further enhances their long-term predictive properties. To encompass both linearity and nonlinearity in the model, we adopt the ARIMA model as well. The ARIMA model filters linear tendencies in the data and passes the residual value on to the LSTM model. The ARIMA-LSTM hybrid model is tested against traditional predictive financial models such as the full historical model, the constant correlation model, the single-index model, and the multi-group model. In our empirical study, the predictive ability of the ARIMA-LSTM model turned out to be superior to all the other financial models by a significant margin. Our work implies that it is worth considering the ARIMA-LSTM model for forecasting correlation coefficients in portfolio optimization.
    Date: 2018–08
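The hybrid's division of labour (a linear filter first, then a nonlinear learner on what remains) can be sketched without any deep-learning library. Here the ARIMA stage is reduced to a least-squares AR(1) fit and the LSTM stage is left as a comment; the synthetic series and its coefficients are made up.

```python
import math, random

def fit_ar1(series):
    # Least-squares AR(1): x_t ≈ a + b * x_{t-1}. This is the linear stage,
    # standing in for a full ARIMA fit.
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

random.seed(0)
# Synthetic series: linear AR structure plus a nonlinear (sinusoidal) component.
series = [0.0]
for t in range(1, 500):
    series.append(0.7 * series[-1] + 0.3 * math.sin(t / 2)
                  + random.gauss(0, 0.05))

a, b = fit_ar1(series)
linear_pred = [a + b * x for x in series[:-1]]
residuals = [y - p for y, p in zip(series[1:], linear_pred)]
# In the hybrid model, these residuals would be fed to an LSTM; they are
# exactly the nonlinear remainder the second stage is asked to explain.
```

The hand-off is the whole trick: the linear model absorbs what it can, and the sequence model only has to learn the structure left in the residuals.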
  11. By: Stephen Law; Brooks Paige; Chris Russell
    Abstract: When individuals purchase a home, they simultaneously purchase its structural features, its accessibility to work, and the neighborhood amenities. Some amenities, such as air quality, are measurable, whilst others, such as the prestige or the visual impression of a neighborhood, are difficult to quantify. Despite the well-known impacts intangible housing features have on house prices, limited attention has been given to systematically quantifying these difficult-to-measure amenities. Two issues have led to this neglect: few quantitative methods exist that can measure the urban environment, and the collection of such data is both costly and subjective. We show that street image and satellite image data can capture these urban qualities and improve the estimation of house prices. We propose a pipeline that uses a deep neural network model to automatically extract visual features from images to estimate house prices in London, UK. We make use of traditional housing features such as age, size and accessibility, as well as visual features from Google Street View images and Bing aerial images, in estimating the house price model. We find encouraging results where learning to characterize the urban quality of a neighborhood improves house price prediction, even when generalizing to previously unseen London boroughs. We explore the use of non-linear versus linear methods to fuse these cues with conventional models of house pricing, and show how the interpretability of linear models allows us to directly extract the visual desirability of neighborhoods as proxy variables that are both of interest in their own right and could be used as inputs to other econometric methods. This is particularly valuable because, once the network has been trained, it can be applied elsewhere, allowing us to generate vivid, dense maps of the desirability of London streets.
    Date: 2018–07
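The linear fusion variant can be sketched as an ordinary hedonic regression with an image-derived score appended to the conventional features. Everything here is synthetic and hypothetical (a single "size" feature and a made-up visual desirability score); the point is that in a linear model the coefficient on the visual score is directly interpretable, which is the interpretability argument the abstract makes.

```python
import random

def ols3(rows, y):
    # Solve a 3-parameter least-squares fit via the normal equations.
    k = 3
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for c in range(k):
        piv = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[piv] = A[piv], A[c]; b[c], b[piv] = b[piv], b[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            A[r] = [ar - f * ac for ar, ac in zip(A[r], A[c])]
            b[r] -= f * b[c]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

random.seed(2)
n = 500
size = [random.uniform(40, 150) for _ in range(n)]    # floor area, m^2
visual = [random.uniform(0, 1) for _ in range(n)]     # image-derived score
# Synthetic log prices with known coefficients (0.008 on size, 0.4 on visual).
log_price = [11 + 0.008 * s + 0.4 * v + random.gauss(0, 0.05)
             for s, v in zip(size, visual)]
beta = ols3([[1.0, s, v] for s, v in zip(size, visual)], log_price)
```

A regression like this recovers the visual score's contribution to log price as a single coefficient, which can then be mapped across neighborhoods as a desirability proxy.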
  12. By: Raphaël Douady (CNRS - Centre National de la Recherche Scientifique, CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique); Shohruh Miryusupov (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique)
    Abstract: Hamiltonian Flow Monte Carlo (HFMC) methods have been implemented in engineering, biology and chemistry. HFMC makes large gradient-based steps to rapidly explore the state space. The application of Hamiltonian dynamics makes it possible to estimate rare events and to sample from target distributions defined as changes of measure. Our estimates demonstrate the variance reduction achieved by the presented algorithm and its efficiency with respect to standard Monte Carlo and an interacting particle system (IPS). We tested the algorithm on the case of barrier option pricing.
    Keywords: Hamiltonian system,Hamiltonian Flow Monte Carlo, Particle Monte Carlo, Sequential Monte Carlo, Monte Carlo, rare events, option pricing, diffusion dynamics
    Date: 2017
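The workhorse inside Hamiltonian-flow methods is the leapfrog integrator, which is volume-preserving, time-reversible, and nearly energy-conserving; these properties are what allow the large gradient-based steps the abstract describes. The sketch below integrates one trajectory for a standard-normal target (not the authors' rare-event estimator or the barrier-option setting).

```python
def leapfrog(q, p, grad_u, eps, n_steps):
    # Symplectic leapfrog integration of H(q, p) = U(q) + p^2 / 2:
    # half-step on momentum, full steps on position, half-step to finish.
    p -= 0.5 * eps * grad_u(q)
    for _ in range(n_steps - 1):
        q += eps * p
        p -= eps * grad_u(q)
    q += eps * p
    p -= 0.5 * eps * grad_u(q)
    return q, -p   # the momentum flip makes the proposal reversible

# Standard-normal target: U(q) = q^2 / 2, so grad U(q) = q.
def U(q):
    return 0.5 * q * q

def grad_u(q):
    return q

q, p = 1.0, 0.5
H0 = U(q) + 0.5 * p * p
q1, p1 = leapfrog(q, p, grad_u, eps=0.1, n_steps=20)
H1 = U(q1) + 0.5 * p1 * p1
```

Because the energy error stays O(eps^2) over the whole trajectory, a Metropolis correction built on top of this flow accepts proposals with high probability even after many steps, which is what lets Hamiltonian methods travel far through the state space per iteration.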

General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.