nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒10‒01
sixteen papers chosen by
Stan Miles
Thompson Rivers University

  1. A proof that artificial neural networks overcome the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations By Philipp Grohs; Fabian Hornung; Arnulf Jentzen; Philippe von Wurstemberger
  2. Artificial Neural Network Based Chaotic Generator Design for The Prediction of Financial Time Series By Lei Zhang
  3. multishell: Running simulations efficiently using Stata’s shell command By Jan Ditzen
  4. Agricultural Technology Assessment for Smallholder Farms in Developing Countries: An Analysis using a Farm Simulation Model (FARMSIM) By Bizimana, Jean-Claude; Richardson, James W.
  5. No man is an island: the impact of heterogeneity and local interactions on macroeconomic dynamics By Mattia Guerini; Mauro Napoletano; Andrea Roventini
  6. Labor tax reductions in Europe: The role of property taxation By Bielecki, Marcin; Stähler, Nikolai
  7. Deep Reinforcement Learning in High Frequency Trading By Prakhar Ganesh; Puneet Rakheja
  8. Measuring Systematic Risk with Neural Network Factor Model By Jeonggyu Huh
  9. The NEU Meta-Algorithm for Geometric Learning with Applications in Finance By Anastasis Kratsios; Cody B. Hyndman
  10. Assessing the Real Value of H2A Farm Labor Inputs: A Simulation-Optimization Approach By Rusiana, Hofner D.; Escalante, Cesar L.
  11. Agent-based Model of Bt Corn Adoption and Insect Resistance Management By Saikai, Yuji; Mitchell, Paul D.
  12. Predicting Credit Demand with ARMS: A Machine Learning Approach By Ifft, Jennifer E.; Kuhns, Ryan; Patrick, Kevin T.
  13. Semiparametric Panel Data Using Neural Networks By Crane-Droesch, Andrew
  14. Flowbca: A flow-based cluster algorithm in Stata By J. Meekes; W.H.J. Hassink
  15. BSE: A Minimal Simulation of a Limit-Order-Book Stock Exchange By Dave Cliff
  16. A Language for Large-Scale Collaboration in Economics: A Streamlined Computational Representation of Financial Models By Jorge Faleiro

  1. By: Philipp Grohs; Fabian Hornung; Arnulf Jentzen; Philippe von Wurstemberger
    Abstract: Artificial neural networks (ANNs) have been used very successfully in numerical simulations for a range of computational problems, from image classification/image recognition, speech recognition, time series analysis, game intelligence, and computational advertising to numerical approximations of partial differential equations (PDEs). Such simulations suggest that ANNs can approximate high-dimensional functions very efficiently and, in particular, indicate that ANNs seem able to overcome the curse of dimensionality when approximating the high-dimensional functions arising in the computational problems named above. The scientific literature also contains a series of rigorous mathematical approximation results for ANNs. Some of these results prove convergence without convergence rates, and some even rigorously establish convergence rates, but only in a few special cases do mathematical results rigorously explain the empirical success of ANNs in approximating high-dimensional functions. The key contribution of this article is to show that ANNs can efficiently approximate high-dimensional functions in the case of numerical approximations of Black-Scholes PDEs. More precisely, this work reveals that the number of parameters an ANN requires to approximate the solution of the Black-Scholes PDE grows at most polynomially in both the reciprocal of the prescribed approximation accuracy $\varepsilon > 0$ and the PDE dimension $d \in \mathbb{N}$, and we thereby prove, for the first time, that ANNs do indeed overcome the curse of dimensionality in the numerical approximation of Black-Scholes PDEs.
    Date: 2018–09
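To make the curse of dimensionality concrete, here is a minimal, hedged sketch (not the authors' ANN construction): in d = 1 the Black-Scholes call price has a closed form, and a plain Monte Carlo estimator, whose O(n^(-1/2)) error rate is independent of the dimension, recovers it. All parameter values below are illustrative assumptions.

```python
import math
import random

def norm_cdf(x):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S0, K, r, sigma, T):
    # Closed-form Black-Scholes price of a European call (d = 1).
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S0 * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def mc_call(S0, K, r, sigma, T, n_paths=200_000, seed=42):
    # Plain Monte Carlo under geometric Brownian motion; its statistical
    # error decays like n^(-1/2) regardless of the PDE dimension d.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n_paths

exact = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
approx = mc_call(100.0, 100.0, 0.05, 0.2, 1.0)
```

Grid-based PDE solvers, by contrast, need a number of points that grows exponentially in d, which is the gap the paper's polynomial ANN bound closes.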
  2. By: Lei Zhang (University of Regina)
    Abstract: […] series. The ANN architecture is usually designed and optimized by trial and error using a given training data set, and achieving good training performance generally requires big data. Financial time series are subject to highly complex external inputs, and their dynamic features can change quickly and unpredictably. The aim of this research is to design an adaptive ANN architecture that can be trained in real time on short time series for near-future prediction. An ANN-based chaotic system generator is designed for the simulation and analysis of the dynamic features of financial time series.
    Keywords: Artificial Neural Network (ANN), chaotic generator, financial time series, prediction, optimization
    JEL: C45 C52 C61
    Date: 2018–06
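As a hedged stand-in for the paper's ANN-based chaotic generator (whose design the abstract does not specify), the sketch below uses the classic logistic map, a standard chaotic benchmark, to generate a short series and shape it into the windowed input-target pairs on which a one-step-ahead ANN predictor would be trained.

```python
def logistic_map_series(x0=0.3, r=3.9, n=500):
    # The logistic map x_{t+1} = r * x_t * (1 - x_t) is chaotic for r near 3.9,
    # a common stand-in for the irregular dynamics of financial series.
    xs = [x0]
    for _ in range(n - 1):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

def make_windows(series, lag=3):
    # Turn the series into (input window, next value) pairs, the usual
    # supervised format for training a one-step-ahead predictor.
    X = [series[i:i + lag] for i in range(len(series) - lag)]
    y = [series[i + lag] for i in range(len(series) - lag)]
    return X, y

series = logistic_map_series()
X, y = make_windows(series)
```

A short series like this is exactly the "small data, fast-changing dynamics" regime the abstract targets.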
  3. By: Jan Ditzen (Heriot-Watt University)
    Abstract: The package multishell is intended to speed up simulations by making use of multi-core processors and Stata’s shell command. In a first step, one or more do files are converted into batch files and added to a queue. After the main command is started, the current instance of Stata acts as an organiser and works through the queue, allocating the batch files to a pre-set number of Stata instances running in parallel. multishell has several distinct features. If do files include forvalues and foreach loops, multishell dissects the loops and creates a new do file for each combination, which is added to the queue. This allows processor power to be allocated and used efficiently. multishell can also connect two or more computers into a cluster: it then allocates part of the queue to each computer, and a simulation runs in parallel on multiple machines. Computational power is used efficiently and time is saved.
    Date: 2018–10–15
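multishell itself is a Stata package; as a hedged illustration of the same queue-and-dispatch pattern (not multishell's code), the Python sketch below expands a parameter grid into independent jobs, like multishell's loop dissection, and runs them on a worker pool. The function name and grid values are illustrative assumptions.

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product

def run_one(params):
    # Stand-in for one simulation "do file": returns a trivial statistic
    # for its parameter combination.
    n, rho = params
    return (n, rho, n * rho)

# Analogue of multishell's loop dissection: every forvalues/foreach
# combination becomes one independent job in the queue.
grid = list(product([100, 500], [0.1, 0.5, 0.9]))

# The pool plays the role of the parallel Stata instances. For real
# CPU-bound simulations a process pool (or several machines, as in
# multishell's cluster mode) would replace the thread pool.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_one, grid))
```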
  4. By: Bizimana, Jean-Claude; Richardson, James W.
    Abstract: The rural population in developing countries depends on agriculture. However, in many of these countries agricultural productivity remains low, with episodes of famine in drought-prone areas. One option for increasing agricultural productivity is the adoption and use of improved agricultural technologies and management systems. Because agriculture is a relatively high-risk business, owing to factors related to production, marketing, and finance, it requires risk-mitigating strategies. Several models used to evaluate the adoption of agricultural technologies focus mainly on assessing the ex-post impact of technology without necessarily quantifying the profit and risk associated with adoption. This paper introduces a farm simulation model (FARMSIM) that evaluates the potential economic and nutritional impacts of new agricultural technologies before they are adopted (ex ante). FARMSIM is a Monte Carlo simulation model that simultaneously evaluates a baseline and an alternative farming technology. In this study, the model is used to analyze the impact of adopting small-scale irrigation technologies and fertilizers on the farm income and nutrition of smallholder farmers in Robit kebele, Amhara region of Ethiopia. The farming technologies under study comprise water-lifting technologies (pulley and tank, rope and washer pump, gasoline/diesel motor pump, and a solar pump) and the use of fertilizers. The key output variables (KOVs) are the probability of positive annual net cash income and ending cash reserves, the probability of a positive net present value, and a benefit-cost ratio greater than one. For nutrition, the KOVs relate to the probability that consumption exceeds an adult's average daily minimum requirements for calories, protein, fat, calcium, iron, and vitamin A.
The application of recommended fertilizers on grain and vegetable crops, alongside the use of a motor pump to irrigate vegetables and fodder, had the highest net present value of the scenarios compared. Similar results were observed for net cash farm income and ending cash reserves. However, the most feasible and profitable scenario was the pulley system, which had the highest benefit-cost ratio. The solar pump system had the lowest benefit-cost ratio, most likely due to its high initial investment cost. As for nutrition, the simulation results show an increase in the quantities of all nutrition variables available to the farm family under all alternative scenarios. However, the daily minimum requirements per adult equivalent were met only for calories, protein, iron, and vitamin A; deficiencies were observed for fat and calcium.
    Keywords: Agricultural and Food Policy, International Development, Risk and Uncertainty
    Date: 2018–01–17
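FARMSIM's code is not reproduced in the abstract; the minimal Python sketch below only illustrates the Monte Carlo logic behind KOVs such as the probability of a positive NPV and the benefit-cost ratio. The income distribution, discount rate, and investment figure are invented, purely illustrative assumptions, not FARMSIM parameters.

```python
import random

def simulate_kovs(n_draws=10_000, invest=1_000.0, rate=0.10, years=5, seed=7):
    # Monte Carlo sketch of FARMSIM-style key output variables (KOVs):
    # annual net cash income is drawn from an illustrative normal
    # distribution; NPV discounts those draws against an upfront investment.
    rng = random.Random(seed)
    positive_npv = 0
    bc_ratios = []
    for _ in range(n_draws):
        incomes = [rng.gauss(350.0, 150.0) for _ in range(years)]
        pv = sum(inc / (1.0 + rate) ** (t + 1) for t, inc in enumerate(incomes))
        if pv - invest > 0:
            positive_npv += 1
        bc_ratios.append(pv / invest)
    return positive_npv / n_draws, sum(bc_ratios) / n_draws

p_positive_npv, mean_bc_ratio = simulate_kovs()
```

Running the same simulator under a baseline and an alternative technology, then comparing the two KOV distributions, is the ex-ante comparison the abstract describes.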
  5. By: Mattia Guerini (Scuola Superiore Sant'Anna); Mauro Napoletano (Observatoire français des conjonctures économiques); Andrea Roventini (Laboratory of Economics and Management (LEM))
    Abstract: We develop an agent-based model in which heterogeneous firms and households interact in labor and good markets according to centralized or decentralized search and matching protocols. As the model has a deterministic backbone and a full-employment equilibrium, it can be directly compared to Dynamic Stochastic General Equilibrium (DSGE) models. We study the effects of negative productivity shocks by way of impulse-response functions (IRF). Simulation results show that when search and matching are centralized, the economy is always able to return to the full employment equilibrium and IRFs are similar to those generated by DSGE models. However, when search and matching are local, coordination failures emerge and the economy persistently deviates from full employment. Moreover, agents display persistent heterogeneity. Our results suggest that macroeconomic models should explicitly account for agents’ heterogeneity and direct interactions.
    Keywords: Agent based model; Local interactions; Heterogeneous agents; DSGE Model
    JEL: E32 E37
    Date: 2018–01
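The contrast between centralized and local matching can be illustrated with a deliberately tiny toy (not the authors' agent-based model): a central clearing house matches every worker to a vacancy, while uncoordinated local applications leave some vacancies unfilled even when there are enough of them. All numbers below are illustrative assumptions.

```python
import random

def centralized_matching(workers, vacancies):
    # A central clearing house matches as many pairs as possible.
    return min(workers, vacancies)

def local_matching(workers, vacancies, applications=2, seed=11):
    # Each worker applies to a few randomly chosen firms; each firm with at
    # least one applicant hires one. Uncoordinated applications pile up at
    # some firms while others receive none: a coordination failure.
    rng = random.Random(seed)
    received = [0] * vacancies
    for _ in range(workers):
        for firm in rng.sample(range(vacancies), k=min(applications, vacancies)):
            received[firm] += 1
    return sum(1 for r in received if r > 0)

matched_central = centralized_matching(100, 100)
matched_local = local_matching(100, 100)
```

With 100 workers and 100 vacancies the centralized protocol reaches full employment, while the local protocol persistently falls short, echoing the paper's finding.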
  6. By: Bielecki, Marcin; Stähler, Nikolai
    Abstract: We use a New Keynesian DSGE model with search frictions on the housing market to evaluate how financing a labor tax reduction through higher property taxation affects the real economy and welfare. Search on the housing market enables us to explicitly model stocks and flows, which is necessary to differentiate between recurrent property taxes (levied on stocks) and property transaction taxes (levied on flows). We find that using recurrent property taxation as the financing instrument outperforms the other instruments, although all policy measures increase aggregate economy-wide welfare. Our simulations suggest that property transaction taxation is the least favorable financing instrument.
    Keywords: Search Frictions in Housing Markets, Property Taxation, Tax Reform, General Equilibrium
    JEL: E51 E6 R31 K34
    Date: 2018
  7. By: Prakhar Ganesh; Puneet Rakheja
    Abstract: The ability to make precise and fast predictions of stock price movements is the key to profitability in High Frequency Trading. The main objective of this paper is to propose a novel way of modeling the high frequency trading problem using Deep Reinforcement Learning and to argue why Deep RL has great potential in the field of High Frequency Trading. We analyze the model's performance based on its prediction accuracy as well as its prediction speed across full-day trading simulations.
    Date: 2018–09
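The paper's deep RL architecture is not specified in the abstract; as a heavily simplified, hedged stand-in, the sketch below runs tabular Q-learning on a synthetic two-state momentum market and learns to trade in the direction of the last tick. Everything here (states, rewards, transition probabilities) is an invented toy, not the authors' setup.

```python
import random

# Toy momentum market: the "state" is the sign of the last tick, the
# actions are buy (+1) and sell (-1), and the next tick repeats the last
# one with probability 0.8.
rng = random.Random(3)
ACTIONS = (-1, 1)
Q = {(s, a): 0.0 for s in (-1, 1) for a in ACTIONS}
alpha, epsilon = 0.1, 0.1

state = 1
for _ in range(20_000):
    # Epsilon-greedy action selection.
    if rng.random() < epsilon:
        a = rng.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda act: Q[(state, act)])
    next_tick = state if rng.random() < 0.8 else -state
    reward = a * next_tick                              # profit if position matches move
    Q[(state, a)] += alpha * (reward - Q[(state, a)])   # one-step (bandit-style) update
    state = next_tick

# The learned policy should follow momentum: buy after an uptick, sell after a downtick.
policy = {s: max(ACTIONS, key=lambda act: Q[(s, act)]) for s in (-1, 1)}
```

A deep RL system replaces the lookup table Q with a neural network so that rich order-book states can be handled, which is the step the paper takes.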
  8. By: Jeonggyu Huh
    Abstract: In this paper, we measure systematic risk with a new nonparametric factor model, the neural network factor model. Suitable factors for systematic risk can be found naturally by feeding daily returns on a wide range of assets into a bottleneck network. Unlike parametric factor models, the network-based model is not tied to a probabilistic structure, and it needs no feature engineering because it selects notable features by itself. In addition, we compare the performance of our model and the existing models using 20 years of data on S&P 100 components. Although the new model cannot outperform the best parametric factor models, owing to limitations of variational inference (the estimation method used in this study), it is still noteworthy in that it achieves performance on par with the best comparable models without any prior knowledge.
    Date: 2018–09
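The bottleneck idea can be sketched in a hedged, linear miniature: compress each day's cross-section of simulated returns into a single "market" number (a width-1 bottleneck) and regress each asset on it, recovering systematic-risk loadings. This is only a stand-in for the paper's neural network bottleneck, and the one-factor simulated data are an assumption.

```python
import random

rng = random.Random(0)

# Simulated daily returns for 20 assets driven by one latent market factor
# plus idiosyncratic noise, standing in for the S&P 100 panel in the paper.
n_days, n_assets = 750, 20
betas = [0.5 + rng.random() for _ in range(n_assets)]    # true loadings
factor = [rng.gauss(0.0, 1.0) for _ in range(n_days)]    # latent factor
returns = [[betas[a] * f + 0.3 * rng.gauss(0.0, 1.0) for a in range(n_assets)]
           for f in factor]

# Width-1 "bottleneck": the equal-weighted market return compresses each
# day's cross-section to a single number.
market = [sum(day) / n_assets for day in returns]
mbar = sum(market) / n_days
mvar = sum((m - mbar) ** 2 for m in market) / n_days

def systematic_beta(asset):
    # OLS slope of asset returns on the compressed factor = systematic risk.
    rbar = sum(day[asset] for day in returns) / n_days
    cov = sum((returns[t][asset] - rbar) * (market[t] - mbar)
              for t in range(n_days)) / n_days
    return cov / mvar

est = [systematic_beta(a) for a in range(n_assets)]

# Sanity check: estimated betas should line up with the true loadings.
ebar = sum(est) / n_assets
bmean = sum(betas) / n_assets
corr = (sum((e - ebar) * (b - bmean) for e, b in zip(est, betas))
        / (sum((e - ebar) ** 2 for e in est) ** 0.5
           * sum((b - bmean) ** 2 for b in betas) ** 0.5))
```

A neural bottleneck generalizes this by letting the compression and the loadings be nonlinear and learned jointly.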
  9. By: Anastasis Kratsios; Cody B. Hyndman
    Abstract: We introduce a meta-algorithm, called non-Euclidean upgrading (NEU), which learns algorithm-specific geometries to improve the training and validation set performance of a wide class of learning algorithms. Our approach is based on iteratively performing local reconfigurations of the space in which the data lie. These reconfigurations build universal approximation and universal reconfiguration properties into the new algorithm being learned. This allows any set of features to be learned by the new algorithm to arbitrary precision. The training and validation set performance of NEU is investigated through implementations predicting the relationship between select stock prices as well as finding low-dimensional representations of the German Bond yield curve.
    Date: 2018–08
  10. By: Rusiana, Hofner D.; Escalante, Cesar L.
    Keywords: Agribusiness, Production Economics, Labor and Human Capital
    Date: 2017–06–30
  11. By: Saikai, Yuji; Mitchell, Paul D.
    Keywords: Agricultural and Food Policy, Production Economics, Agribusiness
    Date: 2017–06–30
  12. By: Ifft, Jennifer E.; Kuhns, Ryan; Patrick, Kevin T.
    Keywords: Agricultural Finance, Agribusiness, Agricultural and Food Policy
    Date: 2017–06–15
  13. By: Crane-Droesch, Andrew
    Keywords: Research Methods/Statistical Methods, Land Economics/Use, Productivity Analysis
    Date: 2017–06–15
  14. By: J. Meekes; W.H.J. Hassink
    Abstract: In this article, we introduce the Stata implementation of a flow-based cluster algorithm written in Mata. The main purpose of the flowbca command is to identify clusters based on relational data of flows. We illustrate the command by providing multiple applications, from the research fields of economic geography, industrial input-output analysis, and social network analysis.
    Keywords: clusters, aggregation, flows
    Date: 2017–07
  15. By: Dave Cliff
    Abstract: This paper describes the design, implementation, and successful use of the Bristol Stock Exchange (BSE), a novel minimal simulation of a centralised financial market based on a Limit Order Book (LOB), such as is common in major stock exchanges. Construction of BSE was motivated by the fact that most of the world's major financial markets have automated, with trading activity that was previously the responsibility of human traders now being performed by high-speed autonomous automated trading systems. Research aimed at understanding the dynamics of this new style of financial market is hampered by the fact that no operational real-world exchange is ever likely to allow experimental probing of that market while it is open and running live, forcing researchers to work primarily from time-series of past trading data. Similarly, university-level education of the engineers who can create next-generation automated trading systems requires that they have hands-on learning experience in a sufficiently realistic teaching environment. BSE as described here addresses both those needs: it has been successfully used for teaching and research in a leading UK university since 2012, and the BSE program code is freely available as open-source on GitHub.
    Date: 2018–09
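BSE's own code is open source on GitHub; the sketch below is a separate, minimal Python illustration of the core LOB mechanic it simulates: price priority with time as the tie-breaker, and incoming orders trading whenever they cross the book. The class and its API are invented for illustration, not BSE's interface.

```python
import heapq

class MiniLOB:
    # A minimal limit order book: best bid = highest price, best ask = lowest.
    # Heaps give price priority; an insertion counter breaks ties by time.
    def __init__(self):
        self.bids = []          # entries (-price, time, qty): max-heap on price
        self.asks = []          # entries (price, time, qty): min-heap on price
        self.t = 0
        self.trades = []        # executed (price, qty) pairs

    def submit(self, side, price, qty):
        self.t += 1
        if side == "buy":
            # Cross against asks priced at or below the buy limit.
            while qty and self.asks and self.asks[0][0] <= price:
                ap, at, aq = heapq.heappop(self.asks)
                traded = min(qty, aq)
                self.trades.append((ap, traded))
                qty -= traded
                if aq > traded:
                    heapq.heappush(self.asks, (ap, at, aq - traded))
            if qty:
                heapq.heappush(self.bids, (-price, self.t, qty))
        else:
            # Cross against bids priced at or above the sell limit.
            while qty and self.bids and -self.bids[0][0] >= price:
                nbp, bt, bq = heapq.heappop(self.bids)
                traded = min(qty, bq)
                self.trades.append((-nbp, traded))
                qty -= traded
                if bq > traded:
                    heapq.heappush(self.bids, (-nbp, bt, bq - traded))
            if qty:
                heapq.heappush(self.asks, (price, self.t, qty))

lob = MiniLOB()
lob.submit("sell", 101, 5)
lob.submit("sell", 100, 5)
lob.submit("buy", 100, 3)    # crosses the ask at 100, leaving 2 units resting
```

A simulator like BSE wraps such a book with trader agents and a market session loop; this sketch stops at the matching rule itself.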
  16. By: Jorge Faleiro
    Abstract: This paper introduces Sigma, a domain-specific computational representation for large-scale collaboration in the field of economics. A computational representation is not a programming language or a software platform. It is a domain-specific representation system based on three specific elements: facets, contributions, and constraints of data. Facets are definable aspects that make up a subject or an object. Contributions are shareable, formal pieces of evidence, carrying specific properties and produced as a result of a crowd-based scientific investigation. Constraints of data are restrictions defining domain-specific rules of association between entities and relationships. A computational representation serves as the layer of abstraction required to define domain-specific concepts in computers so that these concepts can be shared for the purposes of a controlled, large-scale scientific investigation by crowds. Facets, contributions, and constraints of data are defined for any domain of knowledge by applying a generic set of inputs, procedural steps, and products called a representational process. Applying this generic process to our domain of knowledge, the field of economics, produces Sigma. Sigma is described in this paper in terms of its three elements: facets (streaming, reactives, distribution, and simulation), contributions (financial models, processors, and endpoints), and constraints of data (configuration, execution, and simulation meta-model). Each element of the generic representational process and of the Sigma computational representation is described and formalized in detail.
    Date: 2018–09

This nep-cmp issue is ©2018 by Stan Miles. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.