nep-cmp New Economics Papers
on Computational Economics
Issue of 2021‒06‒14
nineteen papers chosen by

  1. Forecasting UK GDP growth with large survey panels By Anesti, Nikoleta; Kalamara, Eleni; Kapetanios, George
  2. Metamodeling: A useful tool for applying innovative simulation techniques in agricultural economics By Jin, Ding
  3. Forecasting CPI Inflation Components with Hierarchical Recurrent Neural Network By Oren Barkan; Jonathan Benchimol; Itamar Caspi; Allon Hammer; Noam Koenigstein
  4. Economic impacts of decarbonizing the Swiss passenger transport sector By Vanessa Angst; Chiara Colesanti Senni; Markus Maibach; Martin Peter; Noe Reidt; Renger van Nieuwkoop
  5. A Sentiment-based Risk Indicator for the Mexican Financial Sector By Caterina Rho; Raúl Fernández; Brenda Palma
  6. Monitoring War Destruction from Space Using Machine Learning By Hannes Mueller; André Groeger; Jonathan Hersh; Andrea Matranga; Joan Serrat
  7. Urban Economics in a Historical Perspective: Recovering Data with Machine Learning By Combes, Pierre-Philippe; Gobillon, Laurent; Zylberberg, Yanos
  8. Artificial Intelligence, Ethics, and Intergenerational Responsibility By Victor Klockmann; Alicia von Schenk; Marie Claire Villeval
  9. Generating a classification for EUIPO trademark filings: A string matching approach By Neuhäusler, Peter; Feidenheimer, Alexander; Frietsch, Rainer; Kroll, Henning
  10. An Interpretable Neural Network for Parameter Inference By Johann Pfitzinger
  11. GARCHNet - Value-at-Risk forecasting with novel approach to GARCH models based on neural networks By Mateusz Buczyński; Marcin Chlebus
  12. México | Patrones de consumo de efectivo vs. tarjeta: una aproximación Big Data By Saide Aránzazu Salazar; Jaime Oliver Huidobro; Alvaro Ortiz; Tomasa Rodrigo; Ignacio Tamarit
  13. German Pension Simulation: Arbeitspapier zur Methodik eines anwartschaftsbasierten Projektionsmodells der gesetzlichen Rentenversicherung By Seuffert, Stefan
  14. Confidence in public institutions is critical in containing the COVID-19 pandemic By Adamecz-Völgyi, Anna; Szabó-Morvai, Ágnes
  15. Image Inpainting via Generative Multi-column with the aid of Deep Convolutional Neural Networks By Rajesh B; Muralidhara B L
  16. WP 04-21 - Réformes régionales des allocations familiales – Une analyse d’impact avec le modèle de microsimulation EXPEDITION By Hendrik Nevejan; Guy Van Camp; Dieter Vandelannoote
  17. Simulating Family Life Courses: An Application for Italy, Great Britain, and Scandinavia By Maria Winkler-Dworak; Eva Beaujouan; Paola Di Giulio; Martin Spielauer
  18. Coping with seasonality in a quarterly CGE model: COVID-19 and U.S. agriculture By Peter B. Dixon; Maureen T. Rimmer
  19. The Creation and Diffusion of Knowledge - an Agent Based Modelling Approach By Emmanuel P. de Albuquerque

  1. By: Anesti, Nikoleta (Bank of England); Kalamara, Eleni (King’s College London); Kapetanios, George (Bank of England)
    Abstract: Employing large panels of survey data for the UK economy, we review linear approaches to regularisation and dimension reduction combined with techniques from the machine learning literature, such as Random Forests, Support Vector Regressions and Neural Networks, for forecasting GDP growth at monthly frequency at horizons from one month up to two years ahead. We compare the predictive content of surveys with text-based indicators from newspaper articles and a standard macroeconomic data set, extending the empirical evidence on the contribution of survey data relative to text indicators and more traditional macroeconomic time series in predicting economic activity. Among the linear models, Ridge and Partial Least Squares report the largest gains consistently across most forecasting horizons; among the non-linear machine learning models, the SVR performs better at shorter horizons, while Neural Networks and Random Forests appear more appropriate for longer-term forecasting. Text-based indicators appear to favour non-linear models, and expanding the information set with macroeconomic time series does not add much predictive power. The largest forecasting gains are overwhelmingly concentrated at the shorter horizons for the majority of models and datasets, which provides further empirical support that non-linear machine learning models are more useful during the Great Recession.
    Keywords: Forecasting; survey data; text indicators; machine learning
    JEL: C53 C55
    Date: 2021–05–28
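The ridge shrinkage highlighted in the abstract can be illustrated with a minimal direct-forecasting sketch. Everything below (the closed-form univariate estimator, the toy data, the penalty value) is a hypothetical simplification for intuition, not the authors' code:

```python
# Univariate ridge with no intercept: beta = sum(x*y) / (sum(x^2) + lam).
# The penalty `lam` shrinks the coefficient toward zero, which stabilises
# forecasts when survey panels are wide relative to the sample length.

def ridge_coef(x, y, lam):
    """Closed-form ridge coefficient for a single predictor (no intercept)."""
    return sum(xi * yi for xi, yi in zip(x, y)) / (sum(xi * xi for xi in x) + lam)

def forecast(x_last, beta):
    """Direct h-step forecast: future GDP growth as a linear function of the predictor."""
    return beta * x_last

# Toy survey-balance series and GDP growth outcomes.
x = [1.0, 2.0, 3.0]
y = [2.0, 4.0, 6.0]
b_ols = ridge_coef(x, y, 0.0)    # no shrinkage: recovers the OLS slope
b_ridge = ridge_coef(x, y, 7.0)  # penalised: shrunk toward zero
```

With many survey predictors the same logic applies column by column (or jointly via the multivariate normal equations); the bias introduced by `lam` is traded against lower forecast variance.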
  2. By: Jin, Ding
    Abstract: Computational simulation models are widely used in agricultural economics for a variety of tasks, particularly evidence-based policy analysis. Despite the substantial and continuing growth of computing power and speed, the growing complexity and implicit nature of simulation models still lead to high computational costs and great difficulties in parameter specification, where data availability and parametrization constraints make empirical calibration notably challenging. These costs also limit the use of simulation models in other settings, such as integration into research frameworks like policy optimization coupled with uncertainty analysis. In this paper, we systematically and comprehensively introduce the metamodeling technique and investigate several metamodel types in terms of accuracy, computational time, variable importance, and potential practical applications.
    Keywords: Computational simulation models, metamodeling, policy analysis, DoE
    JEL: D58 C68 O13 Q11 I3 O21 G11
    Date: 2021
  3. By: Oren Barkan (Ariel University); Jonathan Benchimol (Bank of Israel); Itamar Caspi (Bank of Israel); Allon Hammer (Tel-Aviv University); Noam Koenigstein (Tel-Aviv University)
    Abstract: We present a hierarchical architecture based on Recurrent Neural Networks (RNNs) for predicting disaggregated inflation components of the Consumer Price Index (CPI). While the majority of existing research is focused on predicting headline inflation, many economic and financial institutions are interested in its partial disaggregated components. To this end, we developed the novel Hierarchical Recurrent Neural Network (HRNN) model, which utilizes information from higher levels in the CPI hierarchy to improve predictions at the more volatile lower levels. Based on a large dataset from the US CPI-U index, our evaluations indicate that the HRNN model significantly outperforms a vast array of well-known inflation prediction baselines. Our methodology and results provide additional forecasting measures and possibilities to policy and market makers on sectoral and component-specific prices.
    Keywords: Inflation forecasting, Disaggregated inflation, Consumer Price Index, Machine learning, Gated Recurrent Unit, Recurrent Neural Networks
    JEL: C45 C53 E31 E37
    Date: 2021–03
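The core intuition of the entry above, using smoother higher-level CPI information to stabilise noisy component forecasts, can be sketched without the RNN machinery. The convex-combination rule below is a hypothetical stand-in for the HRNN's learned top-down information flow, not the model itself:

```python
# Sketch: shrink a volatile component forecast toward its parent aggregate's
# forecast. A weight near 1 trusts the component's own history; a weight near 0
# leans on the (smoother) parent level higher up the CPI hierarchy.

def hierarchical_forecast(own_forecast, parent_forecast, weight):
    """Convex combination of component-level and parent-level forecasts."""
    assert 0.0 <= weight <= 1.0
    return weight * own_forecast + (1.0 - weight) * parent_forecast

# A noisy sub-index forecast pulled toward the headline-inflation forecast.
component = hierarchical_forecast(own_forecast=5.0, parent_forecast=2.0, weight=0.25)
```

In the HRNN the analogue of `weight` is effectively learned from data per component, which is what lets the volatile lower levels borrow strength from the aggregates.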
  4. By: Vanessa Angst (Infras AG); Chiara Colesanti Senni (Council on Economic Policies); Markus Maibach (Infras AG); Martin Peter (Infras AG); Noe Reidt (CER–ETH – Center of Economic Research at ETH Zurich, Switzerland); Renger van Nieuwkoop (Modelworks)
    Abstract: Switzerland committed to achieving net-zero emissions in 2050. This goal is particularly ambitious for the Swiss passenger transport system, which emits more than one third of Swiss CO2 emissions and is not yet on a clear emission reduction path. We investigate the economic impact and the emission-saving potential of a decarbonization pathway for the Swiss transport sector based on three edge case scenarios and on a combination of them: (1) improved fuel/engine technology and fostered diffusion of battery electric vehicles, (2) increased capacity use of passenger cars, and (3) enhanced modal shift towards public transport. Our analysis is conducted using a multi-model framework, which interlinks a computable general equilibrium model with two external transportation models. This approach allows us to incorporate a highly disaggregated passenger transport system into the economic analysis. The framework is calibrated to Swiss data to assess the optimal scenario mix in terms of emissions and economic impact. The optimal decarbonization pathway mix slightly increases welfare and lowers CO2 emissions of passenger transport in 2050 from 6 to 1.7 million tons CO2 compared to the reference scenario. Despite the sharp reduction in emissions, a decarbonization pathway based on the considered scenarios is insufficient to reach the net-zero emission target.
    Keywords: Passenger transport, Decarbonization, Switzerland, Computable general equilibrium model
    JEL: C68 R40 R42 R48
    Date: 2021–05
  5. By: Caterina Rho; Raúl Fernández; Brenda Palma
    Abstract: We apply text analysis to Twitter messages in Spanish to build a sentiment-based risk index for the financial sector in Mexico. We classify a sample of tweets for the period 2006-2019 to identify messages in response to positive or negative shocks to the Mexican financial sector. We use a voting classifier to aggregate three different classifiers: one based on word polarities from a pre-defined dictionary; one based on a support vector machine; and one based on neural networks. Next, we compare our Twitter sentiment index with existing indicators of financial stress. We find that this novel index captures the impact of sources of financial stress not explicitly encompassed in quantitative risk measures. Finally, we show that a shock in our Twitter sentiment index correlates positively with an increase in financial market risk, stock market volatility, sovereign risk, and foreign exchange rate volatility.
    JEL: G1 G21 G41
    Date: 2021–05
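The aggregation step described above, a voting classifier over three base classifiers, reduces in its simplest form to a majority vote over the base labels. The sketch below is illustrative (the labels and tie-breaking rule are assumptions, not the authors' implementation):

```python
from collections import Counter

# Majority vote over base-classifier labels for one tweet:
# +1 = positive-shock message, -1 = negative-shock message.
def majority_vote(labels):
    """Return the most common label; ties resolve to the first-encountered label."""
    return Counter(labels).most_common(1)[0][0]

# One tweet scored by a dictionary-based, an SVM, and a neural classifier.
verdict = majority_vote([+1, -1, -1])  # two of three say negative
```

Production voting classifiers often weight votes by each base model's validation accuracy or average predicted probabilities ("soft" voting) instead of counting labels.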
  6. By: Hannes Mueller; André Groeger; Jonathan Hersh; Andrea Matranga; Joan Serrat
    Abstract: Satellite imagery is becoming ubiquitous and is released with ever higher frequency. Research has demonstrated that Artificial Intelligence (AI) applied to satellite imagery holds promise for automated detection of war-related building destruction. While these results are promising, monitoring in real-world applications requires consistently high precision, especially when destruction is sparse and detecting destroyed buildings is equivalent to looking for a needle in a haystack. We demonstrate that exploiting the persistent nature of building destruction can substantially improve the training of automated destruction monitoring. We also propose an additional machine learning stage that leverages images of surrounding areas and multiple successive images of the same area which further improves detection significantly. By combining these steps, we construct an automated classification of building destruction which allows real-world applications and we illustrate this in the context of the Syrian civil war.
    Keywords: conflict, destruction, deep learning, remote sensing, Syria
    JEL: C45 C23 D74
    Date: 2021–05
  7. By: Combes, Pierre-Philippe (GATE, University of Lyon); Gobillon, Laurent (Paris School of Economics); Zylberberg, Yanos (University of Bristol)
    Abstract: A recent literature has used a historical perspective to better understand fundamental questions of urban economics. However, a wide range of historical documents of exceptional quality remain underutilised: their use has been hampered by their original format or by the massive amount of information to be recovered. In this paper, we describe how and when the flexibility and predictive power of machine learning can help researchers exploit the potential of these historical documents. We first discuss how important questions of urban economics rely on the analysis of historical data sources and the challenges associated with transcription and harmonisation of such data. We then explain how machine learning approaches may address some of these challenges and we discuss possible applications.
    Keywords: machine learning, history, urban economics
    JEL: R11 R12 R14 N90 C45 C81
    Date: 2021–05
  8. By: Victor Klockmann (Goethe University Frankfurt, Theodor-W.-Adorno-Platz 4, 60323 Frankfurt, Germany. Center for Humans & Machines, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany); Alicia von Schenk (Goethe University Frankfurt, Theodor-W.-Adorno-Platz 4, 60323 Frankfurt, Germany. Center for Humans & Machines, Max Planck Institute for Human Development, Lentzeallee 94, 14195 Berlin, Germany.); Marie Claire Villeval (Univ Lyon, CNRS, GATE UMR 5824, 93 Chemin des Mouilles, F-69130, Ecully, France. IZA, Bonn, Germany)
    Abstract: With Big Data, decisions made by machine learning algorithms depend on training data generated by many individuals. In an experiment, we identify the effect of varying individual responsibility for moral choices of an artificially intelligent algorithm. Across treatments, we manipulated the sources of training data and thus the impact of each individual’s decisions on the algorithm. Reducing or diffusing pivotality for algorithmic choices increased the share of selfish decisions. Once the generated training data exclusively affected others’ payoffs, individuals opted for more egalitarian payoff allocations. These results suggest that Big Data offers a “moral wiggle room” for selfish behavior.
    Keywords: Artificial Intelligence, Pivotality, Ethics, Externalities, Experiment
    JEL: C49 C91 D10 D63 D64 O33
    Date: 2021
  9. By: Neuhäusler, Peter; Feidenheimer, Alexander; Frietsch, Rainer; Kroll, Henning
    Abstract: This paper aims to analyze topics within the international classification of goods and services (NICE classes) applied for the registration of trademarks at the EUIPO. This is accomplished by introducing a more fine-grained classification of trademarks as a "sub-section" of the rather rough NICE classes. To do this, we relate the descriptions of the trademarks that applicants provide upon filing to the list of pre-defined keywords that the WIPO makes available to assist applicants in describing their marks. To assign trademarks to the classification, we use two matching algorithms: one based on the Levenshtein distance and one based on the Jaro-Winkler distance. The Levenshtein-based approach alone covers 75% of trademarks. The Jaro-Winkler matching (in combination with the Levenshtein distance) assigns another 10%, so that 85% of all EUIPO trademarks filed in 2018 are matched to at least one classification key. Based on this matching, we generate a hierarchical classification with five layers, ranging from 234 classes in the first layer to 8,613 distinct classes in the fifth.
    Date: 2021
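The Levenshtein distance underlying the first matching stage is the classic edit-distance dynamic program. The sketch below shows it together with a normalised-similarity cutoff; the threshold value and example strings are illustrative choices, not parameters from the paper:

```python
def levenshtein(a, b):
    """Edit distance via the standard DP recurrence: insert, delete, substitute."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                  # deletion
                           cur[j - 1] + 1,               # insertion
                           prev[j - 1] + (ca != cb)))    # substitution
        prev = cur
    return prev[-1]

def similar(a, b, threshold=0.8):
    """Normalised similarity in [0, 1]; declare a match above an illustrative cutoff."""
    m = max(len(a), len(b)) or 1
    return 1.0 - levenshtein(a, b) / m >= threshold

# A term from a trademark description matched against a WIPO keyword.
is_match = similar("cosmetics", "cosmetic")
```

Jaro-Winkler, the second stage, instead scores common characters and transpositions with a bonus for matching prefixes, which makes it more forgiving of typos near the end of a word.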
  10. By: Johann Pfitzinger
    Abstract: Adoption of deep neural networks in fields such as economics or finance has been constrained by the lack of interpretability of model outcomes. This paper proposes a generative neural network architecture - the parameter encoder neural network (PENN) - capable of estimating local posterior distributions for the parameters of a regression model. The parameters fully explain predictions in terms of the inputs and permit visualization, interpretation and inference in the presence of complex heterogeneous effects and feature dependencies. The use of Bayesian inference techniques offers an intuitive mechanism to regularize local parameter estimates towards a stable solution, and to reduce noise-fitting in settings of limited data availability. The proposed neural network is particularly well-suited to applications in economics and finance, where parameter inference plays an important role. An application to an asset pricing problem demonstrates how the PENN can be used to explore nonlinear risk dynamics in financial markets, and to compare empirical nonlinear effects to behavior posited by financial theory.
    Date: 2021–06
  11. By: Mateusz Buczyński (Interdisciplinary Doctoral School, University of Warsaw); Marcin Chlebus (Faculty of Economic Sciences, University of Warsaw)
    Abstract: This study proposes a new GARCH specification that adapts the architecture of a long short-term memory (LSTM) neural network. Classical GARCH models have proven to give good results in financial modeling, where high volatility can be observed, and their value is often praised in Value-at-Risk applications. However, the lack of a nonlinear structure in most of these approaches means that the conditional variance is not represented well enough in the model. In contrast, recent advances in deep learning are said to be capable of describing almost any nonlinear relationship. We propose GARCHNet, a nonlinear approach to conditional variance that combines LSTM neural networks with maximum likelihood estimation of the innovation distribution, as in GARCH. The innovation distributions considered in the paper are the normal, t, and skewed t, although the approach can be extended to other distributions. To evaluate the model, we conducted an empirical study on the log returns of the WIG 20 (Warsaw Stock Exchange Index) in four different time periods between 2005 and 2021 with varying levels of observed volatility. Our findings confirm the validity of the solution, and we outline several directions for further development.
    Keywords: Value-at-Risk, GARCH, neural networks, LSTM
    JEL: G32 C52 C53 C58
    Date: 2021
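For reference, the classical GARCH(1,1) recursion that GARCHNet replaces with an LSTM, together with a Gaussian Value-at-Risk read-off, can be written in a few lines. Parameter values below are illustrative, not estimates from the paper:

```python
from statistics import NormalDist

# GARCH(1,1): sigma^2_t = omega + alpha * r^2_{t-1} + beta * sigma^2_{t-1}.
# GARCHNet's idea is to replace this linear recursion with an LSTM while keeping
# likelihood-based estimation of the innovation distribution.

def garch_variances(returns, omega, alpha, beta, sigma2_0):
    """Conditional variance path implied by GARCH(1,1), one sigma^2 per return."""
    sigma2 = [sigma2_0]
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

def value_at_risk(sigma2, level=0.99):
    """One-period Gaussian VaR (reported as a positive number) at `level`."""
    z = NormalDist().inv_cdf(level)
    return z * sigma2 ** 0.5

variances = garch_variances([0.01, -0.02, 0.015],
                            omega=1e-5, alpha=0.1, beta=0.85, sigma2_0=2e-4)
var_99 = value_at_risk(variances[-1])
```

Swapping the recursion for an LSTM lets the variance respond nonlinearly to past returns, while the normal, t, or skewed-t likelihood still drives estimation.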
  12. By: Saide Aránzazu Salazar; Jaime Oliver Huidobro; Alvaro Ortiz; Tomasa Rodrigo; Ignacio Tamarit
    Abstract: This paper proposes a novel methodology combining high-frequency card transaction data and point-of-sale (POS) data from cash operations registered at convenience stores to study changes in consumption patterns relative to variations in income, including changes in the items consumed and the payment channel.
    Keywords: e-Payments, cash, Big Data, machine learning, consumption patterns, Mexico, Global, Analysis with Big Data, Working Papers
    JEL: C32 D12 O17 O54
    Date: 2021–05
  13. By: Seuffert, Stefan
    Abstract: The debate over pension policy is a central element of political discourse in Germany and shows no sign of abating. The effects of the reforms proposed in this debate on the future revenue and expenditure structure of the pension insurance system can be assessed with pension simulation models. This paper explains the concrete methodological approach to simulating the statutory pension insurance scheme within the presented "German Pension Simulation" (GPS) model. The main objective of the simulation is to estimate the future revenues and expenditures as well as the future benefit and contribution-rate levels of the statutory pension insurance scheme. The projection is based, among other things, on a population and labour market projection, a simple wage projection, and an extrapolation of the current age-specific pension entitlements of the insured and of pensioners.
    Keywords: Pension projection, statutory pension insurance, contribution rate, pension level, pension value
    JEL: C53 H55 H68 J11
    Date: 2020
  14. By: Adamecz-Völgyi, Anna; Szabó-Morvai, Ágnes
    Abstract: This paper investigates the relative importance of confidence in public institutions to explain cross-country differences in the severity of the COVID-19 pandemic. We extend the related literature by employing regression and machine learning methods to identify the most critical predictors of deaths attributed to the pandemic. We find that a one standard deviation increase (e.g., the actual difference between the US and Finland) in confidence is associated with 350.9 fewer predicted deaths per million inhabitants. Confidence in public institutions is one of the most important predictors of deaths attributed to COVID-19, compared to country-level measures of individual health risks, the health system, demographics, economic and political development, and social capital. Our results suggest that effective policy implementation requires citizens to cooperate with their governments, and willingness to cooperate relies on confidence in public institutions.
    Keywords: COVID-19, death rate, confidence in public institutions, machine learning
    JEL: I18 P16
    Date: 2021
  15. By: Rajesh B (BASE University, Bengaluru); Muralidhara B L (Department of Computer Science & Application, Bangalore University, Bengaluru)
    Abstract: Images are visual representations or likenesses of something (a person, an object, or a scanned document) that can be reproduced or captured, e.g. a hand drawing or photographic material. The digital age has seen a rapid shift in image storage technologies, from hard copies to digitized units, made less burdensome by digital tools. This research designs a confidence-driven reconstruction loss, while an implicit diversified Markov Random Field (MRF) regularization is adopted to enhance local details. The multi-column network, combined with the reconstruction and MRF losses, propagates local and global information derived from context to the target inpainting regions. Extensive experiments on challenging street-view, face, natural-object, and scene images show that the proposed method produces visually compelling results even without the post-processing that was previously common. The research involves pre-trained Deep Convolutional Neural Networks (DCNNs) such as ResNet50, GoogleNet, AlexNet, and VGG-16. The average PSNR of the proposed model is 24.64 dB and the Structural Similarity Index Measure (SSIM) is 0.9018.
    Keywords: Markov Random Field, Deep Convolutional Neural Network, ResNet50, GoogleNet, AlexNet, VGG-16, Structural Similarity Index Measure.
    Date: 2021–05
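The PSNR figure quoted in the entry above is a standard reconstruction metric. A minimal sketch of how it is computed on flattened pixel values (the toy "images" are hypothetical, not the paper's data):

```python
import math

# PSNR = 10 * log10(MAX^2 / MSE), in decibels; higher means the reconstructed
# (inpainted) image is closer to the original. MAX is the peak pixel value
# (255 for 8-bit images).

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio between two equal-length pixel sequences."""
    mse = sum((o - r) ** 2 for o, r in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return math.inf  # identical images
    return 10.0 * math.log10(peak * peak / mse)

# Two tiny "images" differing in one pixel by 10 grey levels.
score = psnr([0, 0], [10, 0])
```

SSIM, the second metric quoted, instead compares local means, variances, and covariances over sliding windows, which tracks perceived structural similarity better than pixel-wise error alone.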
  16. By: Hendrik Nevejan; Guy Van Camp; Dieter Vandelannoote
    Abstract: This Working Paper puts the policy choices made in the regional child benefit reforms into perspective. Using the microsimulation model EXPEDITION, the expected direct effects of these reforms on child benefit expenditure and income distribution are mapped out. Special attention is paid to the effects on the simulated poverty risk of (families with) children, as this was a shared concern during the reforms in the different regions.
    Keywords: Microsimulation, Child benefit, Income distribution, Poverty risk
    JEL: C63 D31 H55 I32 P46
    Date: 2021–05–18
  17. By: Maria Winkler-Dworak; Eva Beaujouan; Paola Di Giulio; Martin Spielauer
    Abstract: Family patterns in Western countries have substantially changed across the 1940 to 1990 birth cohorts. Adults born more recently more often enter unmarried cohabitations and marry later, if at all. They have children later and fewer of them; births take place in a non-marital union more often and, due to the declining stability of couple relationships, in more than one partnership. These changes have led to an increasing diversity in family life courses. In this paper, we present a microsimulation model of family life trajectories, which models the changing family patterns taking into account the complex interrelationships between childbearing and partnership processes. The microsimulation model is parameterized to retrospective data for women born since 1940 in Italy, Great Britain and two Nordic countries (Norway and Sweden), representing three significantly different cultural and institutional contexts of partnering and childbearing in Europe. Validation of the simulated family life courses against their real-world equivalents shows that the simulations not only closely replicate observed childbearing and partnership processes, but also give good predictions when compared to more recent fertility indicators. We conclude that the presented microsimulation model is suitable for exploring changing family dynamics and outline potential research questions and further applications.
    Keywords: Family life course, fertility, partnerships, microsimulation, Italy, Great Britain, Norway, Sweden
    Date: 2019–11
  18. By: Peter B. Dixon; Maureen T. Rimmer
    Abstract: Most dynamic CGE models work with periods of one year. This limits their applicability for analyzing the effects of shocks that operate over a short period or with different intensities through a year. It is relatively easy to convert an annual CGE model to shorter periodicity, for example a quarter, if we ignore seasonal differences in the pattern of economic activity. But this is not acceptable for agriculture. This paper introduces seasonal factors to the agricultural specification in a detailed quarterly CGE model of the U.S. The model is then applied to analyze the effects of the COVID pandemic on U.S. farm industries. Taking account of the general features of the pandemic such as the reduction in household spending, we find that these effects are mild relative to the effects on most other industries. However, agriculture is subject to potential supply-chain disruptions. We apply our quarterly model to analyze two such possibilities: loss of labour at harvest time in Fruit & nut farms; and temporary closure of meat-processing plants. We find that these disruptions are unlikely to cause noticeable reductions in the supply of food products to U.S. households.
    Keywords: Quarterly CGE modelling, seasonal factors in agriculture, COVID pandemic, supply-chain disruption, U.S. agriculture
    JEL: C68 Q11 I19
    Date: 2021–01
  19. By: Emmanuel P. de Albuquerque
    Abstract: In this paper I propose a novel abstract mechanism for the creation and diffusion of knowledge and use an agent-based modelling approach to explore it. The mechanism takes into account the relation between the phenomena that agents attempt to explain and the stocks of knowledge available in a society, be it individually or collectively. I find that the aggregate number of knowledge units in a society increases more slowly the more naive its inhabitants are. I also find that the proximity between phenomena plays an important role in how often the same knowledge unit can be used. A discussion of agent-based models as a means of insight into society is offered.
    Keywords: Agent-based modelling; Cognitive distance; Exploitation; Exploration; Innovation; Knowledge creation; Knowledge diffusion; Learning
    JEL: B52 C63 D83 O33
    Date: 2021–05
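The creation mechanism sketched in the abstract above can be caricatured in a few lines of agent-based code. Everything here (phenomena on a unit interval, the `reach` parameter standing in for naivety, the seeded random draw) is a hypothetical stand-in for the paper's richer model:

```python
import random

# Toy knowledge-creation loop: when a phenomenon arises, it is "explained" by an
# existing knowledge unit within cognitive reach, otherwise a new unit is created.
# Naive agents stretch existing units over distant phenomena (larger reach), so
# the aggregate knowledge stock grows more slowly, echoing the paper's finding.

def run(phenomena, reach, seed=0):
    """Return (number of knowledge units created, number of reuses)."""
    rng = random.Random(seed)
    knowledge = []
    reused = 0
    for _ in range(phenomena):
        p = rng.random()  # a phenomenon located on [0, 1)
        if any(abs(p - k) <= reach for k in knowledge):
            reused += 1               # an existing unit explains it
        else:
            knowledge.append(p)       # a new knowledge unit is created
    return len(knowledge), reused

units_naive, _ = run(200, reach=0.10)    # over-generalising agents
units_careful, _ = run(200, reach=0.01)  # discriminating agents
```

Even this caricature reproduces the qualitative pattern: the larger the reach, the fewer distinct units the society accumulates and the more often each one is reused.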

General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.