nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒10‒29
fifteen papers chosen by



  1. Tracking economic growth by evolving expectations via genetic programming: A two-step approach By Oscar Claveria; Enric Monte; Salvador Torra
  2. Modified Genetic Algorithm Framework for Optimal Scheduling of Single Microgrid Combination with Distribution System Operator By Jamaledini, Ashkan; Khazaei, Ehsan; Toran, Mehdi
  3. Credit Risk Analysis Using Machine and Deep Learning Models By Dominique Guegan; Peter Addo; Bertrand Hassani
  4. Reinforcement learning in financial markets - a survey By Fischer, Thomas G.
  5. Does Mexico Benefit from the Clean Development Mechanism? A Macroeconomic and Environmental General Equilibrium Analysis By Jean-Marc Montaud; Nicolas Pecastaing
  6. Simulating Mine Revenues with Historical Gold Price Data from the Bank of England By Bell, Peter
  7. The potential cost of a Failed Doha Round By Antoine Bouët; David Laborde
  8. Can preferential trade agreements enhance renewable electricity generation in emerging economies? A model-based policy analysis for Brazil and the European Union By Yadira Mori-Clement; Stefan Nabernegg; Birgit Bednar-Friedl
  9. Can Education Compensate the Effect of Population Aging On Macroeconomic Performance? By Kotschy, Rainer; Sunde, Uwe
  10. Reliably Computing Nonlinear Dynamic Stochastic Model Solutions: An Algorithm with Error Formulas By Gary S. Anderson
  11. Different automated valuation modelling techniques evaluated over time. By Michael Mayer; Steven C. Bourassa; Martin Hoesli; Donato Scognamiglio
  12. Exploitation, skills, and inequality By Jonathan Cogliano; Roberto Veneziani; Naoki Yoshihara
  13. Forecasting financial crashes with quantum computing By Roman Orus; Samuel Mugel; Enrique Lizaso
  14. Multivariate stochastic volatility with co-heteroscedasticity By Joshua Chan; Arnaud Doucet; Roberto León-González; Rodney W. Strachan
  15. Optimal fiscal policy with Epstein-Zin preferences and utility-enhancing government services: lessons from Bulgaria (1999-2016) By Vasilev, Aleksandar

  1. By: Oscar Claveria (AQR-IREA, Department of Econometrics, Statistics and Applied Economics, Universitat de Barcelona); Enric Monte (Department of Signal Theory and Communications, Polytechnic University of Catalunya (UPC)); Salvador Torra (RISKCENTER, IREA, Department of Econometrics, Statistics and Applied Economics, Universitat de Barcelona)
    Abstract: The main objective of this study is to present a two-step approach to generate estimates of economic growth based on agents’ expectations from tendency surveys. First, we design a genetic programming experiment to derive mathematical functional forms that approximate the target variable by combining survey data on expectations about different economic variables. We use evolutionary algorithms to estimate a symbolic regression that links survey-based expectations to a quantitative variable used as a yardstick (economic growth). In a second step, this set of empirically generated proxies of economic growth is linearly combined to track the evolution of GDP. To evaluate the forecasting performance of the generated estimates of GDP, we use them to assess the impact of the 2008 financial crisis on the accuracy of agents' expectations about the evolution of economic activity in 28 OECD countries. While in most economies we find an improvement in the capacity of agents to anticipate the evolution of GDP after the crisis, predictive accuracy worsens in relation to the period prior to the crisis. The most accurate GDP forecasts are obtained for Sweden, Austria and Finland.
    Keywords: Evolutionary algorithms; Symbolic regression; Genetic programming; Business and consumer surveys; Expectations; Forecasting.
    JEL: C51 C63 C83 C93
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:xrp:wpaper:xreap2018-4&r=cmp
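    To make the two-step idea in item 1 concrete, the following Python sketch evolves random functional combinations of synthetic survey series toward a GDP target and then combines the best candidates by least squares. The data, the small operator set, and the selection-only search are illustrative simplifications, not the authors' experiment:

        import numpy as np

        rng = np.random.default_rng(0)
        T, n_series = 120, 5
        surveys = rng.normal(size=(T, n_series))         # synthetic survey balances
        gdp = (0.6 * surveys[:, 0] - 0.3 * np.tanh(surveys[:, 1])
               + 0.1 * rng.normal(size=T))               # synthetic target variable

        OPS = [np.add, np.subtract, np.multiply]

        def random_expr():
            # one "individual": op(series_i, tanh(series_j))
            op = OPS[rng.integers(len(OPS))]
            i, j = rng.integers(n_series, size=2)
            return lambda X, op=op, i=i, j=j: op(X[:, i], np.tanh(X[:, j]))

        def fitness(expr):
            return abs(np.corrcoef(expr(surveys), gdp)[0, 1])

        # Step 1: crude evolutionary search; a real genetic-programming run
        # would also apply crossover and mutation on expression trees.
        population = [random_expr() for _ in range(500)]
        best = sorted(population, key=fitness, reverse=True)[:5]

        # Step 2: linearly combine the surviving proxies to track GDP.
        Z = np.column_stack([e(surveys) for e in best] + [np.ones(T)])
        coef, *_ = np.linalg.lstsq(Z, gdp, rcond=None)
        print("in-sample RMSE:", np.sqrt(np.mean((Z @ coef - gdp) ** 2)))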
  2. By: Jamaledini, Ashkan; Khazaei, Ehsan; Toran, Mehdi
    Abstract: In this paper, a modified genetic algorithm (GA) inspired by the multicellular organism mechanism is developed for the power management of a single microgrid operating in combination with the distribution system operator (DSO). Integrating a single microgrid into a conventional grid increases the complexity of the problem, because the microgrid can disconnect from the main grid and operate as a standalone small electricity network. Hence, a new evolutionary algorithm is developed to address this complexity. The main objective of the proposed model is to minimize the total operation cost of the microgrid in both grid-connected and islanded modes; that is, the objective is purely economic. To demonstrate the performance of the proposed method, a modified IEEE 33-bus distribution test network is selected and examined, and the results are compared with well-known evolutionary algorithms such as particle swarm optimization (PSO). It is worth noting that the results are based on economic considerations only.
    Keywords: Optimal energy management; Microgrid; Economic consideration
    JEL: L00 L59
    Date: 2018–10–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:89411&r=cmp
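    A minimal genetic-algorithm sketch in the spirit of item 2, applied to a toy scheduling problem: the genome encodes the hourly on/off state of a local generator, residual demand is bought from the grid at a time-varying price, and fitness is total operating cost. All parameters are illustrative, and a plain GA stands in for the authors' modified variant:

        import numpy as np

        rng = np.random.default_rng(1)
        H = 24
        demand = 3.0 + np.sin(np.linspace(0, 2 * np.pi, H))          # MW, toy profile
        grid_price = 40 + 20 * np.sin(np.linspace(0, 2 * np.pi, H) - 1.0)
        GEN_CAP, GEN_COST, START_COST = 2.0, 30.0, 15.0

        def cost(genome):
            gen = genome * GEN_CAP
            residual = np.maximum(demand - gen, 0.0)                 # bought from grid
            starts = np.sum(np.diff(genome, prepend=0) == 1)         # start-up events
            return np.sum(gen * GEN_COST + residual * grid_price) + starts * START_COST

        pop = rng.integers(0, 2, size=(60, H))
        for generation in range(200):
            fit = np.array([cost(g) for g in pop])
            parents = pop[np.argsort(fit)[:30]]                      # truncation selection
            cut = rng.integers(1, H, size=30)
            kids = np.array([np.concatenate([parents[i][:c], parents[(i + 1) % 30][c:]])
                             for i, c in enumerate(cut)])            # one-point crossover
            flip = rng.random(kids.shape) < 0.02                     # bit-flip mutation
            kids = np.where(flip, 1 - kids, kids)
            pop = np.vstack([parents, kids])

        best = pop[np.argmin([cost(g) for g in pop])]
        print("best cost:", cost(best))
        print("schedule:", best)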
  3. By: Dominique Guegan (UP1 - Université Panthéon-Sorbonne, CES - Centre d'économie de la Sorbonne - CNRS - Centre National de la Recherche Scientifique - UP1 - Université Panthéon-Sorbonne, Labex ReFi - UP1 - Université Panthéon-Sorbonne, IPAG Business School - IPAG BUSINESS SCHOOL PARIS, University of Ca’ Foscari [Venice, Italy]); Peter Addo (AFD - Agence française de développement, Labex ReFi - UP1 - Université Panthéon-Sorbonne); Bertrand Hassani (Labex ReFi - UP1 - Université Panthéon-Sorbonne, CES - Centre d'économie de la Sorbonne - CNRS - Centre National de la Recherche Scientifique - UP1 - Université Panthéon-Sorbonne, Capgemini Consulting [Paris], UCL-CS - Computer science department [University College London] - UCL - University College of London [London])
    Abstract: Due to the advanced technology associated with Big Data, data availability and computing power, most banks or lending institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data in predicting loan default probability. The top 10 important features from these models are selected and then used in the modeling process to test the stability of binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
    Keywords: financial regulation,deep learning,Big data,data science,credit risk
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-01835164&r=cmp
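    A hedged sketch of the comparison protocol in item 3, run on synthetic data (the paper uses real loan data): fit a tree ensemble, keep its top-10 features, refit both a tree model and a small neural network on that subset, and compare discrimination on a held-out split:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.neural_network import MLPClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        # Imbalanced synthetic "loan default" data as a stand-in.
        X, y = make_classification(n_samples=4000, n_features=30, n_informative=8,
                                   weights=[0.9], random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

        rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
        top10 = np.argsort(rf.feature_importances_)[-10:]            # feature selection

        models = [("random forest", RandomForestClassifier(n_estimators=200,
                                                           random_state=0)),
                  ("neural net", MLPClassifier(hidden_layer_sizes=(32, 16),
                                               max_iter=500, random_state=0))]
        for name, model in models:
            model.fit(X_tr[:, top10], y_tr)
            auc = roc_auc_score(y_te, model.predict_proba(X_te[:, top10])[:, 1])
            print(f"{name}: test AUC = {auc:.3f}")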
  4. By: Fischer, Thomas G.
    Abstract: The advent of reinforcement learning (RL) in financial markets is driven by several advantages inherent to this field of artificial intelligence. In particular, RL makes it possible to combine the "prediction" and the "portfolio construction" tasks in one integrated step, thereby closely aligning the machine learning problem with the objectives of the investor. At the same time, important constraints, such as transaction costs, market liquidity, and the investor's degree of risk-aversion, can be conveniently taken into account. Over the past two decades, and although most attention is still devoted to supervised learning methods, the RL research community has made considerable advances in the finance domain. The present paper draws insights from almost 50 publications, and categorizes them into three main approaches: critic-only, actor-only, and actor-critic. Within each of these categories, the respective contributions are summarized and reviewed along the representation of the state, the applied reward function, and the action space of the agent. This cross-sectional perspective allows us to identify recurring design decisions as well as potential levers to improve the agent's performance. Finally, the individual strengths and weaknesses of each approach are discussed, and directions for future research are pointed out.
    Keywords: financial markets,reinforcement learning,survey,trading systems,machine learning
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:iwqwdp:122018&r=cmp
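    A minimal example of the "critic-only" category in item 4's taxonomy: tabular Q-learning on a synthetic return series, with the state given by the sign of the last return, the action a position in {-1, 0, +1}, and the reward the position's next-period P&L net of transaction costs. State design, costs, and data are illustrative choices, not drawn from any surveyed paper:

        import numpy as np

        rng = np.random.default_rng(2)
        returns = 0.001 + 0.01 * rng.standard_normal(5000)   # synthetic daily returns
        ACTIONS = np.array([-1, 0, 1])
        COST, ALPHA, GAMMA, EPS = 0.0002, 0.1, 0.95, 0.1

        Q = np.zeros((2, 3))          # rows: last return down/up; cols: action index
        pos = 0
        for t in range(1, len(returns) - 1):
            s = int(returns[t - 1] > 0)
            a = rng.integers(3) if rng.random() < EPS else int(np.argmax(Q[s]))
            new_pos = ACTIONS[a]
            reward = new_pos * returns[t] - COST * abs(new_pos - pos)
            s_next = int(returns[t] > 0)
            Q[s, a] += ALPHA * (reward + GAMMA * Q[s_next].max() - Q[s, a])
            pos = new_pos

        print("learned Q-table:\n", Q)
        print("greedy action after down day:", ACTIONS[np.argmax(Q[0])])
        print("greedy action after up day:  ", ACTIONS[np.argmax(Q[1])])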
  5. By: Jean-Marc Montaud (CATT - Centre d'Analyse Théorique et de Traitement des données économiques - UPPA - Université de Pau et des Pays de l'Adour); Nicolas Pecastaing (CATT - Centre d'Analyse Théorique et de Traitement des données économiques - UPPA - Université de Pau et des Pays de l'Adour)
    Abstract: Since 2000, the Clean Development Mechanism (CDM) under the Kyoto Protocol has included southern countries in the fight against climate change by encouraging northern countries to make environmentally friendly direct investments at the lowest cost in these developing nations. Even if CDM investments have enjoyed great success, the question of how host countries benefit from these investments seems insufficiently explored. Therefore, this article offers a quantitative assessment of the economic and environmental impacts of CDM investments for the specific case of Mexico. We use a computable general equilibrium model with environmental features to simulate the demand and supply effects induced by these investments. Numerical simulations reveal the growth potential and the important source of development funding that the CDM represents for Mexico, though the environmental impact appears broadly mixed.
    Keywords: Clean Development Mechanism,Computable general equilibrium,Mexico
    Date: 2018–09–24
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01880342&r=cmp
  6. By: Bell, Peter
    Abstract: This paper demonstrates a simulation method using historical prices for gold over a 40-year period. The simulation method can be used to assess variability in a mine plan. In this example, I use monthly gold data from the Bank of England. The total sample size is approximately 450 monthly data points, from which I consider 11 different continuous subsamples of length 100. Some of these blocks of data overlap, but they are all different. For each block of data, I perform various calculations for a hypothetical mine plan that produces one ounce of gold per month. I report undiscounted total revenue over the 100-month period with real prices corresponding to different historical episodes and note how gold prices have changed over this 40-year period. I also use monthly price differences from each path to simulate gold price paths all starting with the same initial value, which allows for a more apples-to-apples comparison. I show the revenue paths in such cases, report the present value for each path, and include a simple cost model in the mine plan to estimate net present value for each path with standardized initial prices.
    Keywords: Engineering Economics, Mining, Royalties, Finance
    JEL: C0 G0 L72
    Date: 2018–10–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:89420&r=cmp
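    The block-subsampling exercise in item 6 is easy to replicate in outline. The sketch below substitutes a synthetic random-walk series for the Bank of England gold data (not reproduced here), draws 11 overlapping 100-month windows from roughly 450 months, and computes undiscounted revenue for a one-ounce-per-month mine plus an NPV on paths standardized to a common starting price, with an illustrative cost per ounce:

        import numpy as np

        rng = np.random.default_rng(3)
        n_months = 450
        prices = 400 + np.cumsum(rng.normal(0.5, 15.0, n_months))   # toy price path

        WINDOW, N_BLOCKS = 100, 11
        starts = np.linspace(0, n_months - WINDOW, N_BLOCKS).astype(int)
        MONTHLY_RATE, COST_PER_OZ, START_PRICE = 0.05 / 12, 300.0, prices[0]
        discount = (1 + MONTHLY_RATE) ** -np.arange(WINDOW)

        for s in starts:
            block = prices[s:s + WINDOW]
            revenue = block.sum()                        # 1 oz/month, undiscounted
            # Standardized path: common start, historical month-on-month moves.
            std_path = START_PRICE + np.concatenate([[0.0], np.cumsum(np.diff(block))])
            npv = np.sum((std_path - COST_PER_OZ) * discount)
            print(f"months {s:3d}-{s + WINDOW - 1}: revenue {revenue:10.0f}, "
                  f"standardized NPV {npv:10.0f}")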
  7. By: Antoine Bouët (IFPRI - International Food Policy Research Institute, CATT - Centre d'Analyse Théorique et de Traitement des données économiques - UPPA - Université de Pau et des Pays de l'Adour); David Laborde (CATT - Centre d'Analyse Théorique et de Traitement des données économiques - UPPA - Université de Pau et des Pays de l'Adour)
    Abstract: This study offers new conclusions on the economic cost of a failed Doha Round. The first section is devoted to an analysis of how trade policies evolve in the medium and long runs. We show that even under normal economic conditions, policymakers modify tariffs to cope with the evolution of world markets. We then use the MIRAGE Computable General Equilibrium model to assess the potential outcome of the Doha Round, before examining four protectionist scenarios. Under a scenario where the applied tariffs of major economies increase up to the currently bound tariff rates, we find that world trade decreases by 7.7 percent and world welfare drops by US$353 bn. We then compare a resort to protectionism when the Doha Development Agenda (DDA) is implemented versus a resort to protectionism when the DDA is not implemented. We find that this trade agreement could prevent the potential loss of US$809 bn of trade, and could therefore act as an efficient multilateral insurance scheme against the adverse consequences of "beggar-thy-neighbor" trade policies.
    Keywords: Trade negotiations,CGE modeling,Bound duties,Domestic support
    Date: 2018–10–01
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01885165&r=cmp
  8. By: Yadira Mori-Clement (University of Graz, Austria); Stefan Nabernegg (University of Graz, Austria); Birgit Bednar-Friedl (University of Graz, Austria)
    Abstract: Preferential trade agreements with climate-related provisions have been suggested as an alternative to a New Market Mechanism due to their potential not only to achieve Nationally Determined Contributions (NDCs) in emerging economies but also to lead to more ambitious targets in the first UNFCCC global stocktake in 2023. The objective of this research is therefore to analyze the effectiveness and quantify the economic impacts of such a trade agreement between Brazil and the European Union that aims to support renewable electricity generation. Using a multi-regional computable general equilibrium model, we find that the environmental effectiveness of a preferential trade agreement targeting renewable electricity generation strongly depends on its design. In particular, preferential trade agreements require additional elements to contribute effectively to mitigation, as the sole removal of import tariffs on renewable energy technology is quite ineffective in scaling up the share of wind, solar, and biomass in Brazil. In contrast, a preferential trade agreement triggering FDI flows towards renewable electricity generation is effective in increasing the share of renewables in the generation mix and in reducing CO2 emissions, while positively affecting Brazilian economic performance. Finally, we compare the two previous approaches to a domestic energy policy: a combination of higher fossil fuel taxes and subsidies to renewable electricity generation. We find that although this domestic energy policy is more effective in mitigation terms than the FDI policy, economic performance is negatively affected in several sectors. When such economic costs are socially unacceptable, as is likely in many emerging economies, properly designed preferential trade agreements could therefore be a suitable instrument for supporting the achievement of NDCs, and potentially increase their stringency for the next stocktaking period.
    Keywords: Preferential Trade Agreements with climate-related provisions; environmental goods; renewable energy; FDI; emerging economies; Brazil; European Union
    JEL: Q27 Q28 Q42
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:grz:wpaper:2018-19&r=cmp
  9. By: Kotschy, Rainer (LMU Munich); Sunde, Uwe (LMU Munich)
    Abstract: This paper investigates the consequences of population aging and of changes in the education composition of the population for macroeconomic performance. Estimation results from a theoretically founded empirical framework show that aging as well as the education composition of the population influence economic performance. The estimates and simulations based on population projections and different counterfactual scenarios show that population aging will have substantial negative consequences for macroeconomic performance in many countries in the years to come. The results also suggest that education expansions tend to offset the negative effects, but that the extent to which they compensate for the aging effects differs vastly across countries. The simulations illustrate the heterogeneity in the effects of population aging on economic performance across countries, depending on their current age and education composition. The estimates provide a method to quantify the increase in education that is required to offset the negative consequences of population aging. Counterfactual changes in labor force participation and productivity required to neutralize aging are found to be substantial.
    Keywords: demographic change; demographic structure; distribution of skills; projections; education-aging-elasticity;
    JEL: J11 O47
    Date: 2018–10–17
    URL: http://d.repec.org/n?u=RePEc:rco:dpaper:121&r=cmp
  10. By: Gary S. Anderson
    Abstract: This paper provides a new technique for representing discrete time nonlinear dynamic stochastic time invariant maps. Using this new series representation, the paper augments the usual solution strategy with an additional set of constraints thereby enhancing algorithm reliability. The paper also provides general formulas for evaluating the accuracy of proposed solutions. The technique can readily accommodate models with occasionally binding constraints and regime switching. The algorithm uses Smolyak polynomial function approximation in a way which makes it possible to exploit a high degree of parallelism.
    Keywords: Econometric modeling ; Mathematical and quantitative methods
    JEL: C63 C65 C60 C62
    Date: 2018–10–11
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2018-70&r=cmp
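    Item 10 builds on polynomial function approximation with explicit accuracy checks. As a much-simplified stand-in for the paper's Smolyak-based algorithm, this sketch interpolates a hypothetical one-dimensional policy rule at Chebyshev nodes and reports the sup-norm residual on a fine grid; it illustrates the spirit of error evaluation, not the paper's actual method:

        import numpy as np
        from numpy.polynomial import chebyshev as C

        policy = lambda k: k ** 0.33 * np.exp(-0.5 * k)   # hypothetical policy rule
        deg = 12
        # Chebyshev nodes on [-1, 1], mapped to the state space [0.05, 0.95].
        nodes = np.cos((2 * np.arange(deg + 1) + 1) * np.pi / (2 * (deg + 1)))
        k_nodes = 0.5 + 0.45 * nodes

        coeffs = C.chebfit(nodes, policy(k_nodes), deg)   # interpolation at nodes
        grid = np.linspace(-1, 1, 2001)
        err = np.abs(C.chebval(grid, coeffs) - policy(0.5 + 0.45 * grid))
        print(f"degree {deg} Chebyshev fit, sup-norm error: {err.max():.2e}")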
  11. By: Michael Mayer; Steven C. Bourassa; Martin Hoesli; Donato Scognamiglio
    Abstract: We use a rich data set consisting of 123,000 houses sold in Switzerland between 2004 and 2017 to investigate different automated valuation techniques in settings where the models are updated regularly. We apply six methods (linear regression, robust regression, mixed effects regression, gradient boosting, random forests, and neural networks) to both moving window and extending window models. With respect to the criteria of appraisal accuracy and stability, the preferred methods are robust regression using moving windows, gradient boosting using extending windows, or mixed effects regression for either strategy.
    Keywords: automated valuation; Machine Learning; Statistics
    JEL: R3
    Date: 2018–01–01
    URL: http://d.repec.org/n?u=RePEc:arz:wpaper:eres2018_40&r=cmp
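    The moving-window versus extending-window comparison in item 11 can be sketched as a rolling-origin evaluation: at each origin the models are refit, on either the last few periods or all history to date, and scored on the next period. Data and the two models (OLS and gradient boosting) are stand-ins for the paper's Swiss transactions and six-method comparison:

        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(4)
        n, periods = 3000, 10
        X = rng.normal(size=(n, 4))                   # size, age, quality, location
        period = np.repeat(np.arange(periods), n // periods)
        y = 12 + X @ [0.4, -0.1, 0.2, 0.3] + 0.03 * period + 0.2 * rng.normal(size=n)

        def fold_error(model, train, test):
            model.fit(X[train], y[train])
            return np.median(np.abs(model.predict(X[test]) - y[test]) / y[test])

        WINDOW = 3                                    # periods kept in moving scheme
        for name, model in [("OLS", LinearRegression()),
                            ("boosting", GradientBoostingRegressor(random_state=0))]:
            for scheme in ("moving", "extending"):
                errs = []
                for t in range(WINDOW, periods - 1):
                    lo = t - WINDOW if scheme == "moving" else 0
                    train = (period >= lo) & (period <= t)
                    errs.append(fold_error(model, train, period == t + 1))
                print(f"{name:9s} {scheme:9s} mean of median APE: {np.mean(errs):.4f}")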
  12. By: Jonathan Cogliano (Dickinson College); Roberto Veneziani (Queen Mary University of London); Naoki Yoshihara (School of Economics and Management, Kochi University of Technology)
    Abstract: This paper uses a computational framework to analyse the equilibrium dynamics of exploitation and inequality in accumulation economies with heterogeneous labour. A novel index is presented which measures the intensity of exploitation at the individual level, and the dynamics of the distribution of exploitation intensity are analysed. Various taxation schemes are analysed which may reduce exploitation or inequalities in income and wealth. It is shown that relatively small taxation rates may have significant cumulative effects on wealth and income inequalities. Further, taxation schemes that eliminate exploitation also reduce disparities in income and wealth but, in the presence of heterogeneous skills, do not necessarily eliminate them. The inegalitarian effects of different abilities need to be tackled with a progressive education policy that compensates for unfavourable circumstances.
    Keywords: Exploitation, heterogeneous labour, wealth taxes, computational methods
    JEL: B51 C63 D31
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:kch:wpaper:sdes-2018-14&r=cmp
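    The cumulative-taxation point in item 12 can be illustrated with a toy accumulation economy: heterogeneous agents earn skill-dependent income and save, and even a small annual wealth tax, redistributed lump-sum, noticeably lowers the wealth Gini after a few decades. This is a schematic stand-in, not the paper's model or its exploitation-intensity index:

        import numpy as np

        def gini(w):
            w = np.sort(w)
            n = len(w)
            return (2 * np.arange(1, n + 1) - n - 1) @ w / (n * w.sum())

        rng = np.random.default_rng(5)
        n, years, r, save = 1000, 40, 0.04, 0.2
        skills = rng.lognormal(0.0, 0.6, n)           # heterogeneous labour

        for tax in (0.0, 0.01, 0.02):                 # annual wealth tax rates
            wealth = np.ones(n)
            for _ in range(years):
                wealth = (1 + r) * wealth + save * skills   # accumulation
                levy = tax * wealth
                wealth = wealth - levy + levy.mean()        # lump-sum redistribution
            print(f"wealth tax {tax:4.0%}: Gini after {years}y = {gini(wealth):.3f}")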
  13. By: Roman Orus; Samuel Mugel; Enrique Lizaso
    Abstract: A key problem in financial mathematics is the forecasting of financial crashes: if we perturb asset prices, will financial institutions fail on a massive scale? This was recently shown to be a computationally intractable (NP-Hard) problem. Financial crashes are inherently difficult to predict, even for a regulator which has complete information about the financial system. In this paper we show how this problem can be handled by quantum annealers. More specifically, we map the equilibrium condition of a financial network to the ground-state problem of a spin-1/2 quantum Hamiltonian with 2-body interactions, i.e., a Quadratic Unconstrained Binary Optimization (QUBO) problem. The equilibrium market values of institutions after a sudden shock to the network can then be calculated via adiabatic quantum computation and, more generically, by quantum annealers. Our procedure can be implemented on near-term quantum processors, providing a potentially more efficient way to predict financial crashes.
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1810.07690&r=cmp
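    The target formulation in item 13 is a QUBO: minimize x'Qx over binary x. The mapping from a financial network is the paper's contribution and is not reproduced here; the sketch below only solves a toy four-variable instance by exhaustive search, which is the task a quantum annealer performs natively on far larger problems:

        import itertools
        import numpy as np

        Q = np.array([[-2.0, 1.0,  0.0, 0.5],        # illustrative QUBO matrix
                      [ 0.0, -1.0, 1.5, 0.0],
                      [ 0.0,  0.0, -3.0, 1.0],
                      [ 0.0,  0.0,  0.0, -0.5]])

        best_x, best_e = None, np.inf
        for bits in itertools.product([0, 1], repeat=4):   # ground-state search
            x = np.array(bits)
            e = x @ Q @ x
            if e < best_e:
                best_x, best_e = x, e
        print("ground state:", best_x, "energy:", best_e)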
  14. By: Joshua Chan; Arnaud Doucet; Roberto León-González; Rodney W. Strachan
    Abstract: This paper develops a new methodology that decomposes shocks into homoscedastic and heteroscedastic components. This specification implies there exist linear combinations of heteroscedastic variables that eliminate heteroscedasticity. That is, these linear combinations are homoscedastic, a property we call co-heteroscedasticity. The heteroscedastic part of the model uses a multivariate stochastic volatility inverse Wishart process. The resulting model is invariant to the ordering of the variables, which we show is important for impulse response analysis and, more generally, for volatility estimation and variance decompositions. The specification allows estimation in moderately high dimensions. The computational strategy uses a novel particle filter algorithm, a reparameterization that substantially improves algorithmic convergence, and an alternating-order particle Gibbs sampler that reduces the number of particles needed for accurate estimation. We provide two empirical applications: one to exchange rate data and another to a large Vector Autoregression (VAR) of US macroeconomic variables. We find strong evidence for co-heteroscedasticity and, in the second application, estimate the impact of monetary policy on the homoscedastic and heteroscedastic components of macroeconomic variables.
    Keywords: Markov Chain Monte Carlo, Gibbs Sampling, Flexible Parametric Model, Particle Filter, Co-heteroscedasticity, state-space, reparameterization, alternating-order
    JEL: C11 C15
    Date: 2018–10
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2018-52&r=cmp
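    The co-heteroscedasticity property defined in item 14 can be simulated directly: two series share one stochastic-volatility factor, so the linear combination whose loadings on that factor cancel is homoscedastic. Rolling standard deviations make the contrast visible. The loadings and volatility process below are illustrative, not the paper's inverse-Wishart specification:

        import numpy as np

        rng = np.random.default_rng(6)
        T = 4000
        logvol = np.zeros(T)
        for t in range(1, T):                          # AR(1) log-volatility
            logvol[t] = 0.98 * logvol[t - 1] + 0.1 * rng.normal()
        hetero = np.exp(logvol / 2) * rng.normal(size=T)   # heteroscedastic shock
        homo = rng.normal(size=T)                          # homoscedastic shock

        y1 = 1.0 * hetero + 0.5 * homo
        y2 = 2.0 * hetero + 0.3 * homo
        combo = y1 - 0.5 * y2                          # loadings on "hetero" cancel

        def rolling_sd(x, w=250):
            return np.array([x[i:i + w].std() for i in range(0, len(x) - w, w)])

        for name, series in [("y1", y1), ("y2", y2), ("combo", combo)]:
            sd = rolling_sd(series)
            print(f"{name:5s}: rolling-sd min {sd.min():.2f}, max {sd.max():.2f}")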
  15. By: Vasilev, Aleksandar
    Abstract: This paper explores the effects of fiscal policy in an economy with Epstein-Zin (1989, 1991) preferences, with indirect (consumption) taxes, and all (labor and capital) income being taxed at the same rate. To this end, a dynamic general-equilibrium model, calibrated to Bulgarian data (1999-2016), is augmented with a government sector. Two regimes are compared and contrasted - the exogenous (observed) vs. optimal policy (Ramsey) case. The focus of the paper is on the relative importance of consumption vs. income taxation, as well as on the provision of utility-enhancing public services. The main findings from the computational experiments performed in the paper are: (i) The optimal steady-state income tax rate is zero; (ii) The benevolent Ramsey planner provides the optimal amount of the utility-enhancing public services, which are now 25% higher; (iii) The optimal steady-state consumption tax needed to finance the optimal level of government spending is more than fifty percent higher, as compared to the exogenous policy case.
    Keywords: Epstein-Zin preferences,consumption tax,income tax,general equilibrium,optimal (Ramsey) fiscal policy,Bulgaria
    JEL: D58
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:esprep:183134&r=cmp
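    A back-of-the-envelope companion to item 15's first finding: in a standard neoclassical steady state, a capital-income tax distorts the Euler equation and lowers the capital stock, while a constant consumption tax does not, which is the textbook intuition behind a zero optimal steady-state income tax. Parameters are generic round numbers, not the paper's Bulgarian calibration:

        ALPHA, BETA, DELTA = 0.33, 0.96, 0.05

        def steady_state_k(tau_y):
            # Euler equation: beta * (1 + (1 - tau_y) * (alpha * k**(ALPHA-1) - delta)) = 1
            r_required = (1 / BETA - 1) / (1 - tau_y) + DELTA
            return (ALPHA / r_required) ** (1 / (1 - ALPHA))

        for tau_y in (0.0, 0.1, 0.2):
            print(f"income tax {tau_y:4.0%}: steady-state capital = "
                  f"{steady_state_k(tau_y):.3f}")
        # A constant consumption tax scales the price of consumption in every
        # period alike, so it drops out of the Euler equation and leaves the
        # steady-state capital stock unchanged.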

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.