nep-cmp New Economics Papers
on Computational Economics
Issue of 2017‒10‒15
fourteen papers chosen by



  1. Systematic sensitivity analysis of the full economic impacts of sea level rise By T. Chatzivasileiadis; F. Estrada; M. W. Hofkes; R. S. J. Tol
  2. High Frequency Market Making with Machine Learning By Matthew F Dixon
  3. Fundamentals unknown: Momentum, mean-reversion and price-to-earnings trading in an artificial stock market By Schasfoort, Joeri; Stockermans, Christopher
  4. Microsimulation tools for the evaluation of fiscal policy reforms at the Banco de España By Olympia Bover; José María Casado; Esteban García-Miralles; Roberto Ramos; José María Labeaga
  5. The Impact of Brexit on Foreign Investment and Production By Andrea Waddle; Ellen McGrattan
  6. ETLA macro model for forecasting and policy simulations By Lehmus, Markku
  7. Measuring the Efficiency of VAT reforms: Evidence from Slovakia By Andrej Cupák; Peter Tóth
  8. Policy experiments in an agent-based model with credit networks By Assenza, Tiziana; Cardaci, Alberto; Delli Gatti, Domenico; Grazzini, Jakob
  9. Endogenous Regime Switching Near the Zero Lower Bound By Lansing, Kevin J.
  10. Shared Mobility Simulations for Helsinki By ITF
  11. Maximizing the expected net present value of a project with phase-type distributed activity durations: an efficient globally optimal solution procedure By Stefan Creemers
  12. Quantum Path-Integral qPATHINT Algorithm By L. Ingber
  13. Sparse Portfolio Selection via the sorted $\ell_{1}$-Norm By Philipp J. Kremer; Sangkyun Lee; Malgorzata Bogdan; Sandra Paterlini
  14. Inference on Auctions with Weak Assumptions on Information By Vasilis Syrgkanis; Elie Tamer; Juba Ziani

  1. By: T. Chatzivasileiadis (Institute for Environmental Studies, Vrije Universiteit, Amsterdam); F. Estrada (Centro de Ciencias de la Atmosfera, Universidad Nacional Autonoma de Mexico, Ciudad Universitaria, Mexico; Institute for Environmental Studies, Vrije Universiteit, Amsterdam); M. W. Hofkes (Department of Economics, Vrije Universiteit, Amsterdam; Institute for Environmental Studies, Vrije Universiteit, Amsterdam; Department of Spatial Economics, Vrije Universiteit, Amsterdam); R. S. J. Tol (Institution Department of Economics, University of Sussex; Institute for Environmental Studies, Vrije Universiteit, Amsterdam; Department of Spatial Economics, Vrije Universiteit, Amsterdam; Tinbergen Institute, Amsterdam; CESifo, Munich)
    Abstract: The potential impacts of Sea Level Rise (SLR) due to climate change have been widely studied in the literature. However, the uncertainty and robustness of these estimates have seldom been explored. Here we assess the input uncertainty regarding the economy-wide effects of SLR on marine navigation from a global economic perspective. We systematically assess the robustness of Computable General Equilibrium (CGE) estimates to uncertainty in the model's inputs. Monte Carlo (MC) and Gaussian Quadrature (GQ) methods are used to conduct a Systematic Sensitivity Analysis (SSA). This design allows us both to explore the sensitivity of the CGE model and to compare the MC and GQ methods. Results show that, regardless of whether triangular or piecewise linear probability distributions are used, the welfare losses are higher in the MC SSA than in the original deterministic simulation. This indicates that the CGE economic literature has potentially underestimated the total economic effects of SLR, stressing the necessity of SSA when simulating the general equilibrium effects of SLR. The uncertainty decomposition shows that land losses have a smaller effect than capital and seaport productivity losses. Capital losses seem to affect the GDP of developed regions more than the productivity losses do. Moreover, we show the uncertainty decomposition of the MC results and discuss the convergence of the MC results for a decomposed version of the CGE model. This paper aims to provide standardised guidelines for stochastic simulation in the context of CGE modelling that could be useful for researchers in similar settings.
    Keywords: CGE, Sea Level Rise, Systematic Sensitivity Analysis, Monte Carlo, GTAP
    JEL: C68 Q54
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:sus:susewp:1617&r=cmp
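    A minimal Python sketch of the Monte Carlo SSA described in item 1, assuming a stylized stand-in welfare function in place of a full CGE solve; the triangular input distributions and the welfare mapping below are invented for illustration, not the authors' specification.

      # Illustrative Monte Carlo systematic sensitivity analysis (SSA).
      # 'cge_welfare' is a placeholder for a full CGE model run; the
      # triangular bounds are hypothetical, not the paper's values.
      import numpy as np

      rng = np.random.default_rng(42)

      def cge_welfare(land_loss, capital_loss, port_loss):
          # Stand-in for a CGE solve returning a welfare change (% of GDP).
          return -(0.1 * land_loss + 0.6 * capital_loss + 0.3 * port_loss)

      n = 10_000
      # Triangular(low, mode, high) draws for each uncertain SLR input (in %).
      land = rng.triangular(0.5, 1.0, 1.5, n)
      capital = rng.triangular(1.0, 2.0, 4.0, n)
      ports = rng.triangular(0.5, 1.5, 3.0, n)

      welfare = cge_welfare(land, capital, ports)  # vectorized over draws
      print(f"mean welfare change: {welfare.mean():.3f}")
      print(f"95% interval: [{np.percentile(welfare, 2.5):.3f}, "
            f"{np.percentile(welfare, 97.5):.3f}]")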
  2. By: Matthew F Dixon
    Abstract: High frequency trading has been characterized as an arms race with 'Red Queen' characteristics [Farmer, 2012]. It is improbable, if not impossible, that many market participants can sustain a competitive advantage through sole reliance on low-latency trade execution systems. The growth in the volume of market data, advances in computer hardware, and the commensurate prominence of machine learning in other disciplines have spurred the exploration of machine learning for price discovery. Even though the application of machine learning to price prediction has been extensively researched, the merit of this approach for high frequency market making has received little attention. This paper introduces a trade execution model to evaluate the economic impact of classifiers through backtesting. Extending the concept of the confusion matrix, we present a 'trade information matrix' to attribute the expected profit and loss of tick-level predictive classifiers under execution constraints, such as fill probabilities and position-dependent trade rules, to correct and incorrect predictions. We apply the execution model and trade information matrix to Level II E-mini S&P 500 futures history and demonstrate an estimation approach for measuring the sensitivity of the P&L to classification error. We describe the training of a recurrent neural network (RNN) and show that (i) there is little gain from re-training the model frequently; (ii) there are distinct intra-day trends in classifier performance; and (iii) classifier accuracy erodes quickly as the prediction horizon lengthens. Our findings suggest that our computationally tractable approach can be used to directly evaluate the performance sensitivity of a market making strategy to classifier error and can augment traditional testing based on market simulation.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1710.03870&r=cmp
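    One plausible reading of the 'trade information matrix' in item 2, sketched in Python: expected P&L is attributed to each (predicted, realized) cell of a confusion matrix and weighted by a fill probability. The counts, payoffs and fill probability are made up for illustration.

      # Schematic trade information matrix for a 3-class tick predictor.
      import numpy as np

      # Rows: predicted move (down, flat, up); columns: realized move.
      confusion = np.array([[120,  40,  20],
                            [ 50, 300,  60],
                            [ 15,  35, 140]])    # tick-level prediction counts
      payoff = np.array([[ 1.0, -0.5, -1.0],
                         [ 0.0,  0.0,  0.0],     # 'flat' signals trigger no trade
                         [-1.0, -0.5,  1.0]])    # P&L in ticks per filled trade
      fill_prob = 0.7                            # chance a limit order is filled

      trade_info = confusion * payoff * fill_prob   # expected P&L per cell
      print(trade_info)
      print(f"expected total P&L (ticks): {trade_info.sum():.1f}")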
  3. By: Schasfoort, Joeri; Stockermans, Christopher
    Abstract: The use of fundamentalist traders in stock market models is problematic, since fundamental values in the real world are unknown. Yet, in the literature to date, fundamentalists are often required to replicate key stylized facts. The authors present an agent-based model of the stock market in which the fundamental value of the asset is unknown. They start with a zero-intelligence stock market model with a limit order book. Then they add technical traders who switch between a simple momentum and a mean-reversion strategy depending on their relative profitability. Technical traders use the price-to-earnings ratio as a proxy for fundamentals: if it is too high or too low, they sell or buy, respectively.
    Keywords: Agent-based modelling, financial markets, technical and fundamental analysis, asset pricing
    JEL: C63 D53 D84 G12 G17
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201763&r=cmp
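    A toy version of the switching technical trader in item 3, assuming a synthetic price path and illustrative lags; the switching rule (adopt whichever strategy has been more profitable) follows the abstract, everything else is invented.

      # Technical trader switching between momentum and mean reversion.
      import numpy as np

      rng = np.random.default_rng(0)
      prices = 100 + np.cumsum(rng.normal(0, 1, 500))   # synthetic price path

      def momentum(p, t, lag=5):
          return np.sign(p[t] - p[t - lag])             # follow recent moves

      def mean_reversion(p, t, lag=5):
          return -np.sign(p[t] - p[t - lag])            # fade recent moves

      profit = {"momentum": 0.0, "mean_reversion": 0.0}
      for t in range(5, len(prices) - 1):
          ret = prices[t + 1] - prices[t]
          profit["momentum"] += momentum(prices, t) * ret
          profit["mean_reversion"] += mean_reversion(prices, t) * ret

      # The agent adopts whichever rule has been more profitable so far;
      # in the paper, extreme price-to-earnings ratios override this choice.
      active = max(profit, key=profit.get)
      print(f"realized profits: {profit}; active strategy: {active}")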
  4. By: Olympia Bover (Banco de España); José María Casado (Banco de España); Esteban García-Miralles (Banco de España); Roberto Ramos (Banco de España); José María Labeaga (UNED)
    Abstract: This paper presents the microsimulation models developed at the Banco de España for the study of fiscal reforms, describing both the tool used to evaluate changes in the Spanish personal income tax and the one covering the value added tax and excise duties. In both cases the structure, data and output of the model are detailed, and its capabilities are illustrated with simple examples of hypothetical tax reforms, presented purely to demonstrate the use of these simulation tools.
    Keywords: microsimulation, Spain, personal income tax, value added tax, excise duties
    JEL: C81 D12 H20
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:bde:opaper:1707&r=cmp
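    The core of a tax-benefit microsimulation of the kind described in item 4, sketched in Python under strong simplifications: apply baseline and reform tax schedules to weighted household microdata and compare revenue. The brackets, rates and sample records are invented.

      # Minimal personal income tax microsimulation (no behavioral response).
      import numpy as np

      incomes = np.array([12_000, 25_000, 40_000, 60_000, 95_000])  # microdata
      weights = np.array([1.2e6, 1.5e6, 1.1e6, 0.7e6, 0.3e6])       # survey weights

      def tax(income, brackets, rates):
          # Progressive schedule: 'rates' has one more entry than 'brackets'.
          owed, lower = 0.0, 0.0
          for upper, rate in zip(brackets + [float("inf")], rates):
              owed += rate * max(0.0, min(income, upper) - lower)
              lower = upper
          return owed

      baseline = sum(w * tax(y, [20_000, 60_000], [0.20, 0.30, 0.45])
                     for y, w in zip(incomes, weights))
      reform = sum(w * tax(y, [20_000, 60_000], [0.19, 0.30, 0.47])
                   for y, w in zip(incomes, weights))
      print(f"revenue change from reform: {reform - baseline:,.0f}")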
  5. By: Andrea Waddle (University of Richmond); Ellen McGrattan (University of Minnesota)
    Abstract: In this paper, we estimate the impact of increasing costs on foreign producers following a withdrawal of the United Kingdom from the European Union (popularly known as Brexit). Our predictions are based on simulations of a multicountry neoclassical growth model that includes multinational firms investing in research and development (R&D), brands, and other intangible capital that is used nonrivalrously by their subsidiaries at home and abroad. We analyze several post-Brexit scenarios. First, we assume that the United Kingdom unilaterally imposes tighter restrictions on foreign direct investment (FDI) from other E.U. nations. With less E.U. technology deployed in the United Kingdom, U.K. firms increase investment in their own R&D and other intangibles, which is costly, and welfare for U.K. citizens is lower. If the European Union remains open, its citizens enjoy a modest gain from the increased U.K. investment since it can be costlessly deployed in subsidiaries throughout Europe. If instead we assume that the European Union imposes the same restrictions on U.K. FDI, then E.U. firms invest more in their own R&D, benefiting the United Kingdom. With costs higher on both U.K. and E.U. FDI, we predict a significant fall in foreign investment and production by U.K. firms. The United Kingdom increases international lending, which finances the production of others both domestically and abroad, and inward FDI rises. U.K. consumption falls and leisure rises, implying a negligible impact on welfare. In the European Union, declines in investment and production are modest, but the welfare of E.U. citizens is significantly lower. Finally, if, during the transition, the United Kingdom reduces current restrictions on other major foreign investors, such as the United States and Japan, U.K. inward FDI and welfare both rise significantly.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:red:sed017:710&r=cmp
  6. By: Lehmus, Markku
    Abstract: This paper presents a review of a quarterly macroeconomic model built for forecasting and policy simulation purposes at the Research Institute of the Finnish Economy (ETLA). The ETLA model can be labelled a structural econometric macro model (also known as an “SEM” or “policy model” in the recent literature). It consists of 81 endogenous and 70 exogenous variables and hence, at this stage, is relatively small in size. The model encompasses Keynesian features in the short run, although particular attention is paid to its long-run equilibrium properties, which are defined from the supply side. Owing to these characteristics, its adjustment to external and policy shocks resembles the behavior of New Keynesian DSGE models with sticky prices and wages. The agents of the model are partly forward-looking.
    Date: 2017–10–02
    URL: http://d.repec.org/n?u=RePEc:rif:wpaper:54&r=cmp
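    A typical building block of such structural econometric models is an error-correction equation: Keynesian dynamics in the short run, with the level anchored to a supply-side long-run relationship. The equation below is a generic illustration, not taken from the ETLA model itself:

      \Delta c_t = \alpha_0 + \alpha_1 \Delta y_t - \gamma \left( c_{t-1} - \beta y_{t-1} \right) + \varepsilon_t

    where $c_t$ is log consumption, $y_t$ is log income, and $\gamma > 0$ governs the speed of adjustment back to the long-run relationship $c = \beta y$.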
  7. By: Andrej Cupák (National Bank of Slovakia); Peter Tóth (National Bank of Slovakia)
    Abstract: We estimate a demand system to simulate the welfare and fiscal impacts of the recent value added tax (VAT) cut on selected foods in Slovakia. We evaluate the efficiency of the tax cut vis-à-vis its hypothetical alternatives using the ratio of the welfare and fiscal impacts. Based on our findings, tax cuts tend to be more efficient if demand for a good is price-elastic or if the good has several complements. The results also indicate that cherry-picking from food sub-categories could have improved the efficiency of the recent tax change. Further, we identify potentially revenue-neutral, welfare-improving tax schemes: a reduced rate on foods financed by an increased rate on non-foods improves welfare for most food types. The paper contributes to the literature by demonstrating that standard approximate efficiency indicators of VAT reforms are biased relative to simulation-based results for any plausible size of tax change.
    Keywords: Consumer behavior; Demand system; QUAIDS; Value added tax; Tax reform; Efficiency; Optimal taxation; Slovakia
    JEL: D12 E21 H21 I31
    Date: 2017–09
    URL: http://d.repec.org/n?u=RePEc:svk:wpaper:1047&r=cmp
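    A back-of-envelope version of the efficiency ratio in item 7, assuming a single good with constant-elasticity demand; the paper instead estimates a full QUAIDS demand system, and all numbers below are illustrative.

      # Welfare gain per unit of lost VAT revenue from a rate cut.
      def vat_cut_efficiency(producer_price, q0, t0, t1, elasticity):
          p0 = producer_price * (1 + t0)        # consumer price before the cut
          p1 = producer_price * (1 + t1)        # consumer price after the cut
          q1 = q0 * (p1 / p0) ** elasticity     # constant-elasticity response
          welfare_gain = (p0 - p1) * 0.5 * (q0 + q1)   # trapezoid approx. of CS
          revenue_loss = t0 * producer_price * q0 - t1 * producer_price * q1
          return welfare_gain / revenue_loss

      # Cutting the VAT rate on a food category from 20% to 10%:
      ratio = vat_cut_efficiency(producer_price=1.0, q0=100.0,
                                 t0=0.20, t1=0.10, elasticity=-0.8)
      print(f"welfare gain per unit of lost revenue: {ratio:.2f}")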
  8. By: Assenza, Tiziana; Cardaci, Alberto; Delli Gatti, Domenico; Grazzini, Jakob
    Abstract: In this paper the authors build upon Assenza et al. (Credit networks in the macroeconomics from the bottom-up model, 2015), which introduces firm-bank and bank-bank networks into the original macroeconomic model of Macroeconomics from the Bottom-up (Delli Gatti et al., 2011). In particular, they extend that framework with the inclusion of a public sector and other modifications in order to carry out different policy experiments. More specifically, the authors test the implementation of monetary policy by means of a standard Taylor rule, an unconventional monetary policy (i.e. “cash in hand” transfers) and a set of macroprudential regulations, and they explore the properties of the model under these different scenarios. Their results shed some light on the effectiveness of monetary and macroprudential policies in an economy with an interbank market during times of crisis.
    Keywords: Agent-based models, monetary policy, credit network
    JEL: C63 E51 E52
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:ifwedp:201766&r=cmp
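    The conventional policy experiment in item 8 uses a standard Taylor rule; a minimal sketch, with textbook coefficients (1.5 on inflation, 0.5 on the output gap) rather than the paper's calibration:

      # Standard Taylor rule with a zero-lower-bound floor.
      def taylor_rule(inflation, output_gap, r_star=0.02, pi_target=0.02,
                      phi_pi=1.5, phi_y=0.5):
          rate = (r_star + inflation
                  + phi_pi * (inflation - pi_target) + phi_y * output_gap)
          return max(rate, 0.0)   # nominal rate cannot go below zero

      print(f"policy rate: {taylor_rule(inflation=0.03, output_gap=-0.01):.3%}")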
  9. By: Lansing, Kevin J. (Federal Reserve Bank of San Francisco)
    Abstract: This paper develops a New Keynesian model with a time-varying natural rate of interest (r-star) and a zero lower bound (ZLB) on the nominal interest rate. The representative agent contemplates the possibility of an occasionally binding ZLB that is driven by switching between two local rational expectations equilibria, labeled the "targeted" and "deflation" solutions, respectively. Sustained periods when the real interest rate remains below the central bank's estimate of r-star can induce the agent to place a substantially higher weight on the deflation equilibrium, causing it to occasionally become self-fulfilling. I solve for the time series of stochastic shocks and endogenous forecast rule weights that allow the model to exactly replicate the observed time paths of the U.S. output gap and quarterly inflation since 1988. In model simulations, raising the central bank's inflation target to 4% from 2% can reduce, but not eliminate, the endogenous switches to the deflation equilibrium.
    JEL: E31 E43 E52
    Date: 2017–09–28
    URL: http://d.repec.org/n?u=RePEc:fip:fedfwp:2017-24&r=cmp
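    A stylized illustration of the endogenous forecast-rule weighting in item 9, assuming a constant-gain, performance-based updating scheme; the numbers and the updating rule below are illustrative stand-ins for the paper's mechanism.

      # Weight on the 'targeted' vs. 'deflation' forecast rule, updated by
      # each rule's recent squared forecast errors.
      import numpy as np

      rng = np.random.default_rng(1)
      pi_target, pi_deflation = 0.02, -0.005   # inflation in the two equilibria

      weight = 0.95       # initial weight on the targeted rule
      gain = 0.05         # constant-gain learning parameter
      err_t = err_d = 1e-6

      for _ in range(200):
          forecast = weight * pi_target + (1 - weight) * pi_deflation
          realized = forecast + rng.normal(0, 0.005)   # noisy realized inflation
          err_t = (1 - gain) * err_t + gain * (realized - pi_target) ** 2
          err_d = (1 - gain) * err_d + gain * (realized - pi_deflation) ** 2
          weight = err_d / (err_t + err_d)   # better-performing rule gains weight

      print(f"final weight on targeted equilibrium: {weight:.2f}")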
  10. By: ITF
    Abstract: This report examines how the optimised use of new on-demand shared transport modes could change the future of mobility in the Helsinki Metropolitan Area in Finland. Based on simulation, it provides indicators for the impact of shared mobility solutions on accessibility, metro/rail ridership, required parking space, congestion and CO2 emissions. The model also analyses service quality, efficiency and cost competitiveness of the shared solutions. In addition, the report explores the willingness among the citizens of the Helsinki region to adopt shared mobility solutions based on focus group analysis. The findings provide an evidence base for decision makers to weigh opportunities and challenges created by new forms of shared transport services. The work is part of a series of studies on shared mobility in different urban and metropolitan contexts. This report is part of the International Transport Forum’s Case-Specific Policy Analysis series. These are topical studies on specific issues carried out by the ITF in agreement with local institutions.
    Date: 2017–10–12
    URL: http://d.repec.org/n?u=RePEc:oec:itfaac:39-en&r=cmp
  11. By: Stefan Creemers
    Abstract: We study projects with activities whose stochastic durations are modeled using phase-type distributions. Intermediate cash flows are incurred during the execution of the project, and a payoff is obtained upon completion of all project activities. Because activity durations are stochastic, activity starting times cannot be fixed at the start of the project; instead, we rely on a policy to schedule activities during the execution of the project. The optimal policy schedules activities such that the expected net present value of the project is maximized. We determine the optimal policy using a new continuous-time Markov chain and a backward stochastic dynamic program. Although the new continuous-time Markov chain drastically reduces memory requirements compared with existing methods, it also allows activities to be preempted, an assumption that is not always desirable. We demonstrate, however, that it is globally optimal not to preempt activities if certain conditions are met. A computational experiment confirms this finding and shows that we significantly outperform current state-of-the-art procedures: on average, we improve computational efficiency by a factor of 600 and reduce memory requirements by a factor of 321.
    Keywords: Project Scheduling, Project Management, NPV maximization, SNPV, Stochastic activity durations
    Date: 2017–09
    URL: http://d.repec.org/n?u=RePEc:ete:kbiper:592798&r=cmp
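    For intuition on the backward recursion in item 11: with exponential durations (the simplest phase-type case), the expected discount factor over one activity is E[exp(-rT)] = lambda/(lambda + r). The two-activity serial project below is a heavily simplified sketch, not the paper's procedure, and all values are invented.

      # Expected NPV of a two-activity serial project, exponential durations.
      r = 0.01                       # continuous discount rate per period
      lam1, lam2 = 0.5, 0.25         # completion rates of activities 1 and 2
      cash1, cash2 = -40.0, -60.0    # cash flows paid when each activity starts
      payoff = 150.0                 # terminal payoff at project completion

      def disc(lam, r):
          return lam / (lam + r)     # E[exp(-r*T)] for T ~ Exp(lam)

      # Backward recursion: value when only activity 2 remains, then at start.
      v_after_1 = cash2 + disc(lam2, r) * payoff
      npv = cash1 + disc(lam1, r) * v_after_1
      print(f"expected NPV: {npv:.2f}")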
  12. By: L. Ingber
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:lei:ingber:17qa&r=cmp
  13. By: Philipp J. Kremer; Sangkyun Lee; Malgorzata Bogdan; Sandra Paterlini
    Abstract: We introduce a financial portfolio optimization framework that allows us to automatically select the relevant assets and estimate their weights by relying on a sorted $\ell_1$-norm penalization, henceforth SLOPE. Our approach is able to group constituents with similar correlation properties and with the same underlying risk factor exposures. We show that by varying the intensity of the penalty, SLOPE can span the entire set of optimal portfolios on the risk-diversification frontier, from the minimum-variance portfolio to the equally weighted one. To solve the optimization problem, we develop a new efficient algorithm based on the Alternating Direction Method of Multipliers. Our empirical analysis shows that SLOPE yields optimal portfolios with good out-of-sample risk and return performance properties, reducing the overall turnover through more stable asset weight estimates. Moreover, using the automatic grouping property of SLOPE, new portfolio strategies, such as SLOPE-MV, can be developed to exploit the data-driven similarities detected across assets.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1710.02435&r=cmp
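    The sorted $\ell_1$ (SLOPE) penalty in item 13 pairs the largest portfolio weight in absolute value with the largest penalty coefficient, the second largest with the second largest, and so on. A minimal evaluation in Python; the lambda sequence below is illustrative (the paper's solver is based on ADMM):

      # Evaluate the SLOPE penalty for a candidate portfolio.
      import numpy as np

      def slope_penalty(w, lambdas):
          # lambdas must be sorted in decreasing order.
          return float(np.sort(np.abs(w))[::-1] @ lambdas)

      w = np.array([0.40, -0.25, 0.20, 0.10, 0.05])   # candidate weights
      lambdas = np.linspace(1.0, 0.2, len(w))         # decreasing penalties
      print(f"SLOPE penalty: {slope_penalty(w, lambdas):.3f}")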
  14. By: Vasilis Syrgkanis; Elie Tamer; Juba Ziani
    Abstract: Given a sample of bids from independent auctions, this paper examines the question of inference on auction fundamentals (e.g. valuation distributions, welfare measures) under weak assumptions on the information structure. The question is important as it allows us to learn about the valuation distribution in a robust way, i.e., without assuming that a particular information structure holds across observations. We leverage recent contributions in the robust mechanism design literature that exploit the link between Bayesian Correlated Equilibria and Bayesian Nash Equilibria in incomplete information games to construct an econometric framework for learning about auction fundamentals using observed data on bids. We showcase our construction of identified sets in private value and common value auctions. Our approach for constructing these sets inherits the computational simplicity of solving for correlated equilibria: checking whether a particular valuation distribution belongs to the identified set is as simple as determining whether a linear program is feasible. A similar linear program can be used to construct the identified set on various welfare measures and counterfactual objects. For inference and to summarize statistical uncertainty, we propose novel finite sample methods based on tail inequalities to construct confidence regions on sets, and we also highlight methods based on the Bayesian bootstrap and subsampling. A set of Monte Carlo experiments shows adequate finite sample properties of our inference procedures. We also illustrate our methods using data from OCS auctions.
    Date: 2017–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1710.03830&r=cmp
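    The abstract of item 14 reduces membership in the identified set to a linear programming feasibility check. The sketch below is a deliberately simplified stand-in: it only imposes the two marginal constraints on a joint (valuation, bid) distribution and omits the Bayesian Correlated Equilibrium obedience inequalities that the paper's actual test requires.

      # Simplified LP feasibility check with scipy.
      import numpy as np
      from scipy.optimize import linprog

      vals, bids = [1, 2, 3], [0, 1, 2]             # discretized supports
      obs_bid_marginal = np.array([0.2, 0.5, 0.3])  # estimated from bid data
      cand_val_marginal = np.array([0.3, 0.4, 0.3]) # candidate to be tested

      n_v, n_b = len(vals), len(bids)
      A_eq, b_eq = [], []
      for j in range(n_b):                          # bid marginals match data
          row = np.zeros(n_v * n_b); row[j::n_b] = 1.0
          A_eq.append(row); b_eq.append(obs_bid_marginal[j])
      for i in range(n_v):                          # valuation marginals match
          row = np.zeros(n_v * n_b); row[i * n_b:(i + 1) * n_b] = 1.0
          A_eq.append(row); b_eq.append(cand_val_marginal[i])

      res = linprog(c=np.zeros(n_v * n_b), A_eq=np.array(A_eq),
                    b_eq=np.array(b_eq), bounds=[(0, 1)] * (n_v * n_b))
      print("candidate in (simplified) identified set:", res.status == 0)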

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.