nep-cmp New Economics Papers
on Computational Economics
Issue of 2018‒08‒13
nine papers chosen by



  1. Thresholded ConvNet Ensembles: Neural Networks for Technical Forecasting By Sid Ghoshal; Stephen J. Roberts
  2. Robustness Analysis of a Website Categorization Procedure based on Machine Learning By Renato Bruni; Gianpiero Bianchi
  3. Towards incorporating natural capital into a computable general equilibrium model for Scotland By Grant Allan; David Comerford; Peter McGregor
  4. Stabilizing an Unstable Complex Economy By Isabelle Salle; Pascal Seppecher
  5. Optimal Portfolio in Intraday Electricity Markets Modelled by Lévy-Ornstein-Uhlenbeck Processes By Marco Piccirilli; Tiziano Vargiolu
  6. Toward a New Microfounded Macroeconomics in the Wake of the Crisis By Eugenio Caverzasi; Alberto Russo
  7. Simulating the potential of swarm grids for pre-electrified communities - A case study from Yemen By Hoffmann, Martha M.; Ansari, Dawud
  8. A Machine Learning Approach to the Forecast Combination Puzzle By Antoine Mandel; Amir Sani
  9. Betas, Benchmarks and Beating the Market By Zura Kakushadze; Willie Yu

  1. By: Sid Ghoshal; Stephen J. Roberts
    Abstract: Much of modern practice in financial forecasting relies on technicals, an umbrella term for several heuristics applying visual pattern recognition to price charts. Despite its ubiquity in financial media, the reliability of its signals remains a contentious and highly subjective form of 'domain knowledge'. We investigate the predictive value of patterns in financial time series, applying machine learning and signal processing techniques to 22 years of US equity data. By reframing technical analysis as a poorly specified, arbitrarily preset feature-extractive layer in a deep neural network, we show that better convolutional filters can be learned directly from the data, and provide visual representations of the features being identified. We find that an ensemble of shallow, thresholded CNNs optimised over different resolutions achieves state-of-the-art performance in this domain, outperforming technical methods while retaining some of their interpretability.
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1807.03192&r=cmp
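    A minimal numpy sketch of the core idea in this entry: one shallow convolutional layer with a hard threshold, pooled and then averaged across resolutions. The filter values, threshold, pooling, and ensemble rule below are illustrative assumptions, not the authors' specification.

        import numpy as np

        def conv1d(x, kernel):
            # valid cross-correlation of a price window with a learned filter
            n, k = len(x), len(kernel)
            return np.array([x[i:i + k] @ kernel for i in range(n - k + 1)])

        def thresholded_cnn_score(prices, filters, threshold=0.0):
            # one shallow layer: convolve, hard-threshold, global-average-pool
            feats = [np.maximum(conv1d(prices, f) - threshold, 0.0).mean()
                     for f in filters]
            return np.tanh(sum(feats))  # squashed directional score

        def ensemble_forecast(prices, filter_banks, windows):
            # average the scores of shallow CNNs applied at different resolutions
            scores = [thresholded_cnn_score(prices[-w:], fb)
                      for fb, w in zip(filter_banks, windows)]
            return float(np.mean(scores))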
  2. By: Renato Bruni (Department of Computer, Control and Management Engineering Antonio Ruberti (DIAG), University of Rome La Sapienza, Rome, Italy); Gianpiero Bianchi (Direzione centrale per la metodologia e disegno dei processi statistici (DCME), Italian National Institute of Statistics (Istat), Rome, Italy)
    Abstract: Website categorization has recently emerged as a very important task in several contexts. A huge amount of information is freely available through websites, and it could be used to accomplish statistical surveys, saving the cost of the surveys, or to validate already surveyed data. However, the information of interest for the specific categorization has to be mined from that huge amount, which turns out to be a difficult task in practice. This work describes techniques that can be used to convert website categorization into a supervised classification problem. To do so, each data record should summarize the content of an entire website. We generate this kind of records by using web scraping and optical character recognition, followed by a number of automated feature engineering steps. Once such records have been produced, we apply state-of-the-art classification techniques to categorize the websites according to the aspect of interest. We use Support Vector Machines, Random Forest and Logistic classifiers. Since in many applications the labels available for the training set may be noisy, we analyze the robustness of our procedure with respect to the presence of misclassified training records. We present results on real-world data for the problem of the detection of websites providing e-commerce facilities.
    Keywords: Classification; Machine Learning; Feature Engineering; Text
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:aeg:report:2018-04&r=cmp
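    A hedged scikit-learn sketch of the robustness experiment described above: train the three classifiers on text features and measure cross-validated accuracy as a growing fraction of training labels is flipped. Feature settings and noise rates are illustrative assumptions.

        import numpy as np
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import LinearSVC
        from sklearn.model_selection import cross_val_score

        def flip_labels(y, rate, rng):
            # emulate misclassified training records by flipping binary labels
            y = y.copy()
            idx = rng.choice(len(y), size=int(rate * len(y)), replace=False)
            y[idx] = 1 - y[idx]
            return y

        def robustness_curves(texts, labels, rates=(0.0, 0.05, 0.1, 0.2), seed=0):
            rng = np.random.default_rng(seed)
            X = TfidfVectorizer(max_features=5000).fit_transform(texts)
            y = np.asarray(labels)
            models = {"svm": LinearSVC(),
                      "random_forest": RandomForestClassifier(n_estimators=200),
                      "logistic": LogisticRegression(max_iter=1000)}
            return {name: [cross_val_score(model, X, flip_labels(y, r, rng), cv=5).mean()
                           for r in rates]
                    for name, model in models.items()}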
  3. By: Grant Allan (Department of Economics, University of Strathclyde); David Comerford (Department of Economics, University of Strathclyde); Peter McGregor (Department of Economics, University of Strathclyde)
    Abstract: Natural capital encompasses those assets which are provided by nature and which are valued by economic actors. As such, there is a clear analogy between natural and other assets, such as physical capital, which are routinely included in models of national economies. However, to the extent that natural assets are included in such economic models at all, their valuation is typically wrapped up with physical capital and land values, or they are not valued at all. This could simply be a measurement problem - natural capital might be difficult to disaggregate appropriately from other capital - or it may arise because natural assets provide non-market goods which are not included within traditional measures of economic output. The purpose of this paper is to set out - both conceptually and practically - how natural capital can be added to a computable general equilibrium (CGE) model. We focus on: the conceptual differences that should be reflected in such an extension; the challenges of implementing the extension in practice; and identifying the value added generated by an appropriately augmented model. We explore the empirical implementation of our approach through the addition of carbon emissions and an agricultural biomass ecosystem service flow to our CGE model of the Scottish economy. This working paper specifies the model development but does not go as far as fully implementing it. When fully implemented in a CGE model with a disaggregated agriculture sector, this will allow us simultaneously to track the impact of disturbances, including policy changes, on the economy and the environment, and therefore on sustainable development. In the longer term, comprehensive coverage of natural capital stocks and ecosystem services will allow us to track the impact of disturbances, including policy interventions, on Green GDP and Genuine Savings, as well as on aggregate and sectoral economic activity, energy use and emissions.
    Keywords: Natural capital, computable general equilibrium models
    JEL: Q57 Q1 C68
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1808&r=cmp
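    As a toy illustration of the conceptual point, the sketch below adds an ecosystem service flow E as a third input to a Cobb-Douglas production function. The functional form and parameter values are assumptions for exposition, not the paper's specification.

        def output(K, L, E, alpha=0.3, beta=0.1, A=1.0):
            # Cobb-Douglas value added with an ecosystem service flow E;
            # if E carries a zero market price it is invisible in the national
            # accounts, which is the measurement problem the paper discusses
            return A * (K ** alpha) * (E ** beta) * (L ** (1.0 - alpha - beta))

        def shadow_price(K, L, E, alpha=0.3, beta=0.1, A=1.0):
            # marginal value of the ecosystem service: dY/dE = beta * Y / E
            return beta * output(K, L, E, alpha, beta, A) / E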
  4. By: Isabelle Salle (Utrecht School of Economics - Utrecht University [Utrecht]); Pascal Seppecher (CEPN - Centre d'Economie de l'Université Paris Nord - UP13 - Université Paris 13 - USPC - Université Sorbonne Paris Cité - CNRS - Centre National de la Recherche Scientifique)
    Abstract: This paper analyzes a range of alternative specifications of the interest rate policy rule within a macroeconomic, stock-flow consistent, agent-based model. In this model, firms' leverage strategies evolve under the selection pressure of market competition. The resulting process of collective adaptation generates endogenous booms and busts along credit cycles. As feedback loops on aggregate demand affect the goods and the labor markets, the real and the financial sides of the economy are closely interconnected. The baseline scenario is able to qualitatively reproduce a wide range of stylized facts, and to match quantitative orders of magnitude of the main economic indicators. We find that, despite the implementation of credit and balance-sheet-related prudential policies, the emerging dynamics feature strong instability. Targeting movements in the net worth of firms helps dampen the credit cycles and simultaneously reduces financial and macroeconomic volatility, but it does not eliminate the occurrence of financial crises, which come with high costs in terms of unemployment.
    Keywords: Agent-based modeling, Credit cycles, Monetary and Macroprudential policies, Leaning against the wind
    Date: 2017–05–25
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-01527740&r=cmp
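    A minimal sketch of the kind of augmented policy rule the paper experiments with: a Taylor-type rate rule with an extra "leaning against the wind" term reacting to movements in firm net worth. Coefficients and functional form are illustrative assumptions, not the model's calibration.

        def policy_rate(inflation, inflation_target, networth_growth, networth_trend,
                        r_neutral=0.02, phi_pi=1.5, phi_nw=0.5):
            # phi_nw = 0.0 recovers a standard inflation-targeting rule;
            # phi_nw > 0 leans against booms in aggregate firm net worth
            return max(0.0, r_neutral
                       + phi_pi * (inflation - inflation_target)
                       + phi_nw * (networth_growth - networth_trend))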
  5. By: Marco Piccirilli; Tiziano Vargiolu
    Abstract: We study an optimal portfolio problem designed for an agent operating in intraday electricity markets. The investor is allowed to trade in a single risky asset, modelling the continuously traded power, and aims to maximize the expected terminal utility of his wealth. We assume a mean-reverting additive process to drive the power prices. In the case of logarithmic utility, we reduce the fully non-linear Hamilton-Jacobi-Bellman equation to a linear parabolic integro-differential equation, for which we explicitly exhibit a classical solution in two cases of modelling interest. The optimal strategy is given implicitly as the solution of an integral equation, which can be solved numerically as well as described analytically. An analysis of two different approximations for the optimal policy is provided. Finally, we perform a numerical test by adapting the parameters of a popular electricity spot price model.
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1807.01979&r=cmp
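    A hedged simulation sketch of a mean-reverting additive price process of the general type assumed here: an Ornstein-Uhlenbeck diffusion plus compound-Poisson spikes, discretized by an Euler scheme. All parameter values are illustrative placeholders.

        import numpy as np

        def simulate_levy_ou(x0=40.0, theta=5.0, mu=40.0, sigma=4.0,
                             jump_rate=10.0, jump_scale=8.0, T=1.0, n=1000, seed=0):
            # dX = theta * (mu - X) dt + sigma dW + dJ, with J a compound
            # Poisson process with exponential jump sizes (price spikes)
            rng = np.random.default_rng(seed)
            dt = T / n
            x = np.empty(n + 1)
            x[0] = x0
            for i in range(n):
                jump = rng.exponential(jump_scale) if rng.random() < jump_rate * dt else 0.0
                x[i + 1] = (x[i] + theta * (mu - x[i]) * dt
                            + sigma * np.sqrt(dt) * rng.standard_normal() + jump)
            return x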
  6. By: Eugenio Caverzasi; Alberto Russo
    Abstract: The Great Recession that followed the financial crisis of 2007 is not only the largest economic crisis since the Great Depression of the 1930s; it also signals a crisis of economics as a discipline. This is not only a consequence of the inadequacy of mainstream macroeconomics, and specifically of the DSGE workhorse model, to forecast such a huge event, or at least to detect the worrying tendencies towards it. Even more relevant is the deliberate choice to avoid modelling large crises (which for some justifies not faulting pre-crisis DSGE models for focusing on small deviations from the steady state), thereby denying the intrinsic nature of capitalism as a system that necessarily proceeds through cycles and (extended) crises. The replies of the DSGE approach to its critics have led to extensions regarding, for instance, the role of financial frictions, heterogeneous agents, and bounded rationality (though typically in the form of quasi-rational expectations). The alternative paradigm of Agent-Based Macroeconomics can take all these elements into account at once within an evolutionary modelling framework based on heterogeneity and interaction, and is thus capable of endogenously reproducing complex dynamics, from small fluctuations to large crises, driven by innovation and industrial dynamics, rising inequality and financial instability, and so on. The integration of Agent-Based Macroeconomics with the (post-Keynesian) Stock-Flow Consistent approach represents a promising way forward for this research field.
    Date: 2018–08–01
    URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2018/23&r=cmp
  7. By: Hoffmann, Martha M.; Ansari, Dawud
    Abstract: Swarm grids are an emerging approach to electrification in the Global South that interconnects individual household generation and storage into a small electricity network, making full use of existing generation capacities. Using a simulation tool for demand, weather, and power flows, we analyse the potential of an AC swarm grid for a large pre-electrified village in rural Yemen. Service quality and financial indicators are compared to the cases of individual supply and a centralised micro grid. While the swarm grid would, in fact, improve supply security from currently 12.4 % (Tier 2) to 81.7 % (Tier 3) at lower levelised costs, it would be inferior to the micro grid in both service (Tier 4) and costs. This is mainly driven by the large pre-installed fossil-fuel generator and storage capacities in our case study, although this situation may well be representative of other relevant locations. Under these conditions, a swarm grid risks creating (possibly undesired) incentives to invest in diesel generators, and it may fail to support prosumerism effectively. Nevertheless, the swarm's evolutionary nature, with the possibility of staggered investments (e.g. in smaller yet complementary groups of consumers), is a central advantage over micro grids in the short-term alleviation of energy poverty.
    Keywords: Swarm electrification; swarm grid; micro grid; energy access; distributed generation; Yemen
    JEL: C63 O13 O18 Q42 Q49
    Date: 2018–07–23
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:88166&r=cmp
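    Since the comparison above turns on levelised costs, a small sketch of that calculation may help; the discounting convention below is a standard one, and the inputs are placeholders rather than the study's data.

        def lcoe(capex, annual_opex, annual_energy_kwh, lifetime_years, discount_rate):
            # levelised cost of electricity: discounted lifetime cost per
            # discounted kWh delivered over the system lifetime
            disc = [(1.0 + discount_rate) ** -t for t in range(1, lifetime_years + 1)]
            cost = capex + sum(annual_opex * d for d in disc)
            energy = sum(annual_energy_kwh * d for d in disc)
            return cost / energy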
  8. By: Antoine Mandel (CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique); Amir Sani (CFM-Imperial Institute of Quantitative Finance - Imperial College London, CES - Centre d'économie de la Sorbonne - UP1 - Université Panthéon-Sorbonne - CNRS - Centre National de la Recherche Scientifique)
    Abstract: Forecast combination algorithms provide a robust solution to noisy data and shifting process dynamics. However, in practice, sophisticated combination methods often fail to consistently outperform the simple mean combination. This "forecast combination puzzle" limits the adoption of alternative combination approaches and forecasting algorithms by policy-makers. Through an adaptive machine learning algorithm designed for streaming data, this paper proposes a novel time-varying forecast combination approach that retains distribution-free guarantees in performance while automatically adapting combinations according to the performance of any selected combination approach or forecaster. In particular, the proposed algorithm offers policy-makers the ability to compute the worst-case loss with respect to the mean combination ex ante, while also guaranteeing that the combination performance is never worse than this explicit guarantee. Theoretical bounds are reported with respect to the relative mean squared forecast error. Out-of-sample empirical performance is evaluated on the Stock and Watson seven-country dataset and the ECB Survey of Professional Forecasters.
    Keywords: Forecasting, Forecast Combination Puzzle, Forecast Combinations, Machine Learning, Econometrics
    Date: 2017–04–19
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:halshs-01317974&r=cmp
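    A generic online-learning sketch in the spirit of the proposed approach (not the authors' exact algorithm): a multiplicative-weights combination whose weights adapt to each forecaster's squared-error loss through the stream, to be compared against the simple mean combination.

        import numpy as np

        def adaptive_combination(forecasts, outcomes, eta=0.1):
            # forecasts: (T, K) array, one column per forecaster; outcomes: (T,)
            T, K = forecasts.shape
            w = np.ones(K) / K
            combined = np.empty(T)
            for t in range(T):
                combined[t] = w @ forecasts[t]          # combine before seeing y_t
                losses = (forecasts[t] - outcomes[t]) ** 2
                w *= np.exp(-eta * losses)              # downweight poor forecasters
                w /= w.sum()
            return combined  # compare its MSE with forecasts.mean(axis=1)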
  9. By: Zura Kakushadze; Willie Yu
    Abstract: We give an explicit formulaic algorithm and source code for building long-only benchmark portfolios and then using these benchmarks in long-only market outperformance strategies. The benchmarks (or the corresponding betas) do not involve any principal components, nor do they require iterations. Instead, we use a multifactor risk model (which utilizes multilevel industry classification or clustering) specifically tailored to long-only benchmark portfolios to compute their weights, which are explicitly positive in our construction.
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1807.09919&r=cmp
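    A hedged stand-in for the construction described: long-only weights proportional to inverse total variance under a multifactor risk model, which are positive by design and involve no principal components or iterations. The inverse-variance rule is an illustrative simplification, not the paper's explicit formula.

        import numpy as np

        def benchmark_weights(specific_var, loadings, factor_cov):
            # total variance per stock under the factor model:
            # var_i = specific_var_i + beta_i' F beta_i
            total_var = specific_var + np.einsum('if,fg,ig->i',
                                                 loadings, factor_cov, loadings)
            w = 1.0 / total_var   # strictly positive weights
            return w / w.sum()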

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.