nep-cmp New Economics Papers
on Computational Economics
Issue of 2019‒10‒28
twelve papers chosen by



  1. Digestate Evaporation Treatment in Biogas Plants: A Techno-economic Assessment by Monte Carlo, Neural Networks and Decision Trees By Vondra, Marek; Touš, Michal; Teng, Sin Yong
  2. Catalytic thermal degradation of Chlorella Vulgaris: Evolving deep neural networks for optimization By Teng, Sin Yong; Loy, Adrian Chun Minh; Leong, Wei Dong; How, Bing Shen; Chin, Bridgid Lai Fui; Máša, Vítězslav
  3. Anomaly Detection in High Dimensional Data By Priyanga Dilini Talagala; Rob J Hyndman; Kate Smith-Miles
  4. Mesoscale impact of trader psychology on stock markets: a multi-agent AI approach By J. Lussange; S. Palminteri; S. Bourgeois-Gironde; B. Gutkin
  5. Forecasting Observables with Particle Filters: Any Filter Will Do! By Patrick Leung; Catherine S. Forbes; Gael M Martin; Brendan McCabe
  6. How is Machine Learning Useful for Macroeconomic Forecasting? By Philippe Goulet Coulombe; Maxime Leroux; Dalibor Stevanovic; Stéphane Surprenant
  7. Adaptive-Aggressive Traders Don't Dominate By Daniel Snashall; Dave Cliff
  8. ZRP Routing Protocol Performance Improvement using Fuzzy based Radius Approach By Nassir Harrag; Abdelghani Harrag
  9. A new unit root analysis for testing hysteresis in unemployment By Yaya, OlaOluwa S; Ogbonna, Ephraim A; Furuoka, Fumitaka; Gil-Alana, Luis A.
  10. Crisis transmission: visualizing vulnerability By Dungey, Mardi; Islam, Raisul; Volkov, Vladimir
  11. The Wrong Kind of AI? Artificial Intelligence and the Future of Labor Demand By Acemoglu, Daron; Restrepo, Pascual
  12. Beating the House: Identifying Inefficiencies in Sports Betting Markets By Sathya Ramesh; Ragib Mostofa; Marco Bornstein; John Dobelman

  1. By: Vondra, Marek; Touš, Michal; Teng, Sin Yong
    Abstract: Biogas production is one of the most promising pathways toward fully utilizing green energy within a circular economy. Anaerobic digestion is the industry-standard technology for biogas production due to its low energy consumption and its reliance on microbiology. Even in such an environmentally friendly process, liquid digestate is still produced from the remains of digested bio-feedstock and requires treatment. With an unsuitable treatment procedure for liquid digestate, the mass of bio-feedstock can escape the circular supply chain within the economy. This paper recommends the implementation of evaporator systems as a sustainable liquid digestate treatment mechanism. The studied evaporator systems combine vacuum evaporation with an ammonia scrubber, stripping, and reverse osmosis. Nevertheless, stakeholders must make complex multi-dimensional decisions before implementing such systems. Our work utilizes a novel techno-economic model to study the robustness of implementing recent state-of-the-art vacuum evaporation systems that exploit waste heat from combined heat and power (CHP) units in biogas plants (BGP). To account for the stochasticity of the real world and ensure the robustness of the analysis, we used Monte Carlo simulation to generate more than 20,000 different scenarios for the implementation of the evaporation system. Favourable decision pathways are then selected using a novel methodology that combines an artificial neural network with a hyper-optimized decision tree classifier. Two pathways that give the highest probability of a fast payback period are identified. Descriptive statistics are also used to analyse the distributions of decision parameters that lead to a successful evaporator system implementation. The results highlight that integration of an evaporation system is favourable when transport costs and incentives for CHP units are high, while feed-in tariffs for electricity production and specific investment costs are low. This work is expected to pave the way for BGP stakeholders and decision makers in implementing liquid digestate treatment technologies within existing infrastructure.
    Keywords: Anaerobic Digestion; Machine Learning; Vacuum Evaporation; Liquid Digestate; Biogas Plant; Energy Consumption; Nutrient Recovery; Circular economy; Ammonium sulphate solution
    JEL: C0 C1 C6 C8 E0 E2 E3 E6 L1 L6 L9
    Date: 2019–09–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:95770&r=all
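    Sketch: a minimal Python illustration of the Monte Carlo plus decision-tree screening idea described in this abstract. The parameter ranges, the toy payback formula, and the three-year "fast payback" cutoff are illustrative assumptions, not the authors' techno-economic model.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      n = 20_000  # number of Monte Carlo scenarios, as in the abstract

      # Sample uncertain techno-economic inputs (assumed ranges).
      invest = rng.uniform(0.5e6, 3.0e6, n)      # investment cost [EUR]
      transport = rng.uniform(5.0, 25.0, n)      # digestate transport cost [EUR/t]
      chp_incentive = rng.uniform(0.0, 0.05, n)  # CHP incentive [EUR/kWh]
      feed_in = rng.uniform(0.05, 0.20, n)       # electricity feed-in tariff [EUR/kWh]

      # Toy annual-saving model: evaporation saves transport cost and earns
      # CHP incentives, but competes with electricity sold at the tariff.
      saving = 40_000 * transport + 8e6 * chp_incentive - 4e6 * feed_in
      payback = np.where(saving > 0, invest / saving, np.inf)
      fast = (payback < 3.0).astype(int)  # label "fast payback" scenarios

      # Fit a shallow tree to recover interpretable decision pathways.
      X = np.column_stack([invest, transport, chp_incentive, feed_in])
      tree = DecisionTreeClassifier(max_depth=3).fit(X, fast)
      print(f"share of fast-payback scenarios: {fast.mean():.2%}")
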
  2. By: Teng, Sin Yong; Loy, Adrian Chun Minh; Leong, Wei Dong; How, Bing Shen; Chin, Bridgid Lai Fui; Máša, Vítězslav
    Abstract: The aim of this study is to identify the optimum thermal conversion of Chlorella vulgaris with a neuro-evolutionary approach. A Progressive Depth Swarm-Evolution (PDSE) neuro-evolutionary approach is proposed to model thermogravimetric analysis (TGA) data from the catalytic thermal degradation of Chlorella vulgaris. Results showed that the proposed method generates more accurate predictions than conventional approaches (>90% lower Root Mean Square Error (RMSE) and Mean Bias Error (MBE)). In addition, Simulated Annealing is proposed to determine the optimal operating conditions for microalgae conversion from multiple trained ANNs. The predicted optimum conditions were a reaction temperature of 900.0 °C and a heating rate of 5.0 °C/min in the presence of an HZSM-5 zeolite catalyst, yielding 88.3% conversion of Chlorella vulgaris.
    Keywords: Microalgae; Thermogravimetric analysis; Artificial neural network; Particle swarm optimization; Simulated Annealing
    JEL: C0 C1 C6 C8 C9 Q2 Q3 Q4 Q5
    Date: 2019–09–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:95772&r=all
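    Sketch: a minimal Python illustration of using simulated annealing to search for the operating conditions that maximise conversion predicted by a trained surrogate, as this abstract describes. The surrogate function, variable bounds and cooling schedule are illustrative assumptions standing in for the paper's PDSE-trained networks.

      import math
      import random

      random.seed(0)

      def predicted_conversion(temp_c, rate):
          # Stand-in for an ensemble of trained ANNs: a smooth toy response
          # peaking at high temperature and a slow heating rate.
          return 90 - 0.0001 * (900 - temp_c) ** 2 - 0.5 * (rate - 5) ** 2

      def anneal(steps=5000, t0=10.0):
          x = [600.0, 20.0]  # initial (temperature degC, heating rate degC/min)
          best = x[:]
          for k in range(steps):
              t = t0 * (1 - k / steps) + 1e-6  # linear cooling schedule
              cand = [min(max(x[0] + random.gauss(0, 20), 300), 900),
                      min(max(x[1] + random.gauss(0, 2), 5), 40)]
              delta = predicted_conversion(*cand) - predicted_conversion(*x)
              # Accept uphill moves always, downhill with probability e^(delta/t).
              if delta > 0 or random.random() < math.exp(delta / t):
                  x = cand
              if predicted_conversion(*x) > predicted_conversion(*best):
                  best = x[:]
          return best

      print(anneal())  # tends toward roughly (900 degC, 5 degC/min)
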
  3. By: Priyanga Dilini Talagala; Rob J Hyndman; Kate Smith-Miles
    Abstract: The HDoutliers algorithm is a powerful unsupervised algorithm for detecting anomalies in high-dimensional data, with a strong theoretical foundation. However, it suffers from some limitations that significantly hinder its performance under certain circumstances. In this article, we propose an algorithm that addresses these limitations. We define an anomaly as an observation that deviates markedly from the majority, with a large distance gap. An approach based on extreme value theory is used to calculate the anomalous threshold. Using various synthetic and real datasets, we demonstrate the wide applicability and usefulness of our algorithm, which we call the stray algorithm. We also demonstrate how this algorithm can assist in detecting anomalies present in other data structures using feature engineering. We show the situations where the stray algorithm outperforms the HDoutliers algorithm in both accuracy and computational time. This framework is implemented in the open-source R package stray.
    Keywords: Data stream, high-dimensional data, nearest neighbour searching, unsupervised outlier detection
    JEL: C1 C8 C55
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2019-20&r=all
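    Sketch: a simplified Python illustration of the distance-gap idea behind the stray algorithm: score each point by its nearest-neighbour distance and flag points above a large gap in the upper tail. The actual package uses k-nearest neighbours and an extreme-value-theory threshold; the crude gap rule below is a stand-in.

      import numpy as np

      def flag_outliers(X, tail=0.10):
          # Pairwise distances, then each point's nearest-neighbour distance.
          d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          nn = d.min(axis=1)

          order = np.argsort(nn)
          scores = nn[order]
          start = int(len(scores) * (1 - tail))  # search the upper tail only
          gaps = np.diff(scores[start:])
          out = np.zeros(len(X), dtype=bool)
          if gaps.size and gaps.max() > 0:
              cut = start + gaps.argmax() + 1    # first index above the gap
              out[order[cut:]] = True
          return out

      rng = np.random.default_rng(1)
      X = np.vstack([rng.normal(0, 1, (200, 5)), rng.normal(8, 1, (3, 5))])
      print(np.where(flag_outliers(X))[0])  # likely flags indices 200-202
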
  4. By: J. Lussange; S. Palminteri; S. Bourgeois-Gironde; B. Gutkin
    Abstract: Recent advances in machine learning and neurofinance have opened exciting research perspectives for the practical inference of behavioural economics in financial markets and market microstructure. We present the latest results from a recently published stock market simulator built around a multi-agent system architecture, in which each agent is an autonomous investor trading stocks by reinforcement learning (RL) via a centralised double-auction limit order book. The RL framework allows for the implementation of specific behavioural and cognitive traits known from trader psychology, and thus for studying the impact of these traits on the whole stock market at the mesoscale. More precisely, we narrowed our agent design to three psychological biases known to have a direct correspondence with RL theory, namely delay discounting, greed, and fear. We compared the ensuing simulated data to real stock market data over the past decade or so, and found that market stability benefits from larger populations of agents prone to delay discounting and, most astonishingly, to greed.
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1910.10099&r=all
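    Sketch: a minimal Python illustration of how the three traits named in this abstract can map onto reinforcement-learning parameters: delay discounting as the discount factor, greed as an optimistic weighting of gains, and fear as an extra penalty on losses. The mapping and numbers are illustrative assumptions, not the authors' agent design.

      import random
      from collections import defaultdict

      ACTIONS = ("buy", "hold", "sell")

      class TraitTrader:
          def __init__(self, discount=0.9, greed=1.0, fear=1.0, lr=0.1, eps=0.1):
              self.q = defaultdict(float)          # Q-values over (state, action)
              self.gamma = discount                # delay discounting
              self.greed, self.fear = greed, fear  # asymmetric reward weighting
              self.lr, self.eps = lr, eps

          def act(self, state):
              if random.random() < self.eps:       # epsilon-greedy exploration
                  return random.choice(ACTIONS)
              return max(ACTIONS, key=lambda a: self.q[(state, a)])

          def learn(self, state, action, pnl, next_state):
              # Greedy agents overweight gains; fearful agents overweight losses.
              reward = self.greed * pnl if pnl > 0 else self.fear * pnl
              target = reward + self.gamma * max(self.q[(next_state, a)]
                                                 for a in ACTIONS)
              self.q[(state, action)] += self.lr * (target - self.q[(state, action)])

      trader = TraitTrader(discount=0.8, greed=1.5)  # impatient, greedy agent
      trader.learn("bull", trader.act("bull"), pnl=1.0, next_state="bull")
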
  5. By: Patrick Leung; Catherine S. Forbes; Gael M Martin; Brendan McCabe
    Abstract: We investigate the impact of filter choice on forecast accuracy in state space models. The filters are used both to estimate the posterior distribution of the parameters, via a particle marginal Metropolis-Hastings (PMMH) algorithm, and to produce draws from the filtered distribution of the final state. Multiple filters are entertained, including two new data-driven methods. Simulation exercises are used to document the performance of each PMMH algorithm, in terms of computation time and the efficiency of the chain. We then produce the forecast distributions for the one-step-ahead value of the observed variable, using a fixed number of particles and Markov chain draws. Despite distinct differences in efficiency, the filters yield virtually identical forecasting accuracy, with this result holding under both correct and incorrect specification of the model. This invariance of forecast performance to the specification of the filter also characterizes an empirical analysis of S&P500 daily returns.
    Keywords: Bayesian prediction, particle MCMC, non-Gaussian time series, state space models, unbiased likelihood estimation, sequential Monte Carlo
    JEL: C11 C22 C58
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2019-22&r=all
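    Sketch: a Python skeleton of particle marginal Metropolis-Hastings (PMMH), where a bootstrap particle filter supplies an unbiased likelihood estimate that is plugged into a random-walk Metropolis chain. The local-level model, flat prior and tuning constants are illustrative assumptions, not the paper's specifications.

      import numpy as np

      rng = np.random.default_rng(2)

      def pf_loglik(y, sigma, n_part=200):
          # Bootstrap filter for a local-level model: x_t = x_{t-1} + sigma*eps_t,
          # y_t = x_t + nu_t with unit observation noise.
          x = rng.normal(0, 1, n_part)
          ll = 0.0
          for yt in y:
              x = x + rng.normal(0, sigma, n_part)      # propagate particles
              logw = -0.5 * (yt - x) ** 2               # obs density (up to a constant)
              m = logw.max()
              w = np.exp(logw - m)
              ll += m + np.log(w.mean())                # log-likelihood increment
              x = rng.choice(x, n_part, p=w / w.sum())  # multinomial resampling
          return ll

      y = np.cumsum(rng.normal(0, 0.5, 100)) + rng.normal(0, 1, 100)  # fake data
      sigma, ll = 1.0, pf_loglik(y, 1.0)
      for _ in range(500):                              # PMMH chain
          prop = abs(sigma + rng.normal(0, 0.1))        # reflected random walk
          ll_prop = pf_loglik(y, prop)
          if np.log(rng.uniform()) < ll_prop - ll:      # accept/reject (flat prior)
              sigma, ll = prop, ll_prop
      print(f"final draw for sigma: {sigma:.3f}")
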
  6. By: Philippe Goulet Coulombe; Maxime Leroux; Dalibor Stevanovic; Stéphane Surprenant
    Abstract: We move beyond “Is Machine Learning Useful for Macroeconomic Forecasting?” by adding the how. The forecasting literature to date has focused on matching specific variables and horizons with a particularly successful algorithm. In contrast, we study a wide range of horizons and variables and learn about the usefulness of the underlying features driving ML gains over standard macroeconometric methods. We distinguish four such features (nonlinearities, regularization, cross-validation and alternative loss functions) and study their behavior in both data-rich and data-poor environments. To do so, we carefully design a series of experiments that make it easy to identify the “treatment” effects of interest. We conclude that (i) more data and nonlinearities are true game-changers for macroeconomic prediction, (ii) the standard factor model remains the best regularization, (iii) cross-validations are not all made equal (but K-fold is as good as BIC) and (iv) one should stick with the standard L2 loss. The forecasting gains of nonlinear techniques are associated with high macroeconomic uncertainty, financial stress and housing bubble bursts. This suggests that Machine Learning is useful for macroeconomic forecasting mostly by capturing important nonlinearities that arise in the context of uncertainty and financial frictions.
    Keywords: Machine Learning, Big Data, Forecasting
    JEL: C53 C55 E37
    Date: 2019–10–17
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2019s-22&r=all
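    Sketch: a toy Python comparison in the spirit of this paper's experiments: the same design with a linear benchmark swapped for a nonlinear learner, with tuning parameters chosen by K-fold cross-validation. The data and models are stand-ins, not the authors' macro dataset or exact specifications.

      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import Ridge
      from sklearn.model_selection import GridSearchCV, KFold

      rng = np.random.default_rng(3)
      X = rng.normal(size=(300, 20))                 # "data-rich" predictor set
      y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + rng.normal(0, 0.3, 300)

      cv = KFold(n_splits=5)                         # K-fold cross-validation
      linear = GridSearchCV(Ridge(), {"alpha": [0.1, 1.0, 10.0]}, cv=cv)
      nonlin = GridSearchCV(RandomForestRegressor(n_estimators=200, random_state=0),
                            {"max_depth": [3, 6, None]}, cv=cv)

      X_tr, X_te, y_tr, y_te = X[:250], X[250:], y[:250], y[250:]
      for name, model in [("ridge", linear), ("forest", nonlin)]:
          model.fit(X_tr, y_tr)
          rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
          print(f"{name}: out-of-sample RMSE = {rmse:.3f}")  # nonlinearity pays off
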
  7. By: Daniel Snashall; Dave Cliff
    Abstract: For more than a decade, Vytelingum's Adaptive-Aggressive (AA) algorithm has been recognized as the best-performing automated auction-market trading-agent strategy known in the AI/Agents literature; in this paper, we demonstrate that it is in fact routinely outperformed by another algorithm when exhaustively tested across a sufficiently wide range of market scenarios. The novel step taken here is to use large-scale compute facilities to exhaustively evaluate AA by brute force in a variety of market environments based on those used for testing it in the original publications. Our results show that even in these simple environments AA is consistently outperformed by IBM's GDX algorithm, first published in 2002. We summarize here results from more than one million market simulation experiments, orders of magnitude more testing than was reported in the original publications that first introduced AA. A 2019 ICAART paper by Cliff claimed that AA's failings were revealed by testing it in more realistic experiments, with conditions closer to those found in real financial markets, but here we demonstrate that even under the simple experimental conditions used in the original AA papers, exhaustive testing shows AA to be outperformed by GDX. We close this paper with a discussion of the methodological implications of our work: any results from previous papers where one trading algorithm is claimed to be superior to others on the basis of only a few thousand trials should probably now be treated with some suspicion. The rise of cloud computing means that the compute power necessary to subject trading algorithms to millions of trials over a wide range of conditions is readily available at reasonable cost: we should make use of it, and exhaustive testing such as that shown here should be the norm in future evaluations and comparisons of new trading algorithms.
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1910.09947&r=all
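    Sketch: a Python illustration of the methodological point here: evaluate competing strategies over a grid of market scenarios with many repeated trials, rather than a few thousand runs in total. The two "strategies" and the one-line market model are toy stand-ins; real AA and GDX implementations are far more involved.

      import itertools
      import random
      import statistics

      rnd = random.Random(0)

      def session_profit(strategy, supply_slope, demand_slope, noise):
          # Toy stand-in for one double-auction session's profit draw.
          return strategy(supply_slope, demand_slope) + rnd.gauss(0, noise)

      strat_a = lambda s, d: 0.10 * s - 0.05 * d  # placeholder for "AA"
      strat_b = lambda s, d: 0.08 * s - 0.02 * d  # placeholder for "GDX"

      scenarios = list(itertools.product([0.5, 1.0, 2.0],   # supply slopes
                                         [0.5, 1.0, 2.0],   # demand slopes
                                         [0.5, 1.0]))       # noise levels
      wins = 0
      for s, d, n in scenarios:
          pa = [session_profit(strat_a, s, d, n) for _ in range(5000)]
          pb = [session_profit(strat_b, s, d, n) for _ in range(5000)]
          wins += statistics.mean(pa) > statistics.mean(pb)
      print(f"scenarios won by A: {wins}/{len(scenarios)}")  # 18 x 10,000 trials
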
  8. By: Nassir Harrag (Ferhat Abbas University Setif 1); Abdelghani Harrag (Ferhat Abbas University Setif 1)
    Abstract: Mobile ad hoc networks (MANETs) are becoming a major emerging technology in the field of telecommunication networks. The mobility of nodes in a MANET causes local connections to change frequently and the network topology to change constantly. To keep local connections up to date and track neighbour relationships between nodes, each node broadcasts Hello packets at regular intervals, which can cause unnecessary traffic in the wireless network and reduce performance when the topology changes frequently. This paper proposes a fuzzy-based radius approach to improve the performance of the ZRP routing protocol. The proposed fuzzy logic approach uses node energy and speed as inputs and produces the zone radius as output. Simulation results obtained with the NS-2 simulator show that the proposed fuzzy radius approach outperforms the standard ZRP routing protocol on all considered metrics, while also reducing energy consumption.
    Keywords: Ad hoc, MANET, Protocol, Routing, ZRP, Fuzzy logic, Radius
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:sek:iacpro:9412026&r=all
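    Sketch: a minimal Python illustration of a fuzzy radius controller in the spirit of this paper: node energy and speed are fuzzified with triangular membership functions, a small rule base maps them to a zone radius, and the output is defuzzified by a weighted average. The membership shapes, rules and radius values are illustrative assumptions, not the authors' design.

      def tri(x, a, b, c):
          # Triangular membership function peaking at b.
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fuzzy_radius(energy, speed):
          # Fuzzify the inputs (energy in %, speed in m/s).
          e_low, e_high = tri(energy, -1, 0, 60), tri(energy, 40, 100, 101)
          s_low, s_high = tri(speed, -1, 0, 12), tri(speed, 8, 20, 21)

          # Rule base: high energy and low speed support a larger zone radius.
          rules = [
              (min(e_high, s_low), 4),   # stable, well-powered node
              (min(e_high, s_high), 2),
              (min(e_low, s_low), 2),
              (min(e_low, s_high), 1),   # fast, low-energy node
          ]
          num = sum(w * r for w, r in rules)
          den = sum(w for w, _ in rules)
          return round(num / den) if den else 2  # default when no rule fires

      print(fuzzy_radius(energy=80, speed=3))   # large radius (4)
      print(fuzzy_radius(energy=20, speed=15))  # small radius (1)
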
  9. By: Yaya, OlaOluwa S; Ogbonna, Ephraim A; Furuoka, Fumitaka; Gil-Alana, Luis A.
    Abstract: This paper proposes a nonlinear unit root test based on the artificial neural network-augmented Dickey-Fuller (ANN-ADF) test for testing hysteresis in unemployment. In this new unit root test, the linear, quadratic and cubic components of the neural network process are used to capture the nonlinearity in the time-series data. Fractional integration methods, based on linear and nonlinear trends, are also used in the paper. For five European countries, namely France, Italy, the Netherlands, Sweden, and the United Kingdom, the empirical findings indicate that hysteresis persists. Among the battery of unit root tests applied, both the ANN-ADF and fractional integration tests fail to reject the hypothesis of unemployment hysteresis in all the countries.
    Keywords: Unit root process; Nonlinearity; Neural network; Time-series; Hysteresis; Unemployment; Europe; Labour market
    JEL: C22
    Date: 2019–10–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:96621&r=all
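    Sketch: a simplified Python illustration of a nonlinearity-augmented Dickey-Fuller regression: the first difference of the series is regressed on the lagged level plus its quadratic and cubic components, and the t-statistic on the linear term is inspected. This stands in for the paper's ANN-ADF construction, whose exact form and critical values differ.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      y = np.cumsum(rng.normal(0, 1, 300))  # simulated unit-root series

      dy = np.diff(y)                       # delta y_t
      lag = y[:-1]                          # y_{t-1}
      X = sm.add_constant(np.column_stack([lag, lag ** 2, lag ** 3]))
      res = sm.OLS(dy, X).fit()
      # Compare against nonstandard Dickey-Fuller-type critical values,
      # not the usual normal ones.
      print(f"t-stat on y(t-1): {res.tvalues[1]:.2f}")
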
  10. By: Dungey, Mardi (Tasmanian School of Business & Economics, University of Tasmania); Islam, Raisul (Tasmanian School of Business & Economics, University of Tasmania); Volkov, Vladimir (Tasmanian School of Business & Economics, University of Tasmania)
    Abstract: This paper develops a means of visualizing the vulnerability of complex systems of financial interactions around the globe using neural network clustering techniques. We show how time-varying spillover indices can be translated into two-dimensional crisis maps. The crisis maps have the advantage of showing the changing paths of vulnerability, including the direction and extent of the effect between source and affected markets. Using equity market data for 31 global markets over 1998-2017, we provide these crisis maps. These tools help portfolio managers and policy makers distinguish which of the available tools for crisis management will be most appropriate for the form of vulnerability in play.
    Keywords: systemic risk, networks
    JEL: C3 C32 C45 C53 D85 G10
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:tas:wpaper:31661&r=all
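    Sketch: a minimal self-organizing map (SOM) in Python, one common neural network clustering technique for projecting high-dimensional spillover measures onto a two-dimensional grid, as in a crisis map. The paper's exact clustering method and data are not reproduced; the toy "spillover" vectors below are assumptions.

      import numpy as np

      rng = np.random.default_rng(5)

      def train_som(data, grid=(8, 8), epochs=20, lr0=0.5, sigma0=3.0):
          h, w = grid
          codebook = rng.normal(size=(h, w, data.shape[1]))
          coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                        indexing="ij"), axis=-1)
          step, n_steps = 0, epochs * len(data)
          for _ in range(epochs):
              for x in rng.permutation(data):
                  frac = step / n_steps
                  lr = lr0 * (1 - frac)              # decaying learning rate
                  sigma = sigma0 * (1 - frac) + 0.5  # shrinking neighbourhood
                  # Best-matching unit, then neighbourhood-weighted update.
                  bmu = np.unravel_index(
                      np.argmin(((codebook - x) ** 2).sum(-1)), (h, w))
                  dist2 = ((coords - np.array(bmu)) ** 2).sum(-1)
                  nbh = np.exp(-dist2 / (2 * sigma ** 2))
                  codebook += lr * nbh[..., None] * (x - codebook)
                  step += 1
          return codebook

      spill = rng.normal(size=(31, 10))  # toy spillover profiles for 31 markets
      som = train_som(spill)
      cells = [np.unravel_index(np.argmin(((som - s) ** 2).sum(-1)), som.shape[:2])
               for s in spill]
      print(cells[:5])  # nearby grid cells indicate similar vulnerability profiles
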
  11. By: Acemoglu, Daron (MIT); Restrepo, Pascual (Boston University)
    Abstract: Artificial Intelligence is set to influence every aspect of our lives, not least the way production is organized. AI, as a technology platform, can automate tasks previously performed by labor or create new tasks and activities in which humans can be productively employed. Recent technological change has been biased towards automation, with insufficient focus on creating new tasks where labor can be productively employed. The consequences of this choice have been stagnating labor demand, declining labor share in national income, rising inequality and lower productivity growth. The current tendency is to develop AI in the direction of further automation, but this might mean missing out on the promise of the "right" kind of AI with better economic and social outcomes.
    Keywords: automation, artificial intelligence, jobs, inequality, innovation, labor demand, productivity, tasks, technology, wages
    JEL: J23 J24
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp12704&r=all
  12. By: Sathya Ramesh; Ragib Mostofa; Marco Bornstein; John Dobelman
    Abstract: Inefficient markets allow investors to consistently outperform the market. To demonstrate that inefficiencies exist in sports betting markets, we created a betting algorithm that generates above-market returns for the NFL, NBA, NCAAF, NCAAB, and WNBA betting markets. To formulate our betting strategy, we collected and examined a novel dataset of bets, and created a non-parametric win probability model to find positive expected value situations. As the United States Supreme Court has recently repealed the federal ban on sports betting, research on sports betting markets is increasingly relevant for the growing sports betting industry.
    Date: 2019–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1910.08858&r=all
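    Sketch: a worked Python example of the positive-expected-value logic behind such a betting strategy: bet only when the model's win probability implies positive expected value at the bookmaker's odds. The probabilities and odds are made-up numbers, not outputs of the authors' model.

      def american_to_decimal(odds):
          # +150 pays 2.5x the stake in total; -200 pays 1.5x the stake.
          return 1 + (odds / 100 if odds > 0 else 100 / -odds)

      def expected_value(p_win, odds, stake=1.0):
          payout = american_to_decimal(odds) * stake
          return p_win * (payout - stake) - (1 - p_win) * stake

      # Model gives a 45% win chance; the book offers +150 (implied ~40%).
      ev = expected_value(p_win=0.45, odds=150)
      print(f"EV per $1 staked: {ev:+.3f}")  # 0.45*1.5 - 0.55 = +0.125 -> bet
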

General information on the NEP project can be found at https://nep.repec.org. For comments, please write to the director of NEP, Marco Novarese, at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.