nep-cmp New Economics Papers
on Computational Economics
Issue of 2019‒05‒27
twenty papers chosen by



  1. The Design and Regulation of High Frequency Traders By Daniel Ladley
  2. Trade Protectionism and US Manufacturing Employment By Chunding Li; Jing Wang; John Whalley
  3. Predicting Pulmonary Function Testing from Quantified Computed Tomography Using Machine Learning Algorithms in Patients with COPD By Gawlitza, Joshua; Sturm, Timo; Spohrer, Kai; Henzler, Thomas; Akin, Ibrahim; Schönberg, Stefan; Borggrefe, Martin; Haubenreisser, Holger; Trinkmann, Frederik
  4. Improving warehouse responsiveness by job priority management By Kim, T.Y.
  5. Conformal Prediction Interval Estimations with an Application to Day-Ahead and Intraday Power Markets By Christopher Kath; Florian Ziel
  6. Predicting and Forecasting the Price of Constituents and Index of Cryptocurrency Using Machine Learning By Reaz Chowdhury; M. Arifur Rahman; M. Sohel Rahman; M. R. C. Mahdy
  7. Essays on reporting and information processing By de Kok, Ties
  8. Deep Learning–based Eco-driving System for Battery Electric Vehicles By Wu, Guoyuan; Ye, Fei; Hao, Peng; Esaid, Danial; Boriboonsomsin, Kanok; Barth, Matthew J.
  9. Time Series Analysis and Forecasting of the US Housing Starts using Econometric and Machine Learning Model By Sudiksha Joshi
  10. Fast Security Constraint Unit Commitment by Utilizing Chaotic Crow Search Algorithm By Patel, Abhishek; Anand, Rajesh
  11. Hedging crop yields against weather uncertainties -- a weather derivative perspective By Samuel Asante Gyamerah; Philip Ngare; Dennis Ikpe
  12. Using Spreadsheet-defined Rules for Reasoning in Self-Adaptive Systems By Krupitzer, Christian; Drechsel, Guido; Mateja, Deborah; Pollklasener, Alina; Schrage, Florian; Sturm, Timo; Tomasovic, Aleksandar; Becker, Christian
  13. Transforming Naturally Occurring Text Data Into Economic Statistics: The Case of Online Job Vacancy Postings By Arthur Turrell; Bradley J. Speigner; Jyldyz Djumalieva; David Copple; James Thurgood
  14. Simulating U.S. Business Cash Flow Taxation in a 17-Region Global Model By Seth G. Benzell; Laurence J. Kotlikoff; Guillermo Lagarda; Yifan Ye
  15. Machine Learning Tree and Exact Integration for Pricing American Options in High Dimension By Ludovic Goudenège; Andrea Molent; Antonino Zanette
  16. Convolutional Feature Extraction and Neural Arithmetic Logic Units for Stock Prediction By Shangeth Rajaa; Jajati Keshari Sahoo
  17. Sustainable Investing and the Cross-Section of Maximum Drawdown By Lisa R. Goldberg; Saad Mouti
  18. The IAB-INCHER project of earned doctorates (IIPED): A supervised machine learning approach to identify doctorate recipients in the German integrated employment biography data By Heinisch, Dominik; Koenig, Johannes; Otto, Anne
  19. The Informational Content of the Term-Spread in Forecasting the U.S. Inflation Rate: A Nonlinear Approach By Gogas, Periklis; Papadimitriou, Theophilos; Plakandaras, Vasilios; Gupta, Rangan
  20. A probabilistic interpretation of the constant gain algorithm By Berardi, Michele

  1. By: Daniel Ladley
    Abstract: Central to the ability of a high frequency trader to make money is speed. In order to be first to trading opportunities, firms invest in the fastest hardware and the shortest connections between their machines and the markets. This, however, is not enough: algorithms must be short, no more than a few lines of code. As a result there is a trade-off in the design of optimal HFT strategies: being the fastest necessitates being less sophisticated. To understand the effect of this tension, a computational model is presented that captures latency, both of code execution and of information transmission. Trading algorithms are modelled through genetic programmes, with longer programmes allowing more sophisticated decisions at the cost of slower execution times. It is shown that, depending on the market composition, short fast strategies and slower more sophisticated strategies may both be viable and may exploit different trading opportunities. The relative profits of these different approaches vary; slow traders, however, benefit from their presence. A suite of regulations for managing the risks associated with high frequency trading is tested: the majority are found to be ineffective, but constraining the ratio of orders to trades may be promising.
    Keywords: Finance, Genetic Programming, High Frequency Trading, Strategy Design, Regulation
    Date: 2019–03
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:19/02&r=all
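    The speed-sophistication trade-off at the heart of this model can be illustrated with a toy calculation. The sketch below is not the paper's genetic-programming model; it simply assumes, for illustration, that reaction time grows linearly with program length, so a sophisticated strategy's larger edge must be weighed against a lower chance of arriving first.

      import random

      random.seed(0)

      def arrival_time(program_length, wire_latency=1.0, per_line_cost=0.5):
          """Reaction time: transmission latency plus code-execution time."""
          return wire_latency + per_line_cost * program_length

      def expected_profit(edge, program_length, rival_lengths):
          """Edge times the chance of beating a randomly drawn rival to the trade."""
          t = arrival_time(program_length)
          p_first = sum(t < arrival_time(r) for r in rival_lengths) / len(rival_lengths)
          return edge * p_first

      rival_lengths = [random.randint(1, 25) for _ in range(1000)]
      print(expected_profit(edge=1.0, program_length=2, rival_lengths=rival_lengths))   # short and fast
      print(expected_profit(edge=3.0, program_length=20, rival_lengths=rival_lengths))  # long and sophisticated

    Depending on the distribution of rival program lengths, either strategy can come out ahead, which mirrors the coexistence result in the abstract.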
  2. By: Chunding Li; Jing Wang; John Whalley
    Abstract: This paper uses a numerical global general equilibrium model to simulate the possible effects of US-initiated trade protection measures on US manufacturing employment. The simulation results show that US trade protection measures do not increase but instead reduce manufacturing employment, and that US losses increase further if trade partners take retaliatory measures. The mechanism is that although the substitution effects between domestic and foreign goods have positive impacts, the substitution effects between manufacturing and service sectors and the retaliatory effects both have negative influences; the net effect is that the US loses manufacturing employment.
    JEL: C68 F16
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:25860&r=all
  3. By: Gawlitza, Joshua; Sturm, Timo; Spohrer, Kai; Henzler, Thomas; Akin, Ibrahim; Schönberg, Stefan; Borggrefe, Martin; Haubenreisser, Holger; Trinkmann, Frederik
    Abstract: Introduction: Quantitative computed tomography (qCT) is an emergent technique for diagnostics and research in patients with chronic obstructive pulmonary disease (COPD). qCT parameters demonstrate a correlation with pulmonary function tests and symptoms. However, qCT only provides anatomical, not functional, information. We evaluated five distinct, partially machine learning-based mathematical models to predict lung function parameters from qCT values in comparison with pulmonary function tests. Methods: 75 patients with diagnosed COPD underwent body plethysmography and a dose-optimized qCT examination on a third-generation, dual-source CT with inspiration and expiration. Delta values (inspiration minus expiration) were calculated afterwards. Four parameters were quantified: mean lung density, lung volume, low-attenuated volume, and full width at half maximum. Five models were evaluated for best prediction: average prediction, median prediction, k-nearest neighbours (kNN), gradient boosting, and multilayer perceptron. Results: The lowest mean relative error (MRE) was calculated for the kNN model, at 16%. Similarly low MREs were found for polynomial regression as well as gradient boosting-based prediction. Other models led to higher MREs and thereby worse predictive performance. Beyond the sole MRE, distinct differences in prediction performance, dependent on the initial dataset (expiration, inspiration, delta), were found. Conclusion: Different, partially machine learning-based models allow the prediction of lung function values from static qCT parameters within a reasonable margin of error. Therefore, qCT parameters may contain more information than we currently utilize and can potentially augment standard functional lung testing.
    Date: 2019–03–21
    URL: http://d.repec.org/n?u=RePEc:dar:wpaper:113226&r=all
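    As a rough illustration of the best-performing model class here, the sketch below fits a k-nearest-neighbours regressor to synthetic stand-ins for the four qCT parameters and scores it by mean relative error. The feature values, target, and choice of k are assumptions for illustration, not the study's data or settings.

      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(42)
      # Stand-in qCT features: mean lung density, lung volume, low-attenuated
      # volume, and full width at half maximum (75 patients, as in the study)
      X = rng.normal(size=(75, 4))
      y = 3.0 + X @ np.array([0.5, 0.8, -0.4, 0.2]) + rng.normal(scale=0.3, size=75)

      knn = KNeighborsRegressor(n_neighbors=5)
      pred = cross_val_predict(knn, X, y, cv=5)
      mre = np.mean(np.abs(pred - y) / np.abs(y))   # mean relative error (MRE)
      print(f"MRE: {mre:.1%}")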
  4. By: Kim, T.Y.
    Abstract: Warehouses employ order cut-off times to ensure sufficient time for fulfilment. To satisfy higher consumer expectations, these cut-off times are gradually being postponed to improve order responsiveness. Warehouses therefore have to allocate jobs more efficiently to meet compressed response times. Priority job management by means of flow-shop models has been used mainly for manufacturing systems but can also be applied to warehouse job scheduling to accommodate tighter cut-off times. This study investigates which priority rule performs best under which circumstances. The performance of each rule is evaluated in terms of a common cost criterion that integrates the objectives of low earliness, low tardiness, low labour idleness, and low work-in-process stocks. A real-world case study of a warehouse distribution centre of an original equipment manufacturer in consumer electronics provides the input parameters for a simulation study. The simulation outcomes validate several strategies for improved responsiveness. In particular, the critical ratio rule yields the fastest flow times and performs best for warehouse scenarios with expensive products and high labour costs.
    Keywords: responsiveness, queuing model, order fulfilment, cut-off operation, flow-shop scheduling
    Date: 2018–01–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:112492&r=all
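    The critical ratio rule singled out in the conclusion is a standard dispatching heuristic: each job's remaining time until its cut-off is divided by its remaining processing time, and the job with the lowest ratio goes first. A minimal sketch, with made-up job data:

      from dataclasses import dataclass

      @dataclass
      class Job:
          name: str
          due: float    # time remaining until the order cut-off
          work: float   # remaining processing time

      def critical_ratio(job: Job, now: float = 0.0) -> float:
          return (job.due - now) / job.work

      jobs = [Job("A", due=4.0, work=1.0), Job("B", due=3.0, work=2.0),
              Job("C", due=6.0, work=5.0)]
      for job in sorted(jobs, key=critical_ratio):   # lowest ratio dispatched first
          print(job.name, round(critical_ratio(job), 2))

    A ratio below 1 flags a job that cannot meet its cut-off without expediting.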
  5. By: Christopher Kath; Florian Ziel
    Abstract: In this paper we discuss a concept denoted as Conformal Prediction (CP). While it originally stems from the world of machine learning, it has never before been applied or analyzed in the context of short-term electricity price forecasting. We therefore elaborate on the aspects that make Conformal Prediction worth knowing and explain why its simple yet very efficient idea has worked in other fields of application, and why its characteristics are promising for short-term power applications as well. We compare its performance with different state-of-the-art electricity price forecasting models, such as quantile regression averaging (QRA), in an empirical out-of-sample study for three short-term electricity time series. We combine Conformal Prediction with various underlying point forecast models to demonstrate its versatility and behavior under changing conditions. Our findings suggest that Conformal Prediction yields sharp and reliable prediction intervals in short-term power markets. We further inspect the effect each of Conformal Prediction's model components has and provide a path-based guideline on how to find the best CP model for each market.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.07886&r=all
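    The core CP idea is simple enough to sketch. The split (inductive) variant below wraps an arbitrary point forecaster: residuals on a held-out calibration set are used to pick an interval half-width with the desired coverage. The linear model and simulated data are placeholders, not the authors' forecasting models.

      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(7)
      X = rng.normal(size=(500, 3))
      y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.5, size=500)

      # Split: fit on one half, calibrate nonconformity scores on the other
      model = LinearRegression().fit(X[:250], y[:250])
      scores = np.abs(y[250:] - model.predict(X[250:]))

      alpha = 0.1   # target 90% coverage
      q = np.quantile(scores, np.ceil((len(scores) + 1) * (1 - alpha)) / len(scores))

      x_new = rng.normal(size=(1, 3))
      point = model.predict(x_new)[0]
      print(f"90% prediction interval: [{point - q:.2f}, {point + q:.2f}]")

    The same wrapper works around any point forecaster, which is the versatility the abstract emphasizes.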
  6. By: Reaz Chowdhury; M. Arifur Rahman; M. Sohel Rahman; M. R. C. Mahdy
    Abstract: At present, cryptocurrencies have become a global phenomenon in financial sectors, as they are among the most traded financial instruments worldwide. Cryptocurrency is not only one of the most complicated and abstruse fields among financial instruments, but it is also deemed a perplexing problem in finance due to its high volatility. This paper makes an attempt to apply machine learning techniques to the index and constituents of cryptocurrency with the goal of predicting and forecasting their prices. In particular, the purpose of this paper is to predict and forecast the close (closing) price of the cryptocurrency index 30 and of nine constituent cryptocurrencies using machine learning algorithms and models, so that it becomes easier for people to trade these currencies. We have used several machine learning techniques and algorithms and compared the models with each other to get the best output. We believe that our work will help reduce the challenges and difficulties faced by people who invest in cryptocurrencies. Moreover, the obtained results can play a major role in cryptocurrency portfolio management and in observing the fluctuations in the prices of constituents of the cryptocurrency market. We have also compared our approach with similar state-of-the-art works from the literature, where machine learning approaches are considered for predicting and forecasting the prices of these currencies. We have found that our best approach presents better and competitive results than the best works from the literature, thereby advancing the state of the art. Using such prediction and forecasting methods, people can easily understand the trend, and it becomes easier for them to trade in a difficult and challenging financial instrument like cryptocurrency.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.08444&r=all
  7. By: de Kok, Ties (Tilburg University, School of Economics and Management)
    Abstract: The three essays collected in this PhD thesis concern internal and external reporting practices, narrative disclosures, recent advancements in reporting technologies, and the role of reporting in emerging markets. These essays utilize state-of-the-art empirical techniques drawn from computer science along with new data sources to study fundamental accounting questions. The first essay studies the relationship between reporting frequency and market pressure over social media in crowdfunding markets. The second essay studies the use of soft information in the context of internal bank lending decisions, in particular during a scenario of mandated changes to the location of decision rights. The third essay studies the information retrieval process for narrative disclosures for users who vary in their financial literacy, by combining innovative tracking techniques deployed on Amazon Mechanical Turk with state-of-the-art machine learning techniques.
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:tiu:tiutis:468fd12b-19c0-4c7b-a33a-6813c55ce950&r=all
  8. By: Wu, Guoyuan; Ye, Fei; Hao, Peng; Esaid, Danial; Boriboonsomsin, Kanok; Barth, Matthew J.
    Abstract: Eco-driving strategies based on connected and automated vehicles (CAV) technology, such as Eco-Approach and Departure (EAD), have attracted significant worldwide interest due to their potential to save energy and reduce tail-pipe emissions. In this project, the research team developed and tested a deep learning–based trajectory-planning algorithm (DLTPA) for EAD. The DLTPA has two processes: offline (training) and online (implementation), and it is composed of two major modules: 1) a solution feasibility checker that identifies whether there is a feasible trajectory subject to all the system constraints, e.g., maximum acceleration or deceleration; and 2) a regressor to predict the speed of the next time-step. Preliminary simulation with the microscopic traffic modeling software PTV VISSIM showed that the proposed DLTPA can achieve the optimal solution in terms of energy savings and a greater balance of energy savings vs. computational efforts when compared to the baseline scenarios where no EAD is implemented and where the optimal solution (in terms of energy savings) is provided by a graph-based trajectory planning algorithm.
    Keywords: Engineering, Eco-driving, deep-learning, energy and emissions, VISSIM
    Date: 2019–05–01
    URL: http://d.repec.org/n?u=RePEc:cdl:itsdav:qt9fz140zt&r=all
  9. By: Sudiksha Joshi
    Abstract: In this research paper, I have performed time series analysis and forecasted the monthly value of housing starts for the year 2019 using several econometric methods (ARIMA(X), VARX, (G)ARCH) and machine learning algorithms (artificial neural networks, ridge regression, k-nearest neighbours, and support vector regression), and created an ensemble model. The ensemble model stacks the predictions from the various individual models and gives a weighted average of all predictions. The analyses suggest that the ensemble model performs best among all the models, as its prediction errors are the lowest, while the econometric models have higher error rates.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.07848&r=all
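    The stacking step lends itself to a small worked example. The sketch below combines three stand-in point forecasts with inverse-error weights; the forecast values, the RMSE figures, and the inverse-RMSE weighting scheme are illustrative assumptions, as the abstract does not spell out the weights.

      import numpy as np

      # Out-of-sample forecasts of housing starts from three stand-in models
      forecasts = np.array([1250.0, 1190.0, 1310.0])   # e.g. ARIMA, VARX, ANN
      rmse = np.array([80.0, 60.0, 120.0])             # each model's past RMSE

      weights = (1 / rmse) / (1 / rmse).sum()          # lower error -> higher weight
      ensemble = weights @ forecasts
      print(f"weights: {np.round(weights, 3)}, ensemble forecast: {ensemble:.0f}")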
  10. By: Patel, Abhishek; Anand, Rajesh
    Abstract: This paper investigates the optimal operation of security constraint unit commitment (SCUC), one of the most important concerns in power system operation. SCUC is a mixed integer nonlinear problem (MINLP) that is hard to solve, and the optimality of the solution is not guaranteed. To overcome this drawback, a new evolutionary method known as the chaotic crow search algorithm is developed. The proposed problem includes some significant constraints such as spinning reserve, generator ramp rates, load balance, and power limits. Finally, the proposed method is examined on a 10-unit distribution network. The results show the effectiveness and merit of the proposed technique.
    Keywords: Control and Optimization, Evolutionary Algorithm, Power systems, Reliability, Unit Commitment
    JEL: C0 C3 C8 Z0
    Date: 2019–05–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:93971&r=all
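    For readers unfamiliar with the metaheuristic, the sketch below shows one plausible reading of a crow search step in which a logistic chaotic map replaces the usual uniform random draws. The objective function, bounds, and parameter values are placeholders, and the paper's actual SCUC encoding (binary commitment decisions plus dispatch) is not reproduced.

      import numpy as np

      def objective(x):                 # stand-in cost function
          return np.sum(x ** 2, axis=1)

      rng = np.random.default_rng(1)
      n, dim, fl, ap = 20, 5, 2.0, 0.1  # crows, dimension, flight length, awareness prob.
      pos = rng.uniform(-5, 5, (n, dim))
      mem = pos.copy()                  # each crow's best-known position
      chaos = 0.7                       # logistic-map state

      for _ in range(100):
          for i in range(n):
              chaos = 4.0 * chaos * (1.0 - chaos)   # logistic map x <- 4x(1-x)
              j = rng.integers(n)
              if chaos >= ap:           # follow crow j's remembered position
                  new = pos[i] + fl * chaos * (mem[j] - pos[i])
              else:                     # crow j is aware: move randomly instead
                  new = rng.uniform(-5, 5, dim)
              if objective(new[None])[0] < objective(mem[i][None])[0]:
                  mem[i] = new          # update memory on improvement
              pos[i] = new

      print("best cost:", objective(mem).min())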
  11. By: Samuel Asante Gyamerah; Philip Ngare; Dennis Ikpe
    Abstract: The effects of weather on agriculture have become a major concern across the globe in recent years; hence the need for an effective weather risk management tool (weather derivatives) for agricultural stakeholders. However, most of these stakeholders are unwilling to pay the price of weather derivatives (WD) because of product-design and geographical basis risks in the pricing models of WD. Using a machine learning ensemble technique for crop yield forecasting and feature importance, the major weather variable (average temperature) that affects crop yields is empirically determined. This variable (average temperature) is used as the underlying index for WD to eliminate product-design basis risks. A model with a time-varying speed of mean reversion, a seasonal mean, and a local volatility that depends on the average temperature and time for the contract period is proposed. Based on this model, pricing models for futures, options on futures, and basket futures for cumulative average temperature and growing degree-days are presented. Pricing futures on baskets reduces geographical basis risk, as buyers have the opportunity to select the most appropriate weather stations with their desired weights. With these pricing models, agricultural stakeholders can hedge their crops against the perils of weather.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.07546&r=all
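    The temperature dynamics described (time-varying mean-reversion speed, seasonal mean, local volatility) can be simulated in a few lines. All functional forms and parameter values below are illustrative assumptions rather than the paper's calibration; the final line computes a cumulative average temperature (CAT) style index of the kind used as an underlying.

      import numpy as np

      rng = np.random.default_rng(3)
      days = 365
      t = np.arange(days)

      s = 20 + 8 * np.sin(2 * np.pi * t / 365)         # seasonal mean temperature
      kappa = 0.2 + 0.1 * np.cos(2 * np.pi * t / 365)  # time-varying reversion speed

      T = np.empty(days)
      T[0] = s[0]
      for k in range(1, days):
          sigma = 0.5 + 0.02 * abs(T[k - 1])           # local volatility in T
          drift = kappa[k] * (s[k] - T[k - 1])         # pull toward seasonal mean
          T[k] = T[k - 1] + drift + sigma * rng.normal()

      # CAT index: sum of daily average temperatures over the contract period
      print("CAT index:", round(T.sum(), 1))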
  12. By: Krupitzer, Christian; Drechsel, Guido; Mateja, Deborah; Pollklasener, Alina; Schrage, Florian; Sturm, Timo; Tomasovic, Aleksandar; Becker, Christian
    Abstract: Using rules to capture adaptation knowledge is a common approach for self-adaptive systems. Rule-based reasoning, i.e., using rules to analyze and plan adaptations, has several advantages: (i) it is easy to implement, (ii) it offers fast reasoning, and (iii) it works on resource-constrained systems, as historical knowledge is not required. Hence, the needed computational power is low and it perfectly suits systems in the pervasive IoT domain. However, the codification of rules poses a challenge to the system design. Existing approaches often require a specific syntax or programming language. Additionally, some approaches force the developer to customize the reasoning mechanism and, hence, to reimplement parts of the reasoning. To address these shortcomings, we propose a reusable approach for rule-based reasoning in this paper. Rules can be defined in a spreadsheet without the need to learn a syntax or to implement a single line of code. We evaluate the benefits of our approach in two case studies conducted by Master's students, as well as in a quantitative evaluation.
    Date: 2018–03
    URL: http://d.repec.org/n?u=RePEc:dar:wpaper:113225&r=all
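    To make the idea concrete, here is a minimal sketch of rule-based adaptation driven by a spreadsheet export; the CSV schema (attribute, operator, threshold, action) and the rules themselves are invented for illustration and are not the authors' format.

      import csv, io

      # Hypothetical spreadsheet export: condition on a monitored value -> adaptation
      rules_csv = """attribute,operator,threshold,action
      cpu_load,>,0.9,scale_out
      battery,<,0.2,reduce_quality
      latency_ms,>,200,switch_server
      """

      OPS = {">": lambda a, b: a > b, "<": lambda a, b: a < b}

      def plan(context, rules):
          """Return the actions of all rules whose condition holds in `context`."""
          return [r["action"] for r in rules
                  if OPS[r["operator"]](context[r["attribute"]], float(r["threshold"]))]

      rules = list(csv.DictReader(io.StringIO(rules_csv.replace("      ", ""))))
      print(plan({"cpu_load": 0.95, "battery": 0.5, "latency_ms": 150}, rules))
      # -> ['scale_out']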
  13. By: Arthur Turrell; Bradley J. Speigner; Jyldyz Djumalieva; David Copple; James Thurgood
    Abstract: Using a dataset of 15 million UK job adverts from a recruitment website, we construct new economic statistics measuring labour market demand. These data are ‘naturally occurring’, having originally been posted online by firms. They offer information on two dimensions of vacancies—region and occupation—that firm-based surveys do not usually, and cannot easily, collect. These data do not come with official classification labels, so we develop an algorithm which maps the free-form text of job descriptions into standard occupational classification codes. The created vacancy statistics give a plausible, granular picture of UK labour demand and permit the analysis of Beveridge curves and mismatch unemployment at the occupational level.
    JEL: E24 J63
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:25837&r=all
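    The mapping from free-form ad text to occupation codes is, at heart, a supervised text-classification problem. The sketch below uses a generic TF-IDF plus logistic regression pipeline with invented training examples and hypothetical SOC codes; the authors' actual algorithm may differ.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      ads = ["registered nurse for busy ward",
             "software developer python backend",
             "care assistant nursing home nights",
             "java developer financial services"]
      soc = ["2231", "2136", "6145", "2136"]   # hypothetical SOC codes

      clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
      clf.fit(ads, soc)
      print(clf.predict(["night shift nurse, elderly care"]))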
  14. By: Seth G. Benzell (Boston University and MIT Initiative on the Digital Economy); Laurence J. Kotlikoff (Boston University, The Gaidar Institute for Economic Policy, and NBER); Guillermo Lagarda (Boston University and Inter-American Development Bank); Yifan Ye (Boston University)
    Abstract: This paper uses the Global Gaidar Model to simulate replacing a territorial corporate income tax with a wealth tax imposed in the form of a destination-based Business Cash Flow Tax. According to the model, the reform produces, over a decade, increases in the capital stock (20.5 percent), GDP (6.8 percent), and pre-tax wages of high- and low-skilled workers (6.3 and 7.5 percent, respectively). Young workers benefit greatly from the change, and the welfare loss for retirees is limited. The initially revenue-neutral tax reform raises enough additional revenue over time to permit a reduction in personal income tax rates.
    Keywords: Corporate Tax Reform, House Tax Plan, Economic Growth, Business Cash Flow Tax, Computable General Equilibrium, wealth taxation
    JEL: F43 H20 H60
    Date: 2018–11
    URL: http://d.repec.org/n?u=RePEc:bos:iedwpr:dp-312&r=all
  15. By: Ludovic Goudenège; Andrea Molent; Antonino Zanette
    Abstract: In this paper we modify the Gaussian Process Regression Monte Carlo (GPR-MC) method introduced by Goudenège et al., proposing two efficient techniques which allow one to compute the price of American basket options. In particular, we consider baskets of assets that follow a Black-Scholes dynamics. The proposed techniques, called GPR Tree (GPR-Tree) and GPR Exact Integration (GPR-EI), are both based on machine learning, exploited together with binomial trees or with a closed formula for integration. Moreover, these two methods solve the backward dynamic programming problem considering a Bermudan approximation of the American option. On the exercise dates, the value of the option is first computed as the maximum between the exercise value and the continuation value and then approximated by means of Gaussian Process Regression. Both methods derive from the GPR-MC method, and they mainly differ in how the continuation value is approximated: by a single step of a binomial tree or by integration according to the probability density of the process. Numerical results show that these two methods are accurate and reliable and improve on the GPR-MC method in handling American options on very large baskets of assets.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.09474&r=all
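    The backward dynamic programming loop the abstract describes can be caricatured in one dimension. The sketch below regresses the discounted option value on the asset price with a Gaussian process at each exercise date; it is a crude value-function regression on toy parameters, not the paper's GPR-Tree or GPR-EI continuation-value steps, and it ignores exercise at time zero.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor

      rng = np.random.default_rng(0)
      S0, K, r, sig, T, steps, paths = 100.0, 100.0, 0.03, 0.2, 1.0, 10, 2000
      dt = T / steps

      # Black-Scholes paths at the Bermudan exercise dates
      Z = rng.normal(size=(paths, steps))
      S = S0 * np.exp(np.cumsum((r - 0.5 * sig**2) * dt + sig * np.sqrt(dt) * Z, axis=1))

      payoff = lambda s: np.maximum(K - s, 0.0)        # Bermudan put
      V = payoff(S[:, -1])                             # value at maturity
      for k in range(steps - 2, -1, -1):
          # GPR of the discounted next-date value on today's price (200 training paths)
          gpr = GaussianProcessRegressor(normalize_y=True)
          gpr.fit(S[:200, k, None], np.exp(-r * dt) * V[:200])
          cont = gpr.predict(S[:, k, None])            # continuation value
          V = np.maximum(payoff(S[:, k]), cont)        # exercise vs. continue

      print("rough price estimate:", round(np.exp(-r * dt) * V.mean(), 2))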
  16. By: Shangeth Rajaa; Jajati Keshari Sahoo
    Abstract: Stock prediction is a topic that has been under intense study for many years. Finance experts and mathematicians have been working on ways to predict the future stock price so as to decide whether to buy a stock or sell it to make a profit. Stock experts and economists usually analyze previous stock values, using technical indicators, sentiment analysis, etc., to predict the future stock price. In recent years, researchers have extensively used machine learning to predict stock behaviour. In this paper we propose a data-driven deep learning approach that predicts the future stock value from previous prices, combining the feature extraction property of a convolutional neural network with Neural Arithmetic Logic Units.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.07581&r=all
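    A Neural Arithmetic Logic Unit (NALU, Trask et al. 2018) gates between an additive and a multiplicative path, which is what lets networks extrapolate arithmetic relationships. Below is a minimal numpy forward pass with random, untrained weights; the layer sizes are arbitrary, and the paper's full CNN-plus-NALU architecture is not reproduced.

      import numpy as np

      rng = np.random.default_rng(0)

      def nalu(x, W_hat, M_hat, G, eps=1e-7):
          W = np.tanh(W_hat) * (1 / (1 + np.exp(-M_hat)))   # constrained weights
          a = x @ W                                         # additive path
          m = np.exp(np.log(np.abs(x) + eps) @ W)           # multiplicative path
          g = 1 / (1 + np.exp(-(x @ G)))                    # learned gate
          return g * a + (1 - g) * m

      x = rng.normal(size=(4, 3))                           # batch of 4, 3 features
      W_hat, M_hat, G = (rng.normal(size=(3, 2)) for _ in range(3))
      print(nalu(x, W_hat, M_hat, G).shape)                 # (4, 2)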
  17. By: Lisa R. Goldberg; Saad Mouti
    Abstract: We use supervised learning to identify factors that predict the cross-section of maximum drawdown for stocks in the US equity market. Our data run from January 1980 to June 2018 and our analysis includes ordinary least squares, penalized linear regressions, tree-based models, and neural networks. We find that the most important predictors tended to be consistent across models, and that non-linear models had better predictive power than linear models. Predictive power was higher in calm periods than stressed periods, and environmental, social, and governance indicators augmented predictive power for non-linear models.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.05237&r=all
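    The target variable of the study, maximum drawdown, is the largest peak-to-trough loss of a price series. A standard running-peak computation on synthetic prices:

      import numpy as np

      rng = np.random.default_rng(5)
      prices = 100 * np.exp(np.cumsum(rng.normal(0, 0.01, 252)))  # one year of prices

      running_peak = np.maximum.accumulate(prices)
      drawdowns = 1.0 - prices / running_peak
      print(f"maximum drawdown: {drawdowns.max():.1%}")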
  18. By: Heinisch, Dominik; Koenig, Johannes; Otto, Anne (Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany])
    Abstract: "Only scarce information is available on doctorate recipients' career outcomes in Germany (BuWiN 2013). With the current information base, graduate students cannot make an informed decision whether to start a doctorate (Benderly 2018, Blank 2017). Administrative labour market data could provide the necessary information, is however incomplete in this respect. In this paper, we describe the record linkage of two datasets to close this information gap: data on doctorate recipients collected in the catalogue of the German National Library (DNB), and the German labour market biographies (IEB) from the German Institute of Employment Research. We use a machine learning based methodology, which 1) improves the record linkage of datasets without unique identifiers, and 2) evaluates the quality of the record linkage. The machine learning algorithms are trained on a synthetic training and evaluation dataset. In an exemplary analysis we compare the employment status of female and male doctorate recipients in Germany." (Author's abstract, IAB-Doku) ((en))
    JEL: C81 E24 I20
    Date: 2019–05–21
    URL: http://d.repec.org/n?u=RePEc:iab:iabdpa:201913&r=all
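    Record linkage without unique identifiers typically means scoring candidate pairs on similarity features and letting a classifier decide match vs. non-match. The sketch below is a generic version of that idea with synthetic pairs and simple name/year features; the authors' actual features and training data are richer.

      from difflib import SequenceMatcher
      from sklearn.linear_model import LogisticRegression

      def features(a, b):
          """Similarity of name and birth year for a candidate record pair."""
          name_sim = SequenceMatcher(None, a["name"], b["name"]).ratio()
          year_match = float(a["year"] == b["year"])
          return [name_sim, year_match]

      # Synthetic training pairs (1 = same person, 0 = different person)
      pairs = [({"name": "mueller a", "year": 1980}, {"name": "müller a", "year": 1980}, 1),
               ({"name": "schmidt b", "year": 1975}, {"name": "schmitt b", "year": 1975}, 1),
               ({"name": "schmidt b", "year": 1975}, {"name": "weber c", "year": 1990}, 0),
               ({"name": "koenig j", "year": 1982}, {"name": "otto a", "year": 1969}, 0)]

      X = [features(a, b) for a, b, _ in pairs]
      y = [label for _, _, label in pairs]
      clf = LogisticRegression().fit(X, y)
      print(clf.predict_proba([features({"name": "mueller a", "year": 1980},
                                        {"name": "muller a", "year": 1980})])[:, 1])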
  19. By: Gogas, Periklis (Democritus University of Thrace, Department of Economics); Papadimitriou, Theophilos (Democritus University of Thrace, Department of Economics); Plakandaras, Vasilios (Democritus University of Thrace, Department of Economics); Gupta, Rangan (University of Pretoria)
    Abstract: The difficulty of modelling inflation and the significance of discovering the underlying data-generating process of inflation are reflected in an ample literature on inflation forecasting. In this paper we evaluate nonlinear machine learning and econometric methodologies in forecasting U.S. inflation based on autoregressive and structural models of the term structure. We employ two nonlinear methodologies: the econometric Least Absolute Shrinkage and Selection Operator (LASSO) and the machine learning Support Vector Regression (SVR) method. The SVR has never been used before in inflation forecasting with the term-spread as a regressor. In doing so, we use a long monthly dataset spanning the period 1871:1–2015:3 that covers the entire history of inflation in the U.S. economy. For comparison we also use OLS regression models as benchmarks. In order to evaluate the contribution of the term-spread to inflation forecasting in different time periods, we measure the out-of-sample forecasting performance of all models using rolling window regressions. Considering various forecasting horizons, the empirical evidence suggests that the structural models do not outperform the autoregressive ones, regardless of the estimation method. Thus we conclude that the term-spread models are not more accurate than autoregressive ones in inflation forecasting.
    Keywords: U.S. Inflation; forecasting; Support Vector Regression; LASSO
    JEL: C22 C45
    Date: 2019–05–15
    URL: http://d.repec.org/n?u=RePEc:ris:duthrp:2016_003&r=all
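    The rolling-window, one-step-ahead design is easy to sketch. Below, an SVR is refit on a moving 120-observation window of simulated inflation and spread data; the data-generating process, window length, and kernel settings are assumptions for illustration only.

      import numpy as np
      from sklearn.svm import SVR

      rng = np.random.default_rng(9)
      T = 300
      spread = rng.normal(size=T)
      infl = np.zeros(T)
      for t in range(1, T):                  # toy AR(1) with a spread term
          infl[t] = 0.5 * infl[t - 1] + 0.2 * spread[t - 1] + rng.normal(scale=0.3)

      window, errors = 120, []
      for t in range(window, T - 1):
          X = np.column_stack([infl[t - window:t], spread[t - window:t]])
          y = infl[t - window + 1:t + 1]     # one-step-ahead target
          model = SVR(kernel="rbf").fit(X, y)
          pred = model.predict([[infl[t], spread[t]]])[0]
          errors.append(abs(pred - infl[t + 1]))

      print("rolling-window MAE:", round(float(np.mean(errors)), 3))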
  20. By: Berardi, Michele
    Abstract: This paper proposes a novel interpretation of the constant gain learning algorithm through a probabilistic setting with Bayesian updating. Such a framework allows one to understand the gain coefficient in terms of the probability of changes in the estimated quantity.
    Keywords: Bayesian learning, adaptive learning, constant gain.
    JEL: C63 D83 D84 D90
    Date: 2019–05–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:94023&r=all
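    For reference, the constant gain algorithm updates an estimate by a fixed fraction of the latest forecast error, theta_t = theta_{t-1} + gamma * (y_t - theta_{t-1}). The sketch below illustrates why a constant gain suits an environment with occasional structural change, in the spirit of (but not reproducing) the paper's Bayesian derivation; the change probability and noise scales are made up.

      import numpy as np

      rng = np.random.default_rng(2)
      gamma = 0.05                      # constant gain
      theta_hat, true_theta = 0.0, 1.0

      for t in range(500):
          if rng.random() < 0.01:       # occasional structural change
              true_theta = rng.normal(scale=2.0)
          y = true_theta + rng.normal(scale=0.5)
          theta_hat += gamma * (y - theta_hat)   # constant gain update

      print("estimate:", round(theta_hat, 2), "truth:", round(true_theta, 2))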

General information on the NEP project can be found at https://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.