nep-for New Economics Papers
on Forecasting
Issue of 2022‒09‒12
seven papers chosen by
Rob J Hyndman
Monash University

  1. A State-Space Approach for Time-Series Prediction of an Heterogeneous Agent Model By Filippo Gusella; Giorgio Ricchiuti
  2. Livestock Price Forecasting using Long Short-Term Memory Units: the Case of African Swine Fever and the COVID-19 Pandemic By Yoo, Do-il
  3. Forecasting using Fuzzy Time Series By CHELLAI, Fatih
  4. Can a Machine Correct Option Pricing Models? By Caio Almeida; Jianqing Fan; Gustavo Freire; Francesca Tang
  5. Factor Network Autoregressions By Matteo Barigozzi; Giuseppe Cavaliere; Graziano Moramarco
  6. GAM(L)A: An econometric model for interpretable machine learning By Sullivan Hué
  7. Transformer-Based Deep Learning Model for Stock Price Prediction: A Case Study on Bangladesh Stock Market By Tashreef Muhammad; Anika Bintee Aftab; Md. Mainul Ahsan; Maishameem Meherin Muhu; Muhammad Ibrahim; Shahidul Islam Khan; Mohammad Shafiul Alam

  1. By: Filippo Gusella; Giorgio Ricchiuti
    Abstract: In this paper we apply the state-space model approach to evaluate and compare the forecasting performance of a small-scale heterogeneous agent model (HAM) with fundamentalists and contrarians. As in the HAM tradition, agents are heterogeneous in how they form expectations and forecast future prices from the deviations of past prices from the fundamental value. We consider two specifications of the asset's fundamental value, formalized either as a random walk (RW) or with the Gordon model (GM). We examine the models' performance at short and long forecast horizons and at different data frequencies (monthly and quarterly). Overall, the GM specification significantly outperforms the RW specification at the long horizon, while the two specifications are statistically indistinguishable at the short horizon.
    Keywords: Heterogeneous expectations, forecasting, RW, Gordon model, state-space model
    JEL: C13 C50 G10 G12 G15 E32
    Date: 2022
    URL: http://d.repec.org/n?u=RePEc:frz:wpaper:wp2022_20.rdf&r=
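    The two fundamental-value specifications can be sketched as follows (a minimal Python illustration with made-up parameter values; the paper's agent expectation rules and state-space estimation are not reproduced here):
      import numpy as np

      rng = np.random.default_rng(0)
      T = 200

      # Random-walk (RW) fundamental: F_t = F_{t-1} + eta_t, with eta_t white noise.
      F_rw = 100 + np.cumsum(rng.standard_normal(T))

      # Gordon-model (GM) fundamental: value of a constant-growth dividend stream,
      # F = D_1 / (r - g), with next-period dividend D_1, discount rate r, growth g.
      D1, r, g = 2.0, 0.06, 0.02
      F_gm = D1 / (r - g)

      # Agents then form price expectations from the deviation of the observed
      # price from whichever fundamental is assumed, e.g. deviation = p_t - F_t.
      print(F_rw[-1], F_gm)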
  2. By: Yoo, Do-il
    Keywords: Research Methods/Statistical Methods, Food Consumption/Nutrition/Food Safety, Marketing
    Date: 2022–08
    URL: http://d.repec.org/n?u=RePEc:ags:aaea22:322610&r=
  3. By: CHELLAI, Fatih
    Abstract: This chapter is a very short introduction to Fuzzy Time Series (FTS) models. The aim is to present an overview of the concepts of fuzzy logic, fuzzy set theory, and the fuzzy time series framework; its main focus, however, is applied. The principal FTS models are fitted and used for forecasting in R on real data on road traffic accidents in Algeria. The chapter is organized as follows: the first section presents the concept of fuzzy logic; the second section is devoted to fuzzy time series, defining a fuzzy set and the universe of discourse; the third section summarizes the main fuzzy time series models, namely the Song and Chissom (1993) model, the Chen (1996) model, the heuristic model of Huarng (2001), the Abbasov and Mamedova (2003) model, the Chen and Hsu (2004) model, and the Singh (2008) model. The fourth section applies these models to the number of road accidents in Algeria, using the "AnalyzeTS" R package to demonstrate the estimation and forecasting steps.
    Keywords: Fuzzy logic; Forecasting; Time Series
    JEL: C1 C22 C4 C87
    Date: 2022–07–19
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:113848&r=
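    As an illustration of the mechanics behind one of these models, the following is a minimal Python sketch of the first-order Chen (1996) procedure with equal-width intervals (the chapter itself works in R with the "AnalyzeTS" package; the accident counts below are hypothetical):
      import numpy as np

      def chen_fts_forecast(series, n_intervals=7):
          """First-order Chen (1996) fuzzy time series: one-step-ahead fitted
          forecasts for each observation after the first."""
          series = np.asarray(series, dtype=float)
          lo, hi = series.min() - 1, series.max() + 1        # universe of discourse
          edges = np.linspace(lo, hi, n_intervals + 1)
          mids = (edges[:-1] + edges[1:]) / 2

          # Fuzzify: assign each observation to the interval (fuzzy set) it falls in.
          states = np.clip(np.digitize(series, edges) - 1, 0, n_intervals - 1)

          # Fuzzy logical relationship groups: A_i -> {A_j observed right after A_i}.
          groups = {i: set() for i in range(n_intervals)}
          for i, j in zip(states[:-1], states[1:]):
              groups[i].add(j)

          # Forecast rule: average of the midpoints of the consequent intervals;
          # if a state has no recorded successor, use its own midpoint.
          forecasts = []
          for s in states[:-1]:
              succ = groups[s]
              forecasts.append(mids[list(succ)].mean() if succ else mids[s])
          return np.array(forecasts)

      accidents = [312, 298, 305, 330, 341, 329, 350, 362, 355, 371]  # hypothetical
      print(chen_fts_forecast(accidents))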
  4. By: Caio Almeida (Princeton University); Jianqing Fan (Princeton University); Gustavo Freire (Erasmus School of Economics); Francesca Tang (Princeton University)
    Abstract: We introduce a novel two-step approach to predict implied volatility surfaces. Given any fitted parametric option pricing model, we train a feedforward neural network on the model-implied pricing errors to correct for mispricing and boost performance. Using a large dataset of S&P 500 options, we test our nonparametric correction on several parametric models ranging from ad-hoc Black-Scholes to structural stochastic volatility models and demonstrate the boosted performance for each model. Out-of-sample prediction exercises in the cross-section and in the option panel show that machine-corrected models always outperform their respective original ones, often by a wide margin. Our method is relatively indiscriminate, bringing pricing errors down to a similar magnitude regardless of the misspecification of the original parametric model. Even so, correcting models that are less misspecified usually leads to additional improvements in performance and also outperforms a neural network fitted directly to the implied volatility surface.
    Keywords: Deep Learning, Boosting, Implied Volatility, Stochastic Volatility, Model Correction
    JEL: C45 C58 G13
    Date: 2022–07
    URL: http://d.repec.org/n?u=RePEc:pri:econom:2022-9&r=
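    The two-step correction can be sketched generically: fit a parametric model, then train a network on its errors and add the fitted correction back. This toy Python illustration uses a simulated implied-volatility surface, a quadratic polynomial as a stand-in for the parametric option pricing model, and scikit-learn's MLPRegressor in place of the authors' feedforward network:
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      n = 2000
      X = np.column_stack([rng.uniform(0.8, 1.2, n),    # moneyness
                           rng.uniform(0.05, 1.0, n)])  # time to maturity
      iv_obs = (0.2 + 0.3 * (X[:, 0] - 1) ** 2 + 0.05 * np.exp(-X[:, 1])
                + 0.01 * rng.standard_normal(n))

      # Step 1: fit a simple parametric surface (stand-in for the option pricing
      # model); here, least squares on a quadratic polynomial of the inputs.
      P = np.column_stack([np.ones(n), X, X ** 2, X[:, :1] * X[:, 1:]])
      beta, *_ = np.linalg.lstsq(P, iv_obs, rcond=None)
      iv_param = P @ beta

      # Step 2: train a feedforward network on the parametric model's errors
      # and add the fitted correction back to the parametric prediction.
      resid = iv_obs - iv_param
      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000,
                         random_state=0).fit(X, resid)
      iv_corrected = iv_param + net.predict(X)

      print("RMSE parametric:", np.sqrt(np.mean((iv_obs - iv_param) ** 2)))
      print("RMSE corrected :", np.sqrt(np.mean((iv_obs - iv_corrected) ** 2)))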
  5. By: Matteo Barigozzi; Giuseppe Cavaliere; Graziano Moramarco
    Abstract: We propose a factor network autoregressive (FNAR) model for time series with complex network structures. The coefficients of the model reflect many different types of connections between economic agents ("multilayer network"), which are summarized into a smaller number of network matrices ("network factors") through a novel tensor-based principal component approach. We provide consistency results for the estimation of the factors and the coefficients of the FNAR. Our approach combines two different dimension-reduction techniques and can be applied to ultra-high dimensional datasets. In an empirical application, we use the FNAR to investigate the cross-country interdependence of GDP growth rates based on a variety of international trade and financial linkages. The model provides a rich characterization of macroeconomic network effects and exhibits good forecast performance compared to popular dimension-reduction methods.
    Date: 2022–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2208.02925&r=
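    A loose Python illustration of the idea on simulated data is given below; note that the principal-component step is a crude matrix unfolding rather than the authors' tensor-based approach, and the pooled least-squares fit ignores the paper's estimation theory:
      import numpy as np

      rng = np.random.default_rng(0)
      T, N, L, K = 60, 20, 5, 2
      A = rng.random((L, N, N))          # multilayer network: L adjacency matrices
      y = rng.standard_normal((T, N))    # panel of growth rates (simulated)

      # Step 1: summarize the L layers into K "network factor" matrices by
      # unfolding each layer into a vector and taking leading principal components.
      X = A.reshape(L, N * N)
      X_c = X - X.mean(axis=0)
      _, _, Vt = np.linalg.svd(X_c, full_matrices=False)
      W = [Vt[k].reshape(N, N) for k in range(K)]   # network factors

      # Step 2: first-order factor network autoregression,
      # y_t = sum_k beta_k * W_k y_{t-1} + gamma * y_{t-1} + eps_t (pooled OLS).
      Z = np.column_stack(
          [(y[:-1] @ W[k].T).ravel() for k in range(K)] + [y[:-1].ravel()]
      )
      coef, *_ = np.linalg.lstsq(Z, y[1:].ravel(), rcond=None)
      print("network-factor and own-lag coefficients:", coef)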
  6. By: Sullivan Hué (Aix-Marseille Université, AMSE)
    Abstract: Despite their high predictive performance, random forests and gradient boosting are often considered black boxes or uninterpretable models, which has raised concerns among practitioners and regulators. As an alternative, I propose to use partial linear models that are inherently interpretable. Specifically, this presentation introduces GAM-lasso (GAMLA) and GAM-autometrics (GAMA), jointly denoted GAM(L)A. GAM(L)A combines parametric and non-parametric functions, to accurately capture the linear and nonlinear relationships between the dependent and explanatory variables, with a variable-selection procedure to control for overfitting. Estimation relies on a two-step procedure building on the double residual method. I illustrate the predictive performance and interpretability of GAM(L)A on a regression and a classification problem. The results show that GAM(L)A outperforms parametric models augmented with quadratic, cubic, and interaction effects. Moreover, they suggest that the performance of GAM(L)A is not significantly different from that of random forests and gradient boosting.
    Date: 2022–08–01
    URL: http://d.repec.org/n?u=RePEc:boc:fsug22:19&r=
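    The double residual idea behind the two-step procedure can be sketched as follows (a minimal Python illustration on simulated data, using a k-nearest-neighbour smoother and scikit-learn's LassoCV as stand-ins for the paper's actual nonparametric estimator and selection procedures):
      import numpy as np
      from sklearn.neighbors import KNeighborsRegressor
      from sklearn.linear_model import LassoCV

      rng = np.random.default_rng(1)
      n = 500
      x = rng.standard_normal((n, 5))    # candidate linear regressors
      z = rng.uniform(-2, 2, (n, 1))     # regressor entering nonlinearly
      y = x[:, 0] - 0.5 * x[:, 2] + np.sin(2 * z[:, 0]) + 0.3 * rng.standard_normal(n)

      # Step 1 (double residual): partial the nonparametric component out of y and
      # each column of x with a generic smoother (k-NN here, purely for illustration).
      smoother = KNeighborsRegressor(n_neighbors=25)
      y_res = y - smoother.fit(z, y).predict(z)
      x_res = np.column_stack(
          [x[:, j] - smoother.fit(z, x[:, j]).predict(z) for j in range(x.shape[1])]
      )

      # Step 2: variable selection and estimation of the linear part via the lasso
      # (the GAMA variant would use general-to-specific selection instead).
      lasso = LassoCV(cv=5).fit(x_res, y_res)
      print("selected linear coefficients:", lasso.coef_)

      # The nonlinear component in z is then recovered by smoothing the partial
      # residual y - x @ beta_hat against z.
      g_hat = KNeighborsRegressor(n_neighbors=25).fit(z, y - x @ lasso.coef_)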
  7. By: Tashreef Muhammad; Anika Bintee Aftab; Md. Mainul Ahsan; Maishameem Meherin Muhu; Muhammad Ibrahim; Shahidul Islam Khan; Mohammad Shafiul Alam
    Abstract: In modern capital markets the price of a stock is often considered highly volatile and unpredictable because of various social, financial, political and other dynamic factors. Careful and well-informed investment in the stock market can yield handsome profits from minimal capital, while incorrect predictions can easily bring catastrophic financial losses to investors. This paper applies a recently introduced machine learning architecture, the Transformer model, to predict the future price of stocks on the Dhaka Stock Exchange (DSE), the leading stock exchange in Bangladesh. The Transformer has been widely leveraged for natural language processing and computer vision tasks but, to the best of our knowledge, has never been used for stock price prediction at the DSE. The recent introduction of the time2vec encoding for representing time-series features has made it possible to employ the Transformer for stock price prediction. This paper concentrates on applying a Transformer-based model to predict the price movement of eight specific stocks listed on the DSE, based on their historical daily and weekly data. Our experiments demonstrate promising results and acceptable root mean squared error on most of the stocks.
    Date: 2022–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2208.08300&r=
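    The time2vec encoding mentioned above maps a time index tau to one linear feature plus k periodic features, t2v(tau)[0] = w0*tau + b0 and t2v(tau)[i] = sin(wi*tau + bi) for i = 1..k, with learned weights. A minimal Python sketch, with random weights standing in for the learned ones:
      import numpy as np

      class Time2Vec:
          """Minimal sketch of the time2vec encoding: one linear feature plus
          k sine features per time index. In the Transformer these weights are
          learned jointly with the rest of the network; here they are random."""
          def __init__(self, k, seed=0):
              rng = np.random.default_rng(seed)
              self.w = rng.standard_normal(k + 1)
              self.b = rng.standard_normal(k + 1)

          def __call__(self, tau):
              tau = np.asarray(tau, dtype=float).reshape(-1, 1)
              raw = tau * self.w + self.b          # shape (T, k+1)
              raw[:, 1:] = np.sin(raw[:, 1:])      # periodic components
              return raw

      # Example: encode 30 daily time steps before feeding them, together with
      # the price features, into a Transformer encoder.
      enc = Time2Vec(k=7)
      print(enc(np.arange(30)).shape)   # (30, 8)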

This nep-for issue is ©2022 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.