nep-for New Economics Papers
on Forecasting
Issue of 2019‒09‒23
eleven papers chosen by
Rob J Hyndman
Monash University

  1. A Bayesian VAR Approach to Short-Term Inflation Forecasting By Fethi Ogunc
  2. Selecting a Model for Forecasting By Jennifer Castle; Jurgen Doornik; David Hendry
  3. Assessing forecast gains from ‘deep learning’ over time-series methodologies By Yi Wu; Sotiris Tsolacos
  4. Predicting disaggregated tourist arrivals in Sierra Leone using ARIMA model By Jackson, Emerson Abraham; Tamuke, Edmund
  5. Long-Term Economic Forecasting with Structured Analogies and Interaction Groups By Konstantinos Nikolopoulos; Waleed S. Alghassab; Konstantia Litsiou; Stelios Sapountzis
  6. Comparing the forecasting of cryptocurrencies by Bayesian time-varying volatility models By Rick Bohte; Luca Rossini
  7. High-dimensional macroeconomic forecasting using message passing algorithms By Korobilis, Dimitris
  8. Modelling and forecasting the dollar-pound exchange rate in the presence of structural breaks By Jennifer Castle; Takamitsu Kurita
  9. A Deep Learning Framework for Pricing Financial Instruments By Qiong Wu; Zheng Zhang; Andrea Pizzoferrato; Mihai Cucuringu; Zhenming Liu
  10. A risk-mitigation model driven from the level of forecastability of Black Swans: prepare and respond to major Earthquakes through a dynamic Temporal and Spatial Aggregation forecasting framework By Konstantinos Nikolopoulos; Fotios Petropoulos; Vasco Sanchez Rodrigues; Stephen Pettit; Anthony Beresford
  11. Inattention, Disagreement and Internal (In)Consistency of Inflation Forecasts By Fernando Borraz; Laura Zacheo

  1. By: Fethi Ogunc
    Abstract: In this paper, we discuss the forecasting performance of Bayesian vector autoregression (BVAR) models for inflation under alternative specifications. In particular, we consider modelling in levels or in differences; the choice of tightness; estimating BVARs of different model sizes; and the accuracy of conditional and unconditional forecasts. Our empirical results indicate that BVAR forecasts using variables in log-difference form outperform those using log-levels of the data. When we evaluate forecast performance in terms of model size, the lowest forecast errors belong to models with a relatively small number of variables, though we find only small differences in forecast accuracy among models of various sizes up to two quarters ahead. Finally, conditioning seems to help forecast inflation. Overall, the pseudo-evaluation findings suggest that small to medium-sized BVAR models with carefully selected variables in difference form, conditioned on the future paths of some variables, are a good choice for forecasting inflation in Turkey.
    Keywords: Inflation, Forecasting, Bayesian vector autoregression, Turkey
    JEL: C51 C52 E37
    Date: 2019
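The log-difference transformation behind the paper's main finding can be illustrated with a deliberately minimal sketch. A single-equation AR(1) fitted by OLS stands in for the BVAR (the Bayesian priors, tightness choice and conditioning are all omitted), and the toy series is hypothetical:

```python
import math

def ar1_ols(x):
    """Fit x_t = c + phi * x_{t-1} + e_t by ordinary least squares."""
    y, z = x[1:], x[:-1]
    n = len(y)
    my, mz = sum(y) / n, sum(z) / n
    phi = sum((a - mz) * (b - my) for a, b in zip(z, y)) / sum((a - mz) ** 2 for a in z)
    return my - phi * mz, phi

def forecast_levels_via_log_diffs(levels, h):
    """Forecast h steps ahead by modelling log-differences (growth rates)
    with an AR(1), then cumulating the predictions back onto the last level."""
    logs = [math.log(v) for v in levels]
    d = [b - a for a, b in zip(logs, logs[1:])]   # log-differences
    c, phi = ar1_ols(d)
    path, last_d, last_log = [], d[-1], logs[-1]
    for _ in range(h):
        last_d = c + phi * last_d                 # iterate the AR(1) forward
        last_log += last_d                        # integrate back to log-levels
        path.append(math.exp(last_log))
    return path
```

Modelling growth rates rather than levels is exactly the differencing choice the paper evaluates; the final `exp` step maps the forecasts back to the original scale.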
  2. By: Jennifer Castle; Jurgen Doornik; David Hendry
    Abstract: We investigate the role of the significance level when selecting models for forecasting, as it controls both the null retention frequency and the probability of retaining relevant variables when using binary decisions to retain or drop variables. Analysis identifies the best selection significance level in a bivariate model when there are location shifts at or near the forecast origin. The trade-off for selecting variables in forecasting models in a stationary world, namely that variables should be retained if their non-centralities exceed 1, applies in the wide-sense non-stationary settings with structural breaks examined here. The results confirm the optimality of the Akaike Information Criterion for forecasting in completely different settings than initially derived. An empirical illustration forecasting UK inflation demonstrates the applicability of the analytics. Simulation then explores the choice of selection significance level for 1-step ahead forecasts in larger models when there are unknown location shifts present under a range of alternative scenarios, using the multipath tree search algorithm, Autometrics (Doornik, 2009), varying the target significance level for the selection of regressors. The costs of model selection are shown to be small. The results provide support for model selection at looser than conventional settings, albeit with many additional features explaining the forecast performance, with the caveat that retaining irrelevant variables that are subject to location shifts can worsen forecast performance.
    Keywords: Model selection; forecasting; location shifts; significance level; Autometrics
    Date: 2018–11–09
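The binary retain-or-drop decision the abstract describes can be sketched as a t-test against a critical value implied by the chosen significance level. This toy version tests each candidate regressor one at a time with a no-intercept OLS, which is far simpler than Autometrics' multipath tree search; the variable names and data are assumptions:

```python
import math

def t_stat(x, y):
    """t-statistic of the slope in a no-intercept OLS regression of y on x."""
    sxx = sum(v * v for v in x)
    beta = sum(a * b for a, b in zip(x, y)) / sxx
    resid = [b - beta * a for a, b in zip(x, y)]
    s2 = sum(r * r for r in resid) / (len(y) - 1)   # residual variance
    return beta / math.sqrt(s2 / sxx)

def select(candidates, y, crit):
    """Binary decision: retain each candidate regressor whose |t| exceeds
    the critical value implied by the selection significance level."""
    return [name for name, x in candidates.items() if abs(t_stat(x, y)) > crit]
```

A looser significance level corresponds to a smaller `crit`, raising both the chance of keeping relevant variables and the null retention frequency, which is precisely the trade-off the paper analyses.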
  3. By: Yi Wu; Sotiris Tsolacos
    Abstract: There is a plethora of standard time-series techniques for forecasting, including ARIMA, ARIMAX, spectral analysis and decomposition. A requirement for the application of these techniques is some degree of correlation in the series (e.g. the AR terms) and past effects from innovations. These properties imply that each observation is partially predictable from previous observations, from previous random shocks, or from both. An underlying assumption is that the correlations inherent in the data have been adequately modelled: after a model has been built, any leftover variation (the residuals) is considered i.i.d., independent and normally distributed with mean zero and constant variance over time, so the residuals carry no further information that can be used in the model. Implicit in these techniques is the notion that existing patterns in the time series will continue into the future. These standard techniques work well for short-term prediction but are less effective at capturing the characteristics of the data over longer periods. ARIMA, for instance, gives more weight to the immediate data points in the test set and tries to perform well for them, but at longer horizons the variance of the predicted output grows. Because of the dynamic nature of time-series data, these assumptions are often not met when there is non-linear autocorrelation in the series. Non-linearities in the data can be addressed efficiently with deep learning techniques. Time-series data often exhibit sequence dependence, which deep learning techniques such as RNNs can handle because they are adaptive in nature. Other variants of deep learning, such as LSTM (Long Short-Term Memory) and GRU (Gated Recurrent Units), can be trained on long histories to pick up the true dynamics of the series and achieve better modelling and forecast results.
Investors and real estate analysts are increasingly coming across deep learning methods for market analysis and portfolio construction. We investigate potential forecast gains arising from the adoption of these models over conventional time-series models. We make use of the monthly data series at the city level in Europe produced by RCA. We are interested in both directional and point forecasts. The forecast evaluation takes place over different time horizons and with the application of conventional forecast assessment metrics, including the Diebold-Mariano test.
    Keywords: Artificial Intelligence; deep learning; forecast gains
    JEL: R3
    Date: 2019–01–01
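The Diebold-Mariano comparison mentioned in the abstract can be sketched in a few lines. The quadratic loss and the toy error series below are assumptions, and the simple variance estimator omits the autocorrelation correction needed for multi-step-ahead forecasts:

```python
import math

def diebold_mariano(e1, e2, loss=lambda e: e * e):
    """Diebold-Mariano statistic for equal predictive accuracy of two
    forecast-error series. Positive values mean e1 is the worse forecast.
    (1-step case: no autocorrelation correction of the variance.)"""
    d = [loss(a) - loss(b) for a, b in zip(e1, e2)]  # loss differential
    n = len(d)
    dbar = sum(d) / n
    var = sum((x - dbar) ** 2 for x in d) / n
    return dbar / math.sqrt(var / n)  # approximately N(0,1) under the null
```

With errors of a deep learning model in `e1` and of an ARIMA benchmark in `e2`, a statistic well inside (-2, 2) would indicate no significant forecast gain.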
  4. By: Jackson, Emerson Abraham; Tamuke, Edmund
    Abstract: This study makes use of Box-Jenkins ARIMA models to address the three objectives set out with the aim of adding meaningful value to knowledge exploration. The research demonstrates this through successful nine-month out-of-sample forecasts produced from the program code, indicating the best model choices from the empirical estimation. In addition, the results describe the risks conveyed by the uncertainty fan chart, which clearly outlines possible downside and upside risks to tourist visits in the country. The study concludes that downside risks to the low level of tourist arrivals can be managed through collaboration between the authorities concerned with the management of tourist arrivals in the country.
    Keywords: ARIMA Methodology, Out-of-Sample Forecast, Tourist Arrivals, Sierra Leone
    JEL: C32 C52 C53 L83
    Date: 2019
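The widening uncertainty bands of a fan chart like the one the abstract describes come from forecast-error variance growing with the horizon. A minimal sketch, assuming a random-walk-with-drift model (far simpler than the paper's ARIMA specification) and a hypothetical arrivals series:

```python
import math

def fan_chart_bands(levels, h, z=1.645):
    """Central projection plus widening 90% bands (a simple 'fan chart')
    around a random-walk-with-drift forecast: the forecast variance grows
    linearly in the horizon, so the fan opens as steps increase."""
    diffs = [b - a for a, b in zip(levels, levels[1:])]
    n = len(diffs)
    drift = sum(diffs) / n
    sigma2 = sum((d - drift) ** 2 for d in diffs) / (n - 1)
    bands, point = [], levels[-1]
    for step in range(1, h + 1):
        point += drift                        # central projection
        half = z * math.sqrt(step * sigma2)   # band half-width widens with step
        bands.append((point - half, point, point + half))
    return bands
```

The lower edge of the fan is the quantified "downside risk" to arrivals; the upper edge the upside.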
  5. By: Konstantinos Nikolopoulos (Bangor University); Waleed S. Alghassab (University of Hail, Saudi Arabia); Konstantia Litsiou (Manchester Metropolitan University); Stelios Sapountzis (Salford Business School & Manchester Metropolitan University)
    Abstract: This paper explores the potential of long-term economic forecasting with judgmental methods: semi-Structured Analogies (SA) and Interaction Groups (IG). The case study is Saudi Arabia and its aim to adopt a diversification strategy to reduce its dependency on the oil sector, where oil revenue currently makes up 90% of the budget. The study has four phases: Unaided Judgment, Structured Analogies, and Interaction Groups with Structured Analogies – all three using disguised data – before finally working on the undisguised case study under review over a significant amount of time. Judgmental methods were adopted for three main reasons: to derive long-term economic forecasts about Saudi Arabia's ability to diversify its investments, to discover the impact of different factors on financial and economic outlooks, and to explore the main reasons for deviations in the accuracy of financial and economic forecasts.
    Keywords: Foresight, Economic Forecasting, Structured Analogies, Interaction Groups, Gross Domestic Product (GDP)
    Date: 2019–08
  6. By: Rick Bohte; Luca Rossini
    Abstract: This paper studies the forecasting of cryptocurrency time series, focusing on the four most highly capitalized cryptocurrencies: Bitcoin, Ethereum, Litecoin and Ripple. Different Bayesian models are compared, including models with constant and time-varying volatility, such as stochastic volatility and GARCH. Moreover, some crypto-predictors are included in the analysis, such as the S&P 500 and the Nikkei 225. The results show that stochastic volatility significantly outperforms the VAR benchmark in both point and density forecasting. Regarding the error distribution of the stochastic volatility model, the Student-t specification turns out to outperform the standard normal approach.
    Date: 2019–09
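The intuition behind the Student-t finding, that heavier predictive tails score better when outliers occur, can be illustrated with average log predictive scores under fixed location and scale. The parameter values and toy returns below are assumptions, not the paper's estimates:

```python
import math

def norm_logpdf(x, mu, sigma):
    """Log density of a Normal(mu, sigma^2)."""
    z = (x - mu) / sigma
    return -0.5 * z * z - math.log(sigma) - 0.5 * math.log(2 * math.pi)

def t_logpdf(x, mu, sigma, nu):
    """Log density of a location-scale Student-t with nu degrees of freedom."""
    z = (x - mu) / sigma
    return (math.lgamma((nu + 1) / 2) - math.lgamma(nu / 2)
            - 0.5 * math.log(nu * math.pi) - math.log(sigma)
            - (nu + 1) / 2 * math.log(1 + z * z / nu))

def avg_log_score(obs, mu, sigma, logpdf):
    """Average log predictive score: higher means better density forecasts."""
    return sum(logpdf(x, mu, sigma) for x in obs) / len(obs)
```

On a series with one crypto-style spike, the Student-t density pays a far smaller penalty for the outlier than the normal, so its average log score is higher, mirroring the density-forecast ranking reported in the paper.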
  7. By: Korobilis, Dimitris
    Abstract: This paper proposes two distinct contributions to econometric analysis of large information sets and structural instabilities. First, it treats a regression model with time-varying coefficients, stochastic volatility and exogenous predictors as an equivalent high-dimensional static regression problem with thousands of covariates. Inference in this specification proceeds using Bayesian hierarchical priors that shrink the high-dimensional vector of coefficients either towards zero or time-invariance. Second, it introduces the frameworks of factor graphs and message passing as a means of designing efficient Bayesian estimation algorithms. In particular, a Generalized Approximate Message Passing (GAMP) algorithm is derived that has low algorithmic complexity and is trivially parallelizable. The result is a comprehensive methodology that can be used to estimate time-varying parameter regressions with an arbitrarily large number of exogenous predictors. In a forecasting exercise for U.S. price inflation this methodology is shown to work very well.
    Keywords: high-dimensional inference; factor graph; Belief Propagation; Bayesian shrinkage; time-varying parameter model
    JEL: C01 C11 C13 C52 C53 C61 E31 E37
    Date: 2019–09–15
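The message-passing machinery itself is beyond a digest sketch, but the shrinkage-towards-zero prior it operates on can be shown for a single coefficient: a Gaussian prior pulls the OLS estimate towards zero, more strongly the tighter the prior. The function name and toy data are assumptions:

```python
def posterior_mean_coef(x, y, sigma2, tau2):
    """Posterior mean of beta in y = x*beta + e with e ~ N(0, sigma2),
    under a N(0, tau2) shrinkage prior on beta. As tau2 -> 0 the estimate
    is shrunk to zero; as tau2 -> infinity it approaches plain OLS."""
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    return (sxy / sigma2) / (sxx / sigma2 + 1.0 / tau2)
```

In the paper's high-dimensional setting, a hierarchical version of this prior is applied to thousands of coefficients at once, and GAMP makes the resulting posterior computations cheap and parallelizable.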
  8. By: Jennifer Castle; Takamitsu Kurita
    Abstract: We employ a newly-developed partial cointegration system allowing for level shifts to examine whether economic fundamentals form the long-run determinants of the dollar-pound exchange rate in an era of structural change. The paper uncovers a class of local data generation mechanisms underlying long-run and short-run dynamic features of the exchange rate using a set of economic variables that explicitly reflect the central banks’ monetary policy stances and the influence of a forward exchange market. The impact of the Brexit referendum is evaluated by examining forecasts when the dollar-pound exchange rate fell substantially around the vote.
    Keywords: Exchange rates, Monetary policy, General-to-specific approach, Partial cointegrated vector autoregressive models, Structural breaks.
    JEL: C22 C32 C52 F31
    Date: 2019–01–07
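The cointegration logic the paper builds on can be sketched with a one-equation error-correction forecast: the exchange rate adjusts each period towards a long-run relation with a fundamental. This is a deliberate simplification of the paper's partial cointegrated VAR with level shifts, and the coefficients and data are assumptions:

```python
def ecm_forecast(y, x, beta, alpha, h):
    """Error-correction forecasts: each period, y moves a fraction alpha
    of the way towards its long-run equilibrium level beta * x.
    (x is held fixed at its last observed value over the horizon.)"""
    yf, xf = y[-1], x[-1]
    path = []
    for _ in range(h):
        yf = yf + alpha * (beta * xf - yf)   # correct towards equilibrium
        path.append(yf)
    return path
```

A level shift of the kind the paper models (for example around the Brexit referendum) would appear here as a break in `beta`, which is why forecasts built on the pre-break relation deteriorate.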
  9. By: Qiong Wu; Zheng Zhang; Andrea Pizzoferrato; Mihai Cucuringu; Zhenming Liu
    Abstract: We propose an integrated deep learning architecture for stock movement prediction. Our architecture simultaneously leverages all available alpha sources: technical signals, financial news signals, and cross-sectional signals. It possesses three main properties. First, it avoids overfitting: although it consumes a large number of technical signals, it has better generalization properties than linear models. Second, it effectively captures the interactions between signals from different categories. Third, it has low computational cost: we design a graph-based component that extracts cross-sectional interactions, circumventing the SVD required by standard models. Experimental results on real-world stock market data show that our approach outperforms the existing baselines. Meanwhile, the results from different trading simulators demonstrate that we can effectively monetize the signals.
    Date: 2019–09
  10. By: Konstantinos Nikolopoulos (Bangor University); Fotios Petropoulos (University of Bath); Vasco Sanchez Rodrigues (Cardiff University); Stephen Pettit (Cardiff University); Anthony Beresford (Cardiff University)
    Abstract: Major earthquakes are black swan, or quasi-random, events capable of disrupting supply chains across an entire country, region or even the whole world, as the case of the Fukushima disaster profoundly demonstrated. They are amongst the most unpredictable types of natural disasters, and can have a severe impact on supply chains and distribution networks. This research develops a supply chain risk management model in anticipation of such a black swan event. It considers major earthquake data for the period 1985–2014, and temporal as well as spatial aggregation is undertaken. The aim is to identify the optimum grid size at which forecasting variance is minimized and forecastability is maximized. Building on that, a risk-mitigation model is developed. The dynamic model – updated every time a new event is added to the database – includes preparedness, responsiveness and centralization strategies for the different levels of temporal and geographical aggregation.
    Keywords: Risk, Black Swans, Forecastability, Statistical Aggregation, Disaster Relief
    Date: 2019–08
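The temporal-aggregation step the abstract describes can be sketched by bucketing event timestamps at different widths and measuring how noisy the resulting count series is. The coefficient of variation used here as a rough forecastability proxy (lower = smoother, easier-to-forecast series) and the toy timestamps are assumptions, not the paper's exact criterion:

```python
import math

def temporal_cov(event_times, bucket, horizon):
    """Aggregate event timestamps into buckets of a given width and return
    the coefficient of variation of the resulting count series. Coarser
    buckets typically smooth the series, lowering the CoV."""
    n_buckets = int(horizon // bucket)
    counts = [0] * n_buckets
    for t in event_times:
        i = int(t // bucket)
        if i < n_buckets:
            counts[i] += 1
    mean = sum(counts) / n_buckets
    var = sum((c - mean) ** 2 for c in counts) / n_buckets
    return math.sqrt(var) / mean if mean else float("inf")
```

Sweeping `bucket` (and, in the spatial dimension, the grid cell size) over candidate values and picking the level that minimizes this kind of variability measure mirrors the paper's search for the optimum aggregation level.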
  11. By: Fernando Borraz (Banco Central del Uruguay; Departamento de Economía. Facultad de Ciencias Sociales. Universidad de la República (Uruguay)); Laura Zacheo (Banco Central del Uruguay)
    Abstract: This paper uses a rich and unique data set with eight years of monthly inflation expectations and subjective probability distributions to analyze the expectation-formation process of firms in Uruguay. First, firms exhibit a very high degree of attention to current inflation conditions, which we link to the country's historical inflation experience. Second, forecasters fail to incorporate all of the available information; moreover, firms' forecasts are more accurate than those of professional forecasters in Uruguay. Third, there is disagreement between forecasters in the short run but also in the long run, and the disagreement is higher for forecasters that revise than for forecasters that do not. Therefore, there must be some noise or friction that prevents agents that change prices from accessing perfect information. Fourth, the disagreement is not fully explained by differences in the information set, because one in five forecasts is not internally consistent.
    Keywords: inflation expectation, inattention, disagreement, subjective probability distribution
    JEL: D84 E31 E58
    Date: 2018

This nep-for issue is ©2019 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on the NEP homepage. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.