nep-for New Economics Papers
on Forecasting
Issue of 2019‒10‒28
ten papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting Swiss Exports using Bayesian Forecast Reconciliation By Florian Eckert; Rob Hyndman; Anastasios Panagiotelis
  2. Feature-based Forecast-Model Performance Prediction By Thiyanga S. Talagala; Feng Li; Yanfei Kang
  3. Forecast Reconciliation: A Geometric View with New Insights on Bias Correction By Anastasios Panagiotelis; Puwasala Gamakumara; George Athanasopoulos; Rob J Hyndman
  4. Predicting recessions: financial cycle versus term spread By Claudio Borio; Mathias Drehmann; Dora Xia
  5. Forecasting Observables with Particle Filters: Any Filter Will Do! By Patrick Leung; Catherine S. Forbes; Gael M Martin; Brendan McCabe
  6. Microfounded forecasting By Gaglianone, Wagner Piazza; Issler, João Victor
  7. How is Machine Learning Useful for Macroeconomic Forecasting? By Philippe Goulet Coulombe; Maxime Leroux; Dalibor Stevanovic; Stéphane Surprenant
  8. Fast and Flexible Bayesian Inference in Time-varying Parameter Regression Models By Niko Hauzenberger; Florian Huber; Gary Koop; Luca Onorante
  9. BVAR: Bayesian Vector Autoregressions with Hierarchical Prior Selection in R By Nikolas Kuschnig; Lukas Vashold
  10. Forecasting under Long Memory and Nonstationarity By Uwe Hassler; Marc-Oliver Pohle

  1. By: Florian Eckert (KOF Swiss Economic Institute, ETH Zurich, Switzerland); Rob Hyndman (Department of Econometrics & Business Statistics, Monash University, Australia); Anastasios Panagiotelis (Department of Econometrics & Business Statistics, Monash University, Australia)
    Abstract: This paper conducts an extensive forecasting study on 13,118 time series measuring Swiss goods exports, grouped hierarchically by export destination and product category. We apply existing state-of-the-art methods in forecast reconciliation and introduce a novel Bayesian reconciliation framework. This approach allows for explicit estimation of reconciliation biases, leading to several innovations: prior judgment can be used to assign weights to specific forecasts, and negative reconciled forecasts can be ruled out. Overall, we find strong evidence that in addition to producing coherent forecasts, reconciliation also improves forecast accuracy.
    Keywords: Hierarchical Forecasting, Bayesian Forecast Reconciliation, Swiss Exports, Optimal Forecast Combination.
    JEL: C32 C53 E17
    Date: 2019–07
  2. By: Thiyanga S. Talagala; Feng Li; Yanfei Kang
    Abstract: This paper introduces a novel meta-learning algorithm for time series forecasting. An efficient Bayesian multivariate surface regression approach is used to model forecast error as a function of features calculated from the time series. The minimum predicted forecast error is then used to identify an individual model, or a combination of models, to produce forecasts. In general, the performance of any meta-learner depends strongly on the reference dataset used to train the model. We further examine the feasibility of using GRATIS (a feature-based time series simulation approach) to generate a realistic and diverse collection of time series for our reference set. The proposed framework is tested on the M4 competition data and compared against several benchmarks and other commonly used forecasting approaches. The new approach achieves performance comparable to the second- and third-ranked entries in the M4 competition.
    Keywords: time series, meta-learning, mixture autoregressive models, surface regression, M4 competition
    JEL: C10 C14 C22
    Date: 2019
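A minimal numerical sketch of the feature-based idea above (an editor's illustration with synthetic data and ordinary least squares standing in for the paper's Bayesian multivariate surface regression): compute features from each reference series, regress each candidate model's recorded forecast error on those features, then pick the candidate with the smallest predicted error for a new series.

```python
import numpy as np

rng = np.random.default_rng(0)

def ts_features(y):
    """Two toy features plus an intercept: lag-1 autocorrelation and trend slope."""
    y = np.asarray(y, dtype=float)
    yc = y - y.mean()
    acf1 = float(yc[1:] @ yc[:-1] / (yc @ yc))
    slope = float(np.polyfit(np.arange(len(y)), y, 1)[0])
    return np.array([1.0, acf1, slope])

# Reference collection: simulated series with per-model forecast errors
# (synthetic stand-ins; the paper obtains these from a GRATIS-style reference set).
ref = [rng.standard_normal(60).cumsum() for _ in range(300)]
F = np.vstack([ts_features(y) for y in ref])          # n x 3 feature matrix
err_A = np.abs(F @ np.array([0.2, 0.1, 1.0])) + rng.exponential(0.1, 300)
err_B = np.abs(F @ np.array([0.2, 1.0, 0.1])) + rng.exponential(0.1, 300)

# Meta-learner: one linear error regression per candidate model.
beta_A, *_ = np.linalg.lstsq(F, err_A, rcond=None)
beta_B, *_ = np.linalg.lstsq(F, err_B, rcond=None)

# For a new series, predict each model's error and select the minimum.
f_new = ts_features(rng.standard_normal(60).cumsum())
pred_err = np.array([f_new @ beta_A, f_new @ beta_B])
chosen = int(np.argmin(pred_err))
```

The same selection rule extends to weighting several models by their predicted errors rather than picking a single winner.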
  3. By: Anastasios Panagiotelis; Puwasala Gamakumara; George Athanasopoulos; Rob J Hyndman
    Abstract: A geometric interpretation is developed for so-called reconciliation methodologies used to forecast time series that adhere to known linear constraints. In particular, a general framework is established nesting many existing popular reconciliation methods within the class of projections. This interpretation facilitates the derivation of novel results that explain why and how reconciliation via projection is guaranteed to improve forecast accuracy with respect to a specific class of loss functions. The result is also demonstrated empirically. The geometric interpretation is further used to provide a new proof that forecast reconciliation results in unbiased forecasts provided the initial base forecasts are also unbiased. Approaches for dealing with biased base forecasts are proposed and explored in an extensive empirical study on Australian tourism flows. Overall, the method of bias-correcting before carrying out reconciliation is shown to outperform alternatives that only bias-correct or only reconcile forecasts.
    Date: 2019
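The projection view described above can be illustrated in a few lines (an editor's sketch for a two-level hierarchy, using the OLS projection as one member of the class of projections the paper studies):

```python
import numpy as np

# Two-level hierarchy: Total = A + B. The summing matrix S maps the two
# bottom-level series to the full vector (Total, A, B).
S = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Incoherent base forecasts for (Total, A, B): note Total != A + B here.
y_hat = np.array([102.0, 60.0, 45.0])

# OLS reconciliation: orthogonal projection onto the coherent subspace
# spanned by the columns of S. Weighted projections such as MinT replace
# this with S (S' W^-1 S)^-1 S' W^-1 for a forecast-error covariance W.
P = S @ np.linalg.inv(S.T @ S) @ S.T
y_tilde = P @ y_hat
```

Because `y_tilde` lies in the column space of S, the reconciled total automatically equals the sum of its parts, and P being an (idempotent) projection is what drives the accuracy guarantees the paper derives.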
  4. By: Claudio Borio; Mathias Drehmann; Dora Xia
    Abstract: Financial cycles can be important drivers of real activity, but there is scant evidence about how well they signal recession risks. We run a horse race between the term spread - the most widely used indicator in the literature - and a range of financial cycle measures. Unlike most papers, ours assesses forecasting performance not just for the United States but also for a panel of advanced and emerging market economies. We find that financial cycle measures have significant forecasting power both in and out of sample, even for a three-year horizon. Moreover, they outperform the term spread in nearly all specifications. These results are robust to different recession specifications.
    Keywords: financial cycle, term spread, recession risk, panel probit model
    JEL: C33 E37 E44
    Date: 2019–10
  5. By: Patrick Leung; Catherine S. Forbes; Gael M Martin; Brendan McCabe
    Abstract: We investigate the impact of filter choice on forecast accuracy in state space models. The filters are used both to estimate the posterior distribution of the parameters, via a particle marginal Metropolis-Hastings (PMMH) algorithm, and to produce draws from the filtered distribution of the final state. Multiple filters are entertained, including two new data-driven methods. Simulation exercises are used to document the performance of each PMMH algorithm, in terms of computation time and the efficiency of the chain. We then produce the forecast distributions for the one-step-ahead value of the observed variable, using a fixed number of particles and Markov chain draws. Despite distinct differences in efficiency, the filters yield virtually identical forecast accuracy, with this result holding under both correct and incorrect specification of the model. This invariance of forecast performance to the specification of the filter also characterizes an empirical analysis of S&P500 daily returns.
    Keywords: Bayesian prediction, particle MCMC, non-Gaussian time series, state space models, unbiased likelihood estimation, sequential Monte Carlo
    JEL: C11 C22 C58
    Date: 2019
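The key ingredient behind PMMH is an unbiased likelihood estimate from a particle filter. A minimal sketch of the bootstrap filter for a toy linear Gaussian state space model (an editor's illustration; the paper's filters and models are more general) looks like this:

```python
import numpy as np

def bootstrap_pf_loglik(y, phi, sigma_x, sigma_y, n_particles=500, seed=0):
    """Bootstrap particle filter log-likelihood estimate for
       x_t = phi * x_{t-1} + sigma_x * e_t,   y_t = x_t + sigma_y * u_t.
    The likelihood estimate is unbiased, which is what PMMH requires."""
    rng = np.random.default_rng(seed)
    # Initialize particles from the stationary state distribution.
    x = rng.normal(0.0, sigma_x / np.sqrt(1 - phi**2), n_particles)
    loglik = 0.0
    for yt in y:
        # Propagate through the state transition (the "bootstrap" proposal).
        x = phi * x + sigma_x * rng.standard_normal(n_particles)
        # Weight particles by the measurement density (log-sum-exp for stability).
        logw = -0.5 * ((yt - x) / sigma_y) ** 2 - np.log(sigma_y * np.sqrt(2 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())
        # Multinomial resampling.
        x = rng.choice(x, size=n_particles, p=w / w.sum())
    return loglik

# Simulate data from the model and evaluate the likelihood estimate.
rng = np.random.default_rng(1)
T, phi, sx, sy = 100, 0.8, 0.5, 0.3
x_true, ys = 0.0, []
for _ in range(T):
    x_true = phi * x_true + sx * rng.standard_normal()
    ys.append(x_true + sy * rng.standard_normal())
ll = bootstrap_pf_loglik(np.array(ys), phi, sx, sy)
```

Inside PMMH this estimate replaces the exact likelihood in the Metropolis-Hastings acceptance ratio; the paper's point is that, for forecasting, the choice among such filters barely matters.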
  6. By: Gaglianone, Wagner Piazza; Issler, João Victor
    Abstract: This paper proposes a financial approach to economic forecasting which can be applied to databases of surveys of forecasts. We model the forecasting decision of an individual from first principles (i.e., microfounded) and show that surveys of forecasts obey an affine factor structure with a single factor, which is the conditional expectation of the target variable based on common information (public and private). This holds in a context where individuals have access to public information and also to private information with common and idiosyncratic components. We show that asymptotically efficient forecasts of the target variable can be built using the generalized method of moments in a panel-data context, when N and T diverge or when T diverges with N fixed. In this context, the optimal forecast is a function of the consensus forecast of the survey (a cross-sectional average of survey forecasts) after appropriately filtering out two bias terms. This links the financial approach to economic forecasting to the forecast-combination literature, where the idiosyncratic risk of individual forecasts can be diversified out. Our microfounded approach is applied to a world-class database on surveys of expectations, and the techniques advanced here fare best when compared with competitive alternatives.
    Date: 2019–10–15
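A stylized numerical version of the bias-corrected consensus idea (an editor's toy with a single additive bias term and synthetic data; the paper's GMM estimators filter out two bias terms in a formal panel setting): averaging across forecasters diversifies away idiosyncratic noise, and subtracting an estimated bias improves on the raw consensus.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 30                       # time periods and forecasters
target = rng.standard_normal(T)      # realized values of the target variable

# Toy survey: each forecaster = target + common bias + idiosyncratic noise.
common_bias = 0.4
forecasts = target[:, None] + common_bias + 0.8 * rng.standard_normal((T, N))

# Consensus forecast: the cross-sectional average. Idiosyncratic noise
# averages out as N grows, leaving the (estimable) bias term.
consensus = forecasts.mean(axis=1)
bias_hat = (consensus[:100] - target[:100]).mean()   # estimate bias in-sample
corrected = consensus[100:] - bias_hat               # filter it out of sample

mse_raw = np.mean((consensus[100:] - target[100:]) ** 2)
mse_corr = np.mean((corrected - target[100:]) ** 2)
```

With the synthetic bias above, the corrected consensus has markedly lower out-of-sample mean squared error than the raw consensus.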
  7. By: Philippe Goulet Coulombe; Maxime Leroux; Dalibor Stevanovic; Stéphane Surprenant
    Abstract: We move beyond “Is Machine Learning Useful for Macroeconomic Forecasting?” by adding the “how”. The current forecasting literature has focused on matching specific variables and horizons with a particularly successful algorithm. By contrast, we study a wide range of horizons and variables and learn about the usefulness of the underlying features driving ML gains over standard macroeconometric methods. We distinguish four so-called features (nonlinearities, regularization, cross-validation and alternative loss functions) and study their behavior in both data-rich and data-poor environments. To do so, we carefully design a series of experiments that make it easy to identify the “treatment” effects of interest. We conclude that (i) more data and nonlinearities are true game-changers for macroeconomic prediction, (ii) the standard factor model remains the best form of regularization, (iii) cross-validations are not all created equal (but K-fold is as good as BIC), and (iv) one should stick with the standard L2 loss. The forecasting gains of nonlinear techniques are associated with high macroeconomic uncertainty, financial stress and housing bubble bursts. This suggests that machine learning is useful for macroeconomic forecasting mostly by capturing important nonlinearities that arise in the context of uncertainty and financial frictions.
    Keywords: Machine Learning, Big Data, Forecasting
    JEL: C53 C55 E37
    Date: 2019–10–17
  8. By: Niko Hauzenberger; Florian Huber; Gary Koop; Luca Onorante
    Abstract: In this paper, we write the time-varying parameter regression model involving K explanatory variables and T observations as a constant coefficient regression model with TK explanatory variables. In contrast with much of the existing literature which assumes coefficients to evolve according to a random walk, this specification does not restrict the form that the time-variation in coefficients can take. We develop computationally efficient Bayesian econometric methods based on the singular value decomposition of the TK regressors. In artificial data, we find our methods to be accurate and much faster than standard approaches in terms of computation time. In an empirical exercise involving inflation forecasting using a large number of predictors, we find our methods to forecast better than alternative approaches and document different patterns of parameter change than are found with approaches which assume random walk evolution of parameters.
    Date: 2019–10
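The reformulation described above is easy to see in code (an editor's sketch with simulated data; ridge shrinkage via the SVD stands in for the paper's hierarchical Bayesian priors): a time-varying parameter regression y_t = x_t'β_t + ε_t becomes a constant-coefficient regression on T·K regressors, where row t of the design matrix holds x_t in its own block of K columns.

```python
import numpy as np

rng = np.random.default_rng(0)
T, K = 50, 2
X = rng.standard_normal((T, K))
beta = np.cumsum(0.1 * rng.standard_normal((T, K)), axis=0)  # slowly varying coefficients
y = np.sum(X * beta, axis=1) + 0.1 * rng.standard_normal(T)

# Rewrite y_t = x_t' beta_t as a constant-coefficient regression with T*K
# regressors: row t of Z holds x_t in columns tK..tK+K-1, zeros elsewhere.
Z = np.zeros((T, T * K))
for t in range(T):
    Z[t, t * K:(t + 1) * K] = X[t]

# Ridge estimate of the stacked coefficient vector computed through the
# thin SVD of Z, which is what makes the high-dimensional problem cheap.
lam = 1.0
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
beta_stack = Vt.T @ ((s / (s**2 + lam)) * (U.T @ y))
beta_hat = beta_stack.reshape(T, K)   # one coefficient vector per period
```

Nothing here restricts β_t to follow a random walk; any pattern of time variation is admissible, which is the point of the specification.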
  9. By: Nikolas Kuschnig (Vienna University of Economics and Business, Institute of Ecological Economics); Lukas Vashold (Vienna University of Economics and Business, Department of Economics)
    Abstract: Vector autoregression (VAR) models are widely used models for multivariate time series analysis, but often suffer from their dense parameterization. Bayesian methods are commonly employed as a remedy by imposing shrinkage on the model coefficients via informative priors, thereby reducing parameter uncertainty. The subjective choice of the informativeness of these priors is often criticized and can be alleviated via hierarchical modeling. This paper introduces BVAR, an R package dedicated to the estimation of Bayesian VAR models in a hierarchical fashion. It incorporates functionalities that permit addressing a wide range of research problems while retaining an easy-to-use and transparent interface. It features the most commonly used priors in the context of multivariate time series analysis as well as an extensive set of standard methods for analysis. Further functionalities include a framework for defining custom dummy-observation priors, the computation of impulse response functions, forecast error variance decompositions and forecasts.
    Keywords: Vector autoregression, VAR, Bayesian, multivariate, hierarchical, R, package
    JEL: C87 C30 C11
    Date: 2019–10
  10. By: Uwe Hassler; Marc-Oliver Pohle
    Abstract: Long memory, in the sense of slowly decaying autocorrelations, is a stylized fact in many time series from economics and finance. The fractionally integrated process is the workhorse model for the analysis of these time series. Nevertheless, there is mixed evidence in the literature concerning its usefulness for forecasting and how forecasting based on it should be implemented. Employing pseudo-out-of-sample forecasting on inflation and realized volatility time series and in simulations, we show that methods based on fractional integration are clearly superior to alternative methods that do not account for long memory, including autoregressions and exponential smoothing. Our proposal of fixing the fractional integration parameter at $d=0.5$ a priori yields the best results overall, capturing long memory behavior while overcoming the deficiencies of methods using an estimated parameter. Regarding the implementation of forecasting methods based on fractional integration, we use simulations to compare local and global semiparametric and parametric estimators of the long memory parameter from the Whittle family, and we provide asymptotic theory, backed up by simulations, to compare different mean estimators. Both of these analyses lead to new results, which are also of interest outside the realm of forecasting.
    Date: 2019–10
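Forecasting from a fractionally integrated model with a fixed d, as proposed above, amounts to expanding (1-L)^d and setting the expected fractional difference at T+1 to zero (an editor's sketch with a toy series; the paper additionally treats mean estimation and Whittle-type estimators of d):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients pi_k of the binomial expansion (1-L)^d = sum_k pi_k L^k,
    computed by the recursion pi_0 = 1, pi_k = pi_{k-1} * (k - 1 - d) / k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def forecast_fi(y, d=0.5):
    """One-step forecast from a pure fractionally integrated model with fixed d:
    (1-L)^d y_{T+1} = y_{T+1} + sum_{k>=1} pi_k y_{T+1-k} has expectation zero,
    so solve for y_{T+1}."""
    y = np.asarray(y, dtype=float)
    w = frac_diff_weights(d, len(y) + 1)
    return -np.dot(w[1:], y[::-1])

rng = np.random.default_rng(0)
y = 0.1 * rng.standard_normal(200).cumsum()   # toy persistent series
yhat = forecast_fi(y, d=0.5)
```

For d = 0.5 the first weights are pi_1 = -0.5 and pi_2 = -0.125, so the forecast is a slowly decaying weighted average of the past, which is exactly the long-memory behavior the fixed-d rule is meant to capture.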

This nep-for issue is ©2019 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on the NEP website. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.