nep-for New Economics Papers
on Forecasting
Issue of 2019‒02‒18
eight papers chosen by
Rob J Hyndman
Monash University

  1. Cross-temporal aggregation: Improving the forecast accuracy of hierarchical electricity consumption By Spiliotis, Evangelos; Petropoulos, Fotios; Kourentzes, Nikolaos; Assimakopoulos, Vassilios
  2. Testing for Changes in Forecasting Performance By Pierre Perron; Yohei Yamamoto
  3. Multivariate Bayesian Predictive Synthesis in Macroeconomic Forecasting By Kenichiro McAlinn; Knut Are Aastveit; Jouchi Nakajima; Mike West
  4. A Horse Race in High Dimensional Space By Paolo Andreini; Donato Ceci
  5. Forecasting Imports with Information from Abroad By Christian Grimme; Robert Lehmann; Marvin Noeller
  6. Improved methods for combining point forecasts for an asymmetrically distributed variable By Ozer Karagedikli; Shaun P. Vahey; Elizabeth C. Wakerly
  7. Density Forecasting By Federico Bassetti; Roberto Casarin; Francesco Ravazzolo
  8. Budgetary Traffic Lights By Matus Kubik; Pavol Majher

  1. By: Spiliotis, Evangelos; Petropoulos, Fotios; Kourentzes, Nikolaos; Assimakopoulos, Vassilios
    Abstract: Achieving high accuracy in load forecasting requires the selection of appropriate forecasting models that are able to capture the special characteristics of energy consumption time series. When hierarchies of load from different sources are considered together, the complexity increases further; for example, when forecasting at both the system and the region level. Not only is the model selection problem expanded to multiple time series, but we also require aggregation consistency of the forecasts across levels. Although hierarchical forecasting can address the aggregation consistency concerns, it does not resolve the model selection uncertainty. To address this, we rely on Multiple Temporal Aggregation, which has been shown to mitigate the model selection problem for low-frequency time series. We propose a modification for high-frequency time series and combine conventional cross-sectional hierarchical forecasting with multiple temporal aggregation. The effect of incorporating temporal aggregation in hierarchical forecasting is empirically assessed using a real data set from five bank branches, demonstrating superior accuracy, aggregation consistency and reliable automatic forecasting.
    Keywords: Temporal aggregation; Hierarchical forecasting; Electricity load; Exponential smoothing; MAPA
    JEL: C4 C53 D8 D81 L94
    Date: 2018–07
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:91762&r=all
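    A minimal sketch of the idea of combining temporal aggregation with bottom-up cross-sectional reconciliation, in the spirit of the abstract above; the series, aggregation levels and smoothing parameter are invented for illustration and this is not the authors' code.
      import numpy as np

      def ses_forecast(y, alpha=0.3):
          """Simple exponential smoothing; returns a one-step-ahead forecast."""
          level = y[0]
          for obs in y[1:]:
              level = alpha * obs + (1 - alpha) * level
          return level

      def mta_forecast(y, agg_levels=(1, 2, 4)):
          """Forecast the next period by averaging forecasts produced at
          several temporal aggregation levels (a MAPA-style combination)."""
          forecasts = []
          for k in agg_levels:
              n = (len(y) // k) * k                    # trim so the series divides evenly
              agg = y[len(y) - n:].reshape(-1, k).sum(axis=1)
              forecasts.append(ses_forecast(agg) / k)  # back to the base frequency
          return np.mean(forecasts)

      # Bottom-up reconciliation: forecast each region and sum for the system,
      # so the hierarchy adds up by construction.
      rng = np.random.default_rng(0)
      regions = {name: 100 + rng.normal(0, 5, 96) for name in ["north", "south", "west"]}
      region_fc = {name: mta_forecast(series) for name, series in regions.items()}
      system_fc = sum(region_fc.values())
      print(region_fc, system_fc)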
  2. By: Pierre Perron (Boston University); Yohei Yamamoto (Hitotsubashi University)
    Abstract: We consider the issue of forecast failure (or breakdown) and propose methods to assess retrospectively whether a given forecasting model provides forecasts which show evidence of changes with respect to some loss function. We adapt the classical structural change tests to the forecast failure context. First, we recommend that all tests should be carried out with a fixed scheme to have the best power. This ensures a maximum difference between the fitted in-sample and out-of-sample means of the losses and avoids contamination issues under the rolling and recursive schemes. With a fixed scheme, Giacomini and Rossi’s (2009) (GR) test is simply a Wald test for a one-time change in the mean of the total (the in-sample plus out-of-sample) losses at a known break date, say m, the value that separates the in-sample and out-of-sample periods, which restricts the break to that single date. To alleviate this restriction, we consider a variety of tests: maximizing the GR test over values of m within a pre-specified range; a Double sup-Wald (DSW) test, which, for each m, performs a sup-Wald test for a change in the mean of the out-of-sample losses and takes the maximum of such tests over some range; and we also propose to work directly with the total loss series to define the Total Loss sup-Wald (TLSW) and Total Loss UDmax (TLUD) tests. Using theoretical analyses and simulations, we show that with forecasting models potentially involving lagged dependent variables, the only tests having a monotonic power function for all data-generating processes considered are the DSW and TLUD tests, constructed with a fixed forecasting window scheme. Some explanations are provided and empirical applications illustrate the relevance of our findings in practice.
    Keywords: forecast breakdown, non-monotonic power, structural change, out-of-sample forecast
    JEL: C14 C22
    Date: 2018–05
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2019-003&r=all
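    A minimal sketch of the sup-Wald idea underlying the DSW and TLSW statistics: maximise, over candidate break dates, a Wald statistic for a one-time shift in the mean of a loss series. Serial-correlation corrections and proper sup-Wald critical values, which the paper relies on, are omitted here.
      import numpy as np

      def sup_wald_mean_change(losses, trim=0.15):
          """Maximise, over candidate break dates, the Wald statistic for a
          shift in the mean of the loss series."""
          losses = np.asarray(losses, dtype=float)
          T = len(losses)
          lo, hi = int(trim * T), int((1 - trim) * T)
          best = 0.0
          for k in range(lo, hi):
              pre, post = losses[:k], losses[k:]
              pooled_var = (((pre - pre.mean()) ** 2).sum()
                            + ((post - post.mean()) ** 2).sum()) / (T - 2)
              wald = (pre.mean() - post.mean()) ** 2 / (pooled_var * (1 / k + 1 / (T - k)))
              best = max(best, wald)
          return best

      # Example: squared forecast errors whose mean doubles halfway through.
      rng = np.random.default_rng(1)
      losses = np.concatenate([rng.chisquare(1, 100), 2 * rng.chisquare(1, 100)])
      print(sup_wald_mean_change(losses))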
  3. By: Kenichiro McAlinn; Knut Are Aastveit; Jouchi Nakajima; Mike West
    Abstract: We present new methodology and a case study in the use of a class of Bayesian predictive synthesis (BPS) models for multivariate time series forecasting. This extends the foundational BPS framework to the multivariate setting, with detailed application in the topical and challenging context of multi-step macroeconomic forecasting in a monetary policy setting. BPS evaluates, sequentially and adaptively over time, varying forecast biases and facets of miscalibration of individual forecast densities for multiple time series and, critically, their time-varying interdependencies. We define BPS methodology for a new class of dynamic multivariate latent factor models implied by BPS theory. Structured dynamic latent factor BPS is here motivated by the application context: sequential forecasting of multiple US macroeconomic time series with forecasts generated from several traditional econometric time series models. The case study highlights the potential of BPS to improve forecasts of multiple series at multiple forecast horizons, and its use in learning dynamic relationships among forecasting models or agents.
    Keywords: Agent opinion analysis, Bayesian forecasting, Dynamic latent factor models, Dynamic SURE models, Macroeconomic forecasting, Multivariate density forecast combination
    Date: 2019–01
    URL: http://d.repec.org/n?u=RePEc:bny:wpaper:0073&r=all
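    A drastically simplified, univariate analogue of the synthesis idea: agents' point forecasts are combined through a regression with time-varying weights, updated sequentially by a Kalman filter with random-walk coefficients. The paper itself synthesises full forecast densities through multivariate dynamic latent factor models, so everything below is illustrative only.
      import numpy as np

      def dynamic_synthesis(y, F, state_var=0.01, obs_var=1.0):
          """y: outcomes (T,); F: agents' point forecasts (T, J).
          Returns one-step-ahead synthesised forecasts."""
          T, J = F.shape
          X = np.column_stack([np.ones(T), F])        # intercept + agent forecasts
          theta = np.zeros(J + 1)                     # coefficient mean
          P = np.eye(J + 1)                           # coefficient covariance
          preds = np.empty(T)
          for t in range(T):
              P = P + state_var * np.eye(J + 1)       # random-walk evolution
              x = X[t]
              preds[t] = x @ theta                    # forecast before seeing y[t]
              S = x @ P @ x + obs_var                 # one-step forecast variance
              K = P @ x / S                           # Kalman gain
              theta = theta + K * (y[t] - preds[t])   # update the combination weights
              P = P - np.outer(K, x @ P)
          return preds

      rng = np.random.default_rng(2)
      truth = np.cumsum(rng.normal(size=200))
      agents = np.column_stack([truth + rng.normal(0, s, 200) for s in (0.5, 1.0, 2.0)])
      print(np.mean((dynamic_synthesis(truth, agents) - truth) ** 2))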
  4. By: Paolo Andreini (University of Rome "Tor Vergata"); Donato Ceci (University of Rome "Tor Vergata" & Bank of Italy)
    Abstract: In this paper, we study the predictive power of dense and sparse estimators in a high-dimensional space. We propose a new forecasting method, called Elastically Weighted Principal Components Analysis (EWPCA), that selects variables with respect to the target variable, taking into account the collinearity among the data using Elastic Net soft thresholding. We then weight the selected predictors by their Elastic Net regression coefficients and finally apply principal component analysis to the new “elastically” weighted data matrix. We compare this method with common benchmarks and other methods for forecasting macroeconomic variables in a data-rich environment, divided into dense representations, such as Dynamic Factor Models and Ridge regressions, and sparse representations, such as LASSO regression. All these models are adapted to take into account the linear dependency of the macroeconomic time series. Moreover, to estimate the hyperparameters of these models, including the EWPCA, we propose a new procedure called “brute force”. This method allows us to treat all the hyperparameters of the model uniformly and to take the longitudinal feature of the time-series data into account. Our findings can be summarized as follows. First, the “brute force” method of estimating the hyperparameters is more stable and gives better forecasting performance, in terms of mean square forecast error (MSFE), than the traditional criteria used in the literature to tune the hyperparameters. This result holds for all sample sizes and forecasting horizons. Secondly, our two-step forecasting procedure enhances the interpretability of the forecasts. Lastly, the EWPCA leads to better forecasting performance, in terms of MSFE, than the other sparse and dense methods or the naïve benchmark, at different forecast horizons and sample sizes.
    Keywords: Variable selection, High-dimensional time series, Dynamic factor models, Shrinkage methods, Cross-validation
    JEL: C22 C52 C53
    Date: 2019–02–14
    URL: http://d.repec.org/n?u=RePEc:rtv:ceisrp:452&r=all
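    A minimal sketch of the three-step EWPCA construction described in the abstract (Elastic Net selection, coefficient weighting, then PCA); the tuning values are arbitrary, and the paper's “brute force” hyperparameter search and forecasting design are not reproduced.
      import numpy as np
      from sklearn.linear_model import ElasticNet, LinearRegression
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 50))                       # candidate predictors
      y = X[:, :5] @ rng.normal(size=5) + rng.normal(size=200)

      # Step 1: Elastic Net soft thresholding selects predictors w.r.t. the target.
      enet = ElasticNet(alpha=0.1, l1_ratio=0.5).fit(X, y)
      selected = np.flatnonzero(enet.coef_)

      # Step 2: weight the selected predictors by their Elastic Net coefficients.
      X_weighted = X[:, selected] * enet.coef_[selected]

      # Step 3: principal components of the "elastically" weighted data matrix
      # feed a forecasting regression.
      factors = PCA(n_components=min(3, len(selected))).fit_transform(X_weighted)
      fit = LinearRegression().fit(factors, y)
      print(selected, fit.score(factors, y))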
  5. By: Christian Grimme; Robert Lehmann; Marvin Noeller
    Abstract: Globalization has led to huge increases in import volumes, increasing the importance of imports for total output. Since imports are a volatile component, they are difficult to forecast and strongly influence the forecast accuracy of gross domestic product. We introduce the first leading indicator constructed to forecast import growth, the Import Climate. It builds on the idea that the import demand of the domestic country should be reflected in the expected export developments of its main trading partners. A foreign country’s expected exports are, in turn, determined by its trading partners’ business and consumer confidence and its own price competitiveness. In a real-time forecasting experiment, the Import Climate outperforms standard business cycle indicators at short horizons for France, Germany, Italy, and the United States for the first release of data. For Spain and the United Kingdom, our indicator works particularly well with the latest vintage of data.
    Keywords: Import climate, import forecasting, survey data, price competitiveness
    JEL: F01 F10 F17
    Date: 2019
    URL: http://d.repec.org/n?u=RePEc:ces:ifowps:_294&r=all
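    A minimal sketch of the two-stage weighting behind an import-climate-style indicator: a trading partner's expected exports are proxied by confidence among its own customers blended with its price competitiveness, and these are aggregated with the home country's import shares. All weights and series below are invented for illustration.
      import numpy as np

      def expected_exports(customer_confidence, weights, price_competitiveness, mix=0.8):
          """Confidence of a country's customers, blended with its own
          price competitiveness."""
          return (mix * np.average(customer_confidence, weights=weights)
                  + (1 - mix) * price_competitiveness)

      def import_climate(import_shares, partner_expected_exports):
          """Home-country indicator: import-share weighted expected exports
          of its main trading partners."""
          return np.average(partner_expected_exports, weights=import_shares)

      # Two hypothetical trading partners of the home country.
      exp_exports = [
          expected_exports(np.array([101.2, 99.8]), np.array([0.6, 0.4]), 100.5),
          expected_exports(np.array([98.5, 102.0]), np.array([0.5, 0.5]), 99.0),
      ]
      print(import_climate(np.array([0.7, 0.3]), np.array(exp_exports)))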
  6. By: Ozer Karagedikli; Shaun P. Vahey; Elizabeth C. Wakerly
    Abstract: Many studies have found that combining forecasts improves predictive accuracy. An often-used approach developed by Granger and Ramanathan (GR, 1984) utilises a linear-Gaussian regression model to combine point forecasts. This paper generalises their approach for an asymmetrically distributed target variable. Our copula point forecast combination methodology involves fitting marginal distributions for the target variable and the individual forecasts being combined; and then estimating the correlation parameters capturing linear dependence between the target and the experts’ predictions. If the target variable and experts’ predictions are individually Gaussian distributed, our copula point combination reproduces the GR combination. We illustrate our methodology with two applications examining quarterly forecasts for the Federal Funds rate and for US output growth, respectively. The copula point combinations outperform the forecasts from the individual experts in both applications, with gains in root mean squared forecast error in the region of 40% for the Federal Funds rate and 4% for output growth relative to the GR combination. The fitted marginal distribution for the interest rate exhibits strong asymmetry.
    Keywords: Forecast combination, Copula modelling, Interest rates, Vulnerable economic growth
    Date: 2019–02
    URL: http://d.repec.org/n?u=RePEc:een:camaaa:2019-15&r=all
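    A minimal sketch of a Gaussian-copula point combination: map the target and each expert's forecasts to normal scores through their marginals (here, empirical ranks), estimate the correlation matrix, and read off the conditional median of the target given new forecasts. This illustrates the general idea rather than the authors' estimation procedure.
      import numpy as np
      from scipy.stats import norm, rankdata

      def to_scores(x):
          """Probability integral transform via empirical ranks, then inverse normal."""
          return norm.ppf(rankdata(x) / (len(x) + 1))

      def copula_combine(y_hist, fc_hist, fc_new):
          """y_hist: past outcomes (T,); fc_hist: past expert forecasts (T, J);
          fc_new: current expert forecasts (J,). Returns a combined point forecast."""
          T, J = fc_hist.shape
          Z = np.column_stack([to_scores(y_hist)] + [to_scores(fc_hist[:, j]) for j in range(J)])
          R = np.corrcoef(Z, rowvar=False)
          w = np.linalg.solve(R[1:, 1:], R[1:, 0])   # conditional-mean weights in score space
          z_new = np.empty(J)                        # new forecasts on each expert's marginal
          for j in range(J):
              u = (np.searchsorted(np.sort(fc_hist[:, j]), fc_new[j]) + 0.5) / (T + 1)
              z_new[j] = norm.ppf(u)
          z0 = w @ z_new                             # conditional median in score space
          return np.quantile(y_hist, norm.cdf(z0))   # back through the target's marginal

      rng = np.random.default_rng(4)
      y = rng.gamma(2.0, 1.0, 300)                   # asymmetric target variable
      experts = np.column_stack([y + rng.normal(0, s, 300) for s in (0.5, 1.0)])
      print(copula_combine(y[:-1], experts[:-1], experts[-1]))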
  7. By: Federico Bassetti (Politecnico of Milan, Italy); Roberto Casarin (University Ca' Foscari of Venice, Italy); Francesco Ravazzolo (Free University of Bolzano‐Bozen Faculty of Economics, Italy and BI Norwegian Business School)
    Abstract: This paper reviews different methods to construct density forecasts and to aggregate forecasts from many sources. Density evaluation tools to measure the accuracy of density forecasts are reviewed, and calibration methods for improving the accuracy of forecasts are presented. The manuscript provides some numerical simulation tools to approximate predictive densities, with a focus on parallel computing on graphics processing units. Some simple examples are proposed to illustrate the methods.
    Keywords: Density forecasting, density combinations, density evaluation, bootstrapping, Bayesian inference, Monte Carlo simulations, GPU computing
    JEL: C10 C11 C15 C53 C63
    Date: 2019–02
    URL: http://d.repec.org/n?u=RePEc:bzn:wpaper:bemps59&r=all
  8. By: Matus Kubik (Council for Budget Responsibility); Pavol Majher (Council for Budget Responsibility)
    Abstract: In this paper, we develop a framework to assess and communicate short-term risks to the general government budget. The project is motivated by the need to keep budgetary developments under constant surveillance and to identify sources of fiscal stress in a timely manner, so that the government can mitigate or even eliminate them by taking appropriate measures. We consider a concept of fiscal risk defined in terms of the expected deviation of the budget balance from its fiscal target. Our framework consists of three parts. First, we collect data input from various sources and identify potential one-off effects. Second, we forecast end-of-year values for the revenue and expenditure items of the budget, using a forecasting method that combines a simple heuristic time-series approach with expert assessments. Third, we use the forecast output to evaluate and communicate the fiscal risk level for the current fiscal year. Specifically, we use traffic-light colors to report expected deviations of fiscal outcomes from their budgeted targets.
    Keywords: government budget balance, fiscal risk, budget forecast
    JEL: C33 C53 H60 H68
    Date: 2018–12
    URL: http://d.repec.org/n?u=RePEc:cbe:dpaper:201802&r=all
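    A minimal sketch of the traffic-light step: extrapolate the end-of-year balance with a simple heuristic, measure the expected deviation from the fiscal target, and map it to a colour. The thresholds and figures are purely illustrative, not those used by the Council.
      def end_of_year_forecast(ytd_current, full_year_last, ytd_last):
          """Heuristic: scale this year's year-to-date outturn by last year's
          remaining-year profile."""
          return ytd_current * full_year_last / ytd_last

      def traffic_light(forecast_balance, target_balance, amber=0.3, red=0.7):
          """Classify the expected deviation (in % of GDP) from the target."""
          deviation = target_balance - forecast_balance    # positive = worse than target
          if deviation >= red:
              return "red"
          if deviation >= amber:
              return "amber"
          return "green"

      balance_fc = end_of_year_forecast(ytd_current=-0.9, full_year_last=-1.6, ytd_last=-1.1)
      print(balance_fc, traffic_light(balance_fc, target_balance=-1.0))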

This nep-for issue is ©2019 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.