nep-for New Economics Papers
on Forecasting
Issue of 2018‒02‒26
six papers chosen by
Rob J Hyndman
Monash University

  1. Approximate Bayesian forecasting By David T. Frazier; Worapree Maneesoonthorn; Gael M. Martin; Brendan P.M. McCabe
  2. Optimal forecast reconciliation for hierarchical and grouped time series through trace minimization By Shanika L. Wickramasuriya; George Athanasopoulos; Rob J. Hyndman
  3. An approach to increasing forecast-combination accuracy through VAR error modeling By Till Weigt; Bernd Wilfling
  4. Predicting earnings and cash flows: The information content of losses and tax loss carryforwards By Dreher, Sandra; Eichfelder, Sebastian; Noth, Felix
  5. Mixed frequency models with MA components By Foroni, Claudia; Marcellino, Massimiliano; Stevanović, Dalibor
  6. The analysis and forecasting of ATP tennis matches using a high-dimensional dynamic model By P. Gorgi; Siem Jan (S.J.) Koopman; R. Lit

  1. By: David T. Frazier; Worapree Maneesoonthorn; Gael M. Martin; Brendan P.M. McCabe
    Abstract: Approximate Bayesian Computation (ABC) has become increasingly prominent as a method for conducting parameter inference in a range of challenging statistical problems, most notably those characterized by an intractable likelihood function. In this paper, we focus on the use of ABC not as a tool for parametric inference, but as a means of generating probabilistic forecasts, or for conducting what we refer to as ‘approximate Bayesian forecasting’. The four key issues explored are: i) the link between the theoretical behavior of the ABC posterior and that of the ABC-based predictive; ii) the use of proper scoring rules to measure the (potential) loss of forecast accuracy when using an approximate rather than an exact predictive; iii) the performance of approximate Bayesian forecasting in state space models; and iv) the use of forecast accuracy to inform the selection of ABC summaries in empirical settings. The primary finding of the paper is that ABC can provide a computationally efficient means of generating probabilistic forecasts that are nearly identical to those produced by the exact predictive, and in a fraction of the time required to produce predictions via an exact method.
    Keywords: Bayesian prediction, likelihood-free methods, predictive merging, proper scoring rules, particle filtering, jump-diffusion models.
    JEL: C11 C53 C58
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2018-2&r=for
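    Sketch: a minimal rejection-ABC forecaster for a toy Gaussian location model. This is illustrative only; the paper works with state space and jump-diffusion models, and the model, prior, summary statistic, and tolerance below are all assumptions made for the example.

      # Rejection ABC plus a predictive step: (1) draw parameters from the
      # prior, (2) keep draws whose simulated data roughly match the observed
      # summary, (3) simulate one-step-ahead forecasts from the retained draws.
      import numpy as np

      rng = np.random.default_rng(0)
      y_obs = rng.normal(loc=1.0, scale=1.0, size=100)   # observed data (toy)
      s_obs = y_obs.mean()                               # chosen ABC summary

      accepted = []
      for _ in range(50_000):
          mu = rng.normal(0.0, 5.0)                      # prior draw
          y_sim = rng.normal(mu, 1.0, size=y_obs.size)   # simulate the model
          if abs(y_sim.mean() - s_obs) < 0.05:           # tolerance epsilon
              accepted.append(mu)

      # ABC-based predictive: one forecast draw per retained posterior draw
      forecasts = rng.normal(np.array(accepted), 1.0)
      print(len(accepted), "draws accepted; predictive mean:", forecasts.mean())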
  2. By: Shanika L. Wickramasuriya; George Athanasopoulos; Rob J. Hyndman
    Abstract: Large collections of time series often have aggregation constraints due to product or geographical groupings. The forecasts for the most disaggregated series are usually required to add up exactly to the forecasts of the aggregated series, a constraint we refer to as "coherence". Forecast reconciliation is the process of adjusting forecasts to make them coherent. The reconciliation algorithm proposed by Hyndman et al. (2011) is based on a generalized least squares estimator that requires an estimate of the covariance matrix of the coherency errors (i.e., the errors that arise due to incoherence). We show that this matrix is impossible to estimate in practice due to identifiability conditions. We propose a new forecast reconciliation approach that incorporates the information from a full covariance matrix of forecast errors in obtaining a set of coherent forecasts. Our approach minimizes the mean squared error of the coherent forecasts across the entire collection of time series under the assumption of unbiasedness. The minimization problem has a closed form solution. We make this solution scalable by providing a computationally efficient representation. We evaluate the performance of the proposed method compared to alternative methods using a series of simulation designs which take into account various features of the collected time series. This is followed by an empirical application using Australian domestic tourism data. The results indicate that the proposed method works well with artificial and real data.
    Keywords: aggregation, Australian tourism, coherent forecasts, contemporaneous error correlation, forecast combinations, spatial correlations.
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2017-22&r=for
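    Sketch: a trace-minimization (MinT-style) reconciliation in closed form, y~ = S (S' W^-1 S)^-1 S' W^-1 y^, for a toy two-level hierarchy Total = A + B. The summing matrix, covariance estimate, and base forecasts are made-up illustrations, not the paper's data.

      # S maps the bottom-level series to all series in the hierarchy;
      # W estimates the covariance matrix of the base-forecast errors.
      import numpy as np

      S = np.array([[1.0, 1.0],    # Total = A + B
                    [1.0, 0.0],    # A
                    [0.0, 1.0]])   # B
      y_hat = np.array([10.0, 6.5, 4.0])   # incoherent base forecasts
      W = np.diag([2.0, 1.0, 1.0])         # forecast-error covariance estimate

      W_inv = np.linalg.inv(W)
      G = np.linalg.solve(S.T @ W_inv @ S, S.T @ W_inv)  # (S'W^-1 S)^-1 S'W^-1
      y_tilde = S @ G @ y_hat                            # coherent forecasts
      print(y_tilde, "check:", y_tilde[1] + y_tilde[2])  # equals y_tilde[0]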
  3. By: Till Weigt; Bernd Wilfling
    Abstract: We consider a situation in which the forecaster has available M individual forecasts of a univariate target variable. We propose a 3-step procedure designed to exploit the interrelationships among the M forecast-error series (estimated from a large time-varying parameter VAR model of the errors, using past observations) with the aim of obtaining more accurate predictions of future forecast errors. The refined forecast-error predictions are then used to obtain M new individual forecasts that are adapted to the information from the estimated VAR. The M adapted individual forecasts are ultimately combined, and we analyze any potential accuracy gains of the adapted combination forecasts. We evaluate our approach in an out-of-sample forecasting analysis, using a well-established 7-country data set on output growth. Our 3-step procedure yields substantial accuracy gains (loss reductions ranging from 6.2% to 18%) for the simple average and three time-varying-parameter combination forecasts.
    Keywords: Forecast combinations, large time-varying parameter VARs, Bayesian VAR estimation, state-space model, forgetting factors, dynamic model averaging.
    JEL: C53 C32 C11
    Date: 2018–02
    URL: http://d.repec.org/n?u=RePEc:cqe:wpaper:6818&r=for
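    Sketch: a stylized version of the 3-step idea using a constant-parameter VAR(1) on the forecast errors, fitted by least squares (the paper uses a large time-varying-parameter VAR with forgetting factors). Errors are taken as forecast minus actual, and all numbers are synthetic placeholders.

      import numpy as np

      rng = np.random.default_rng(1)
      T, M = 200, 3                             # T past periods, M forecasters
      E = rng.normal(size=(T, M))               # past forecast errors (toy)
      f_next = np.array([2.1, 1.9, 2.4])        # the M new individual forecasts

      # Step 1: fit a VAR(1) to the errors, E_t = c + A E_{t-1} + u_t
      X = np.hstack([np.ones((T - 1, 1)), E[:-1]])
      B, *_ = np.linalg.lstsq(X, E[1:], rcond=None)

      # Step 2: predict the next-period forecast errors
      e_pred = np.hstack([1.0, E[-1]]) @ B

      # Step 3: adapt the individual forecasts and combine (simple average)
      f_adapted = f_next - e_pred
      print("combined forecast:", f_adapted.mean())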
  4. By: Dreher, Sandra; Eichfelder, Sebastian; Noth, Felix
    Abstract: We analyse the relevance of losses, accounting information on tax loss carryforwards, and deferred taxes for the prediction of earnings and cash flows up to four years ahead. We use a unique hand-collected panel of German listed firms encompassing detailed information on tax loss carryforwards and deferred taxes from the tax footnote. Our out-of-sample predictions show that considering accounting information on tax loss carryforwards and deferred taxes does not enhance the accuracy of performance forecasts and can even worsen performance predictions. We find that common forecasting approaches that treat positive and negative performances equally or that use a dummy variable for negative performance can lead to biased performance forecasts, and we provide a simple empirical specification to account for that issue.
    Keywords: performance forecast, in-sample prediction, out-of-sample prediction, loss persistence, deferred taxes, tax loss carryforwards
    JEL: C53 M40 M41
    Date: 2017
    URL: http://d.repec.org/n?u=RePEc:zbw:iwhdps:302017&r=for
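    Sketch: the common loss-dummy persistence regression the abstract refers to, estimated on synthetic data. The authors' preferred specification differs, and all coefficients and scalings here are invented for illustration.

      # Next-year earnings on current earnings, a loss dummy, and their
      # interaction, so persistence can differ between profit and loss firms.
      import numpy as np

      rng = np.random.default_rng(2)
      e_t = rng.normal(0.05, 0.10, size=1000)    # current earnings (scaled)
      loss = (e_t < 0).astype(float)             # negative-performance dummy
      e_next = 0.02 + 0.8 * e_t - 0.5 * loss * e_t + rng.normal(0, 0.05, 1000)

      X = np.column_stack([np.ones_like(e_t), e_t, loss, loss * e_t])
      beta, *_ = np.linalg.lstsq(X, e_next, rcond=None)
      print("persistence, profit firms:", beta[1], "loss-firm shift:", beta[3])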
  5. By: Foroni, Claudia; Marcellino, Massimiliano; Stevanović, Dalibor
    Abstract: Temporal aggregation in general introduces a moving average (MA) component into the aggregated model. A similar feature emerges when not all but only a few variables are aggregated, which generates a mixed frequency model. The MA component is generally neglected, likely to preserve the possibility of OLS estimation, but the consequences have never been properly studied in the mixed frequency context. In this paper, we show analytically, in Monte Carlo simulations, and in a forecasting application on U.S. macroeconomic variables the relevance of considering the MA component in mixed-frequency MIDAS and unrestricted MIDAS models (MIDAS-ARMA and UMIDAS-ARMA). Specifically, the simulation results indicate that the short-term forecasting performance of MIDAS-ARMA and UMIDAS-ARMA is better than that of, respectively, MIDAS and UMIDAS. The empirical applications to nowcasting U.S. GDP growth, investment growth and GDP deflator inflation confirm this ranking. Moreover, in both the simulation and empirical results, MIDAS-ARMA is better than UMIDAS-ARMA.
    Keywords: temporal aggregation, MIDAS models, ARMA models
    JEL: E37 C53
    Date: 2018
    URL: http://d.repec.org/n?u=RePEc:zbw:bubdps:022018&r=for
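    Sketch: a minimal UMIDAS regression of a quarterly variable on three unrestricted monthly lags, estimated by OLS on toy data. The paper's point is that temporal aggregation also induces an MA error term, which OLS-based (U)MIDAS ignores and whose treatment requires ARMA estimation; that step is omitted here.

      import numpy as np

      rng = np.random.default_rng(3)
      n_q = 120                                  # number of quarters
      x_m = rng.normal(size=3 * n_q)             # monthly indicator series
      X = x_m.reshape(n_q, 3)                    # 3 monthly obs per quarter
      y_q = X @ np.array([0.3, 0.15, 0.05]) + rng.normal(0, 0.2, n_q)

      Z = np.hstack([np.ones((n_q, 1)), X])      # unrestricted MIDAS design
      beta, *_ = np.linalg.lstsq(Z, y_q, rcond=None)
      print("UMIDAS lag coefficients:", beta[1:])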
  6. By: P. Gorgi (VU Amsterdam, The Netherlands); Siem Jan (S.J.) Koopman (VU Amsterdam, The Netherlands; CREATES Aarhus University, Denmark); R. Lit (VU Amsterdam, The Netherlands)
    Abstract: We propose a basic high-dimensional dynamic model for tennis match results with time-varying player-specific abilities for different court surface types. Our statistical model can be treated in a likelihood-based analysis and is capable of handling high-dimensional datasets while the number of parameters remains small. In particular, we analyze 17 years of tennis matches for a panel of over 500 players, which leads to more than 2000 dynamic strength levels. We find that time-varying player-specific abilities for different court surfaces are of key importance for analyzing tennis matches. We further consider several extensions, including player-specific explanatory variables and the treatment of the specific configuration of Grand Slam tournaments. The estimation results can be used to construct rankings of players for different court surface types. Finally, we show that our proposed model is also effective in forecasting, and we provide evidence that it significantly outperforms existing models in forecasting tennis match results.
    Keywords: Sports statistics; Score-driven time series models; Rankings; Forecasting.
    JEL: C32 C53
    Date: 2018–01–26
    URL: http://d.repec.org/n?u=RePEc:tin:wpaper:20180009&r=for
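    Sketch: a toy score-driven (Elo-like) update of surface-specific player strengths, in the spirit of the paper's model. The logistic link, learning rate, and starting strengths are illustrative assumptions, not the authors' estimates.

      import numpy as np

      def logistic(x):
          return 1.0 / (1.0 + np.exp(-x))

      # surface-specific strengths for two hypothetical players
      strength = {("Player A", "clay"): 0.2, ("Player B", "clay"): -0.1}

      def update(winner, loser, surface, lr=0.1):
          gap = strength[(winner, surface)] - strength[(loser, surface)]
          p_win = logistic(gap)              # model-implied win probability
          score = 1.0 - p_win                # gradient of the match log-lik.
          strength[(winner, surface)] += lr * score
          strength[(loser, surface)] -= lr * score
          return p_win

      p = update("Player A", "Player B", "clay")
      print("pre-match P(A beats B on clay):", round(p, 3), strength)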

This nep-for issue is ©2018 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.