nep-for New Economics Papers
on Forecasting
Issue of 2016‒10‒02
ten papers chosen by
Rob J Hyndman
Monash University

  1. Global inflation forecasts By Jonathan Kearns
  2. A Quantile Regression Model for Electricity Peak Demand Forecasting: An Approach to Avoiding Power Blackouts By Niematallah Elamin; Mototsugu Fukushige
  3. A wavelet-based multivariate multiscale approach for forecasting By António Rua
  4. Bias in Official Fiscal Forecasts: Can Private Forecasts Help? By Frankel, Jeffrey A.; Schreger, Jesse
  5. Intuitive and reliable estimates of the output gap from a Beveridge-Nelson filter By Güneş Kamber; James Morley; Benjamin Wong
  6. The Depreciation of the Pound Post-Brexit: Could it have been Predicted? By Vasilios Plakandaras; Rangan Gupta; Mark E. Wohar
  7. Time-Adaptive Probabilistic Forecasts of Electricity Spot Prices with Application to Risk Management. By Brenda López Cabrera; Franziska Schulz
  8. Quality Predictability and the Welfare Benefits from New Products: Evidence from the Digitization of Recorded Music By Luis Aguiar; Joel Waldfogel
  9. Forecasting banking crises with dynamic panel probit models By António R. Antunes; Diana Bonfim; Nuno Monteiro; Paulo M.M. Rodrigues
  10. PARX model for football matches predictions By Giovanni Angelini; Luca De Angelis

  1. By: Jonathan Kearns
    Abstract: Inflation co-moves across countries and several papers have shown that lags of this common inflation can help to forecast country inflation. This paper constructs forecasts of common (or 'global') inflation using survey forecasts of country inflation. These forecasts of global inflation have predictive power for global inflation at a medium horizon (12 months) but not at a longer horizon. Global inflation forecasts, and forecast errors, are correlated with survey forecasts and errors of oil and food prices, and global GDP growth, but not financial variables. For some countries, forecasts of global inflation improve the accuracy of forecasting regressions that include survey forecasts of country inflation. In-sample fit and out-of-sample forecasting exercises suggest that forecasts of global inflation generally contain more information for forecasting country inflation than do lags of global inflation. However, for most countries, lagged or forecast global inflation does not improve the accuracy of survey forecasts of country inflation. Whatever information global inflation may include about country inflation, for most countries it seems that survey forecasts of country inflation have historically already incorporated that information.
    Keywords: Global inflation, inflation forecasts, survey forecasts
    Date: 2016–09
  2. By: Niematallah Elamin (Graduate School of Economics, Osaka University); Mototsugu Fukushige (Graduate School of Economics, Osaka University)
    Abstract: Electricity peak demand forecasting is a key exercise undertaken to avoid power blackouts and system failure. In this paper, the next day's peak load demand is estimated and forecasted. The challenge is to generate a peak demand forecast capable of avoiding the risk of a power blackout. We take an empirical approach to the question of estimating quantiles to indicate forecast uncertainty. Point forecasts generated from quantile regression are compared with the prediction intervals of linear regression. In addition, to justify the result, their out-of-sample forecasting performance is compared. Distinctively from previous studies on load forecasting, models are evaluated based on their ability to avoid under-prediction, i.e., the risk of power blackouts. The analysis shows that quantile regression tends to under-predict less than linear regression, and is thus more appropriate for avoiding power blackouts.
    Keywords: Electricity peak demand, Quantile regression, Prediction intervals, Blackout
    JEL: Q47 C53 L94
    Date: 2016–09
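The comparison the abstract describes can be sketched with a pinball-loss quantile regression against an OLS point forecast on synthetic load data. Everything here is illustrative: the covariates, coefficients, and the tau = 0.95 quantile level are assumptions, not the paper's actual specification.

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, X, y, tau):
    """Quantile-regression (pinball) loss at quantile level tau."""
    u = y - X @ beta
    return np.sum(np.maximum(tau * u, (tau - 1) * u))

# Synthetic daily peak-load data (purely illustrative)
rng = np.random.default_rng(0)
n = 500
temp = rng.uniform(20, 38, n)                      # daily max temperature
lag_peak = 50 + 1.5 * temp + rng.normal(0, 5, n)   # yesterday's peak load
peak = 10 + 0.9 * lag_peak + 0.8 * temp + rng.normal(0, 4, n)
X = np.column_stack([np.ones(n), lag_peak, temp])

# OLS mean forecast vs. an upper-quantile (tau = 0.95) forecast
beta_ols = np.linalg.lstsq(X, peak, rcond=None)[0]
beta_q95 = minimize(pinball_loss, beta_ols, args=(X, peak, 0.95),
                    method="Nelder-Mead",
                    options={"maxiter": 5000, "xatol": 1e-6, "fatol": 1e-6}).x

x_new = np.array([1.0, 110.0, 36.0])  # tomorrow: lagged peak 110, temp 36
print(x_new @ beta_ols, x_new @ beta_q95)
```

The 95th-quantile forecast sits above the OLS forecast, which is exactly the margin a system operator would want when under-prediction risks a blackout.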
  3. By: António Rua
    Abstract: In an increasingly data-rich environment, factor models have become the workhorse approach for modelling and forecasting purposes. However, factors are non-observable and have to be estimated. In particular, the space spanned by the unknown factors is typically estimated via principal components. Herein, a novel procedure is proposed to estimate the factor space by resorting to a wavelet-based multiscale principal component analysis. Through a Monte Carlo simulation study, it is shown that such an approach improves both factor model estimation and forecasting performance. In the empirical application, we illustrate its usefulness for forecasting GDP growth and inflation in the United States.
    JEL: C22 C40 C53
    Date: 2016
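A minimal numpy sketch of the scale-by-scale idea, using a simple Haar-style moving-average cascade as a stand-in for the paper's wavelet multiresolution analysis. The data, the number of scales, and the one-factor-per-scale choice are all assumptions for illustration.

```python
import numpy as np

def haar_multiscale(x, levels=2):
    """Split a series into detail components plus a final smooth via a
    Haar-style moving-average cascade (a crude stand-in for a wavelet MRA)."""
    components = []
    smooth = x.astype(float)
    for j in range(1, levels + 1):
        w = 2 ** j
        new_smooth = np.convolve(smooth, np.ones(w) / w, mode="same")
        components.append(smooth - new_smooth)   # detail at scale j
        smooth = new_smooth
    components.append(smooth)                    # final smooth component
    return components                            # levels details + 1 smooth

def pca_factors(panel, k=1):
    """First k principal components of a (T x N) panel, via SVD."""
    z = panel - panel.mean(0)
    u, s, vt = np.linalg.svd(z, full_matrices=False)
    return u[:, :k] * s[:k]

# Synthetic panel with one persistent common factor
rng = np.random.default_rng(1)
T, N = 200, 20
common = np.cumsum(rng.normal(size=T))
panel = np.outer(common, rng.uniform(0.5, 1.5, N)) + rng.normal(size=(T, N))

# Decompose every series, then run PCA separately at each scale
scales = [np.column_stack([haar_multiscale(panel[:, i], 2)[j]
                           for i in range(N)]) for j in range(3)]
factors = np.column_stack([pca_factors(s, 1) for s in scales])
print(factors.shape)  # one estimated factor per scale
```

The stacked `factors` matrix could then feed a forecasting regression, scale by scale, rather than forcing a single aggregate principal component.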
  4. By: Frankel, Jeffrey A. (Harvard University); Schreger, Jesse (Harvard University and Princeton University)
    Abstract: Government forecasts of GDP growth and budget balances are generally more over-optimistic than private sector forecasts. When official forecasts are especially optimistic relative to private forecasts ex ante, they are more likely also to be over-optimistic relative to realizations ex post. For example, euro area governments during the period 1999-2007 assiduously and inaccurately avoided forecasting deficit levels that would exceed the 3% Stability and Growth Pact threshold; meanwhile private sector forecasters were not subject to this crude bias. As a result, the budget-making process could probably be improved by using private-sector forecasts.
    JEL: E63
    Date: 2016–05
  5. By: Güneş Kamber; James Morley; Benjamin Wong
    Abstract: The Beveridge-Nelson (BN) trend-cycle decomposition based on autoregressive forecasting models of U.S. quarterly real GDP growth produces estimates of the output gap that are strongly at odds with widely-held beliefs about the amplitude, persistence, and even sign of transitory movements in economic activity. These antithetical attributes are related to the autoregressive coefficient estimates implying a very high signal-to-noise ratio in terms of the variance of trend shocks as a fraction of the overall quarterly forecast error variance. When we impose a lower signal-to-noise ratio, the resulting BN decomposition, which we label the "BN filter", produces a more intuitive estimate of the output gap that is large in amplitude, highly persistent, and typically positive in expansions and negative in recessions. Real-time estimates from the BN filter are also reliable in the sense that they are subject to smaller revisions and predict future output growth and inflation better than for other methods of trend-cycle decomposition that also impose a low signal-to-noise ratio, including deterministic detrending, the Hodrick-Prescott filter, and the bandpass filter.
    Keywords: Beveridge-Nelson decomposition, output gap, signal-to-noise ratio
    Date: 2016–09
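The mechanics can be sketched with the textbook AR(1) case of the Beveridge-Nelson decomposition, where imposing a high persistence parameter plays the role of the paper's low signal-to-noise restriction. The data, the AR(1) simplification, and the imposed value of 0.95 are assumptions; the paper works with higher-order AR models.

```python
import numpy as np

def bn_cycle_ar1(dy, phi=None):
    """BN cycle implied by an AR(1) model of growth rates:
    cycle_t = -phi/(1-phi) * (dy_t - mean).  If phi is None, estimate by OLS."""
    x = dy - dy.mean()
    if phi is None:
        phi = (x[1:] @ x[:-1]) / (x[:-1] @ x[:-1])
    return -phi / (1.0 - phi) * x, phi

# Simulate AR(1) growth rates (mean 0.8, persistence 0.4), illustrative only
rng = np.random.default_rng(2)
T = 300
dy = np.empty(T)
dy[0] = 0.8
for t in range(1, T):
    dy[t] = 0.8 * (1 - 0.4) + 0.4 * dy[t - 1] + rng.normal()

cycle_hat, phi_hat = bn_cycle_ar1(dy)       # unrestricted BN decomposition
cycle_lo, _ = bn_cycle_ar1(dy, phi=0.95)    # impose high persistence
print(round(phi_hat, 2), cycle_lo.std() / cycle_hat.std())
```

Imposing the restriction blows up the cycle's amplitude relative to the unrestricted estimate, which mirrors the paper's point: a lower signal-to-noise ratio delivers a larger, more persistent output-gap estimate.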
  6. By: Vasilios Plakandaras (Department of Economics, Democritus University of Thrace, Greece); Rangan Gupta (Department of Economics, University of Pretoria, South Africa); Mark E. Wohar (College of Business Administration, University of Nebraska at Omaha USA, and School of Business and Economics, Loughborough University)
    Abstract: The decision of the United Kingdom to leave the European Union (Brexit) after 43 years caused turmoil in exchange rate and global stock markets. More specifically, the pound lost close to 15 percent of its value relative to the dollar in the weeks after the Brexit decision. In this paper we examine whether this sudden depreciation of the pound-dollar exchange rate is the reaction of market participants to Brexit or whether the exit of the UK from the EU had little impact on the exchange rate. In doing so, we train linear and nonlinear econometric and machine learning models and evaluate out-of-sample forecasts of the exchange rate and its realized volatility in the pre- and post-Brexit period. We quantify the uncertainty caused by Brexit using an index based on news related to economic uncertainty. We argue that at the daily forecasting horizon our models adhere closely to the evolution of the exchange rate and that most of the depreciation reflects the uncertainty caused by Brexit.
    Keywords: Brexit, Economic Uncertainty, Machine Learning
    JEL: F31 F37
    Date: 2016–09
  7. By: Brenda López Cabrera; Franziska Schulz
    Abstract: The increasing exposure to renewable energy has amplified the need for risk management in electricity markets. Electricity price risk poses a major challenge to market participants. We propose an approach to model and forecast electricity prices taking into account information on renewable energy production. While most literature focuses on point forecasting, our methodology forecasts the whole distribution of electricity prices and incorporates spike risk, which is of great value for risk management. It is based on functional principal component analysis and time-adaptive nonparametric density estimation techniques. The methodology is applied to electricity market data from Germany. We find that renewable infeed affects both the location and the shape of spot price densities. A comparison with benchmark methods and an application to risk management are provided.
    JEL: C1 Q41 Q47
    Date: 2016–08
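The core ingredients can be sketched in a few lines: functional PCA on daily price curves reduces to ordinary PCA on the day-by-hour matrix, and a kernel density of the factor scores stands in (very crudely) for the paper's time-adaptive density estimator. The simulated curves and the two-factor choice are assumptions for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Simulate daily 24-hour price curves: a common shape plus level and tilt shocks
rng = np.random.default_rng(4)
days, hours = 200, 24
grid = np.linspace(0, 2 * np.pi, hours)
base = 35 + 10 * np.sin(grid)                  # average daily price profile
level = rng.normal(0, 5, days)                 # day-specific level shift
tilt = rng.normal(0, 2, days)                  # morning/evening tilt
curves = base + level[:, None] + tilt[:, None] * np.cos(grid)

# Functional PCA = ordinary PCA on the (days x hours) matrix of curves
z = curves - curves.mean(0)
u, s, vt = np.linalg.svd(z, full_matrices=False)
scores = u[:, :2] * s[:2]                      # day-by-day factor scores

# Nonparametric density of the first score: one ingredient of a density forecast
kde = gaussian_kde(scores[:, 0])
print(scores.shape, kde(np.array([0.0]))[0])
```

Forecasting the score densities (rather than a single point) is what delivers a full predictive distribution for the next day's price curve.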
  8. By: Luis Aguiar; Joel Waldfogel
    Abstract: We explore the consequence of quality unpredictability for the welfare benefit of new products, using recent developments in recorded music as our context. Digitization has expanded consumption opportunities by giving consumers access to the “long tail” of existing products, rather than simply the popular products that a retailer might stock with limited shelf space. While this is clearly beneficial to consumers, the benefits are somewhat limited: given the substitutability among differentiated products, the incremental benefit of obscure products - even lots of them - can be small. But digitization has also reduced the cost of bringing new products to market, giving rise to a different sort of long tail, in production. If the appeal of new products is unpredictable at the time of investment, as is the case for cultural products as well as many others, then creating new products can have substantial welfare benefits. Technological change in the recorded music industry tripled the number of new products between 2000 and 2008. We quantify the effects of new music on welfare using a simple illustrative, but explicitly structural, model of demand and entry with potentially unpredictable product quality. Based on a range of plausible forecasting models of expected appeal, a tripling of the choice set according to expected quality adds substantially more to consumer surplus and overall welfare than the usual long-tail benefits from a tripling of the choice set according to realized quality, perhaps by more than an order of magnitude.
    JEL: L15 L81
    Date: 2016–09
  9. By: António R. Antunes; Diana Bonfim; Nuno Monteiro; Paulo M.M. Rodrigues
    Abstract: Banking crises are rare events, but when they occur their consequences are often dramatic. The aim of this paper is to contribute to the toolkit of early warning models available to policy makers by exploring the dynamics and non-linearities embedded in a panel dataset covering several countries over four decades (from 1970Q1 to 2010Q4). The in-sample and out-of-sample forecast performance of several dynamic probit models is evaluated, with the objective of developing a common vulnerability indicator with early warning properties. The results obtained show that adding dynamic components and exuberance indicators to the models substantially improves the ability to forecast banking crises.
    JEL: C12 C22
    Date: 2016
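The dynamic-probit idea can be sketched as a pooled probit with a lagged crisis indicator, fitted by maximum likelihood. The panel below is simulated, and the single credit-gap regressor, the coefficients, and the pooling (no country effects) are all simplifying assumptions relative to the paper's models.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def probit_negll(beta, X, y):
    """Negative log-likelihood of a probit model."""
    p = np.clip(norm.cdf(X @ beta), 1e-12, 1 - 1e-12)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Simulate a 30-country, 40-quarter panel of crisis indicators
rng = np.random.default_rng(3)
rows = []
for c in range(30):
    crisis = 0
    credit_gap = rng.normal(0, 1, 40)          # credit-gap proxy
    for t in range(1, 40):
        # latent risk: credit booms raise crisis odds; crises are persistent
        idx = -1.5 + 0.8 * credit_gap[t - 1] + 1.5 * crisis
        new_crisis = int(rng.normal() < idx)
        rows.append((new_crisis, crisis, credit_gap[t - 1]))
        crisis = new_crisis

data = np.array(rows, dtype=float)
y = data[:, 0]
X = np.column_stack([np.ones(len(data)), data[:, 1], data[:, 2]])  # const, lag, gap
beta_hat = minimize(probit_negll, np.zeros(3), args=(X, y), method="BFGS").x
probs = norm.cdf(X @ beta_hat)                 # fitted crisis probabilities
print(beta_hat.round(2))
```

The fitted probabilities are the kind of common vulnerability indicator the paper evaluates in- and out-of-sample; the positive coefficient on the lagged crisis term is the "dynamic component" that improves forecasts.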
  10. By: Giovanni Angelini (Università di Bologna); Luca De Angelis (Università di Bologna)
    Abstract: We propose an innovative approach to model and predict the outcome of football matches based on the Poisson Autoregression with eXogenous covariates (PARX) model recently proposed by Agosto, Cavaliere, Kristensen and Rahbek (2016). We show that this methodology is particularly suited to model the goals distribution of a football team and provides a good forecast performance that can be exploited to develop a profitable betting strategy. The betting strategy is based on the idea that the odds proposed by the market do not reflect the true probability of the match, because they may also incorporate betting volumes or strategic price settings intended to exploit bettors' biases. The out-of-sample performance of the PARX model is better than the reference approach by Dixon and Coles (1997). We also evaluate our approach in a simple betting strategy applied to English Premier League data for the 2013/2014 and 2014/2015 seasons. The results show that the return from the betting strategy is larger than 35% in all the cases considered and may even exceed 100% under an alternative strategy based on a predetermined threshold that exploits the inefficiency of the betting market.
    Keywords: Sports forecasting, Density forecasts, Count data, Poisson autoregression, Betting market
    Date: 2016
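The two building blocks can be sketched directly: the PARX intensity recursion for a team's goals, and match-outcome probabilities from the implied goal distributions. The parameter values are hand-picked for illustration (not estimated), and the independent-Poisson outcome calculation omits the dependence adjustment used by Dixon and Coles.

```python
import numpy as np
from math import exp, factorial

def parx_intensity(goals, x, omega, alpha, beta, gamma):
    """PARX recursion: lam_t = omega + alpha*y_{t-1} + beta*lam_{t-1} + gamma*x_t."""
    lam = np.empty(len(goals))
    lam[0] = omega / (1 - alpha - beta)   # start near the unconditional level
    for t in range(1, len(goals)):
        lam[t] = omega + alpha * goals[t - 1] + beta * lam[t - 1] + gamma * x[t]
    return lam

def pois_pmf(k, lam):
    return exp(-lam) * lam ** k / factorial(k)

def match_probs(lam_home, lam_away, max_goals=10):
    """P(home win, draw, away win) assuming independent Poisson goal counts."""
    p = np.array([[pois_pmf(i, lam_home) * pois_pmf(j, lam_away)
                   for j in range(max_goals + 1)]
                  for i in range(max_goals + 1)])
    return np.tril(p, -1).sum(), np.trace(p), np.triu(p, 1).sum()

# Illustrative: update a team's intensity from recent goals, then price a match
goals = np.array([1, 2, 0, 3, 1])
x = np.zeros(5)   # exogenous covariate (e.g. opponent strength), zeroed here
lam = parx_intensity(goals, x, omega=0.4, alpha=0.3, beta=0.4, gamma=0.0)
home_win, draw, away_win = match_probs(lam[-1], 1.1)
print(round(home_win, 3), round(draw, 3), round(away_win, 3))
```

Comparing these model probabilities with the probabilities implied by bookmaker odds is what drives the betting strategy: bet only when the model's probability exceeds the market's by some threshold.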

This nep-for issue is ©2016 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.