nep-for New Economics Papers
on Forecasting
Issue of 2014‒11‒07
nine papers chosen by
Rob J Hyndman
Monash University

  1. Evaluating Conditional Forecasts from Vector Autoregressions By Clark, Todd E.; McCracken, Michael W.
  2. Real-Time Factor Model Forecasting and the Effects of Instability By Michael P. Clements
  3. Dynamic prediction pools: an investigation of financial frictions and forecasting performance By Del Negro, Marco; Hasegawa, Raiden B.; Schorfheide, Frank
  4. Log versus Level in VAR Forecasting: 42 Million Empirical Answers - Expect the Unexpected By Johannes Mayr; Dirk Ulbricht
  5. A Comparative Analysis of the Accuracy and Uncertainty in Real Estate and Macroeconomic Forecasts By Dimitrios Papastamos; George Matysiak; Simon Stevenson
  6. Bayesian Tail Risk Forecasting using Realised GARCH By Contino, Christian; Gerlach, Richard H.
  7. Random switching exponential smoothing and inventory forecasting By Giacomo Sbrana; Andrea Silvestrini
  8. Forecast Models for Private Consumption By Peussa, Aleksandr
  9. Forecasting Natural Population Change: the Case of Latvia By Aleksejs Melihovs

  1. By: Clark, Todd E. (Federal Reserve Bank of Cleveland); McCracken, Michael W. (Federal Reserve Bank of St. Louis)
    Abstract: Many forecasts are conditional in nature. For example, a number of central banks routinely report forecasts conditional on particular paths of policy instruments. Even though conditional forecasting is common, there has been little work on methods for evaluating conditional forecasts. This paper provides analytical, Monte Carlo, and empirical evidence on tests of predictive ability for conditional forecasts from estimated models. In the empirical analysis, we consider forecasts of growth, unemployment, and inflation from a VAR, conditional on paths of the short-term interest rate. Throughout the analysis, we focus on tests of bias, efficiency, and equal accuracy applied to conditional forecasts from VAR models.
    Keywords: Prediction; forecasting; out-of-sample
    JEL: C12 C32 C52 C53
    Date: 2014–10–02
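As a rough illustration of what a conditional VAR forecast is (not the paper's evaluation tests), the sketch below estimates a bivariate VAR(1) on simulated data and produces forecasts with the second variable, standing in for a short-term policy rate, pinned to a chosen path. This naive plug-in conditioning ignores the structural treatment of shocks; all names and values here are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a stable bivariate VAR(1): y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.2],
              [0.1, 0.7]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(scale=0.1, size=2)

# OLS estimate of A: regress y_t on y_{t-1}
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

def forecast(y_last, h, rate_path=None):
    """Iterate the estimated VAR h steps ahead.

    If rate_path is given, variable 1 (the illustrative 'policy
    rate') is overwritten with the conditioning path each step."""
    path = []
    cur = y_last.copy()
    for i in range(h):
        cur = A_hat @ cur
        if rate_path is not None:
            cur[1] = rate_path[i]
        path.append(cur.copy())
    return np.array(path)

uncond = forecast(y[-1], 4)                            # unconditional
cond = forecast(y[-1], 4, rate_path=[0.0, 0.0, 0.0, 0.0])  # rate held at zero
```

The paper's contribution is the testing framework (bias, efficiency, equal accuracy) applied to such conditional paths, which this sketch does not attempt.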
  2. By: Michael P. Clements (ICMA Centre, Henley Business School, University of Reading)
    Abstract: We show that factor forecasting models deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
    Keywords: Factor Models, Robust Approaches, Financial Crisis
    JEL: C51 C22
    Date: 2014–05
  3. By: Del Negro, Marco (Federal Reserve Bank of New York); Hasegawa, Raiden B.; Schorfheide, Frank
    Abstract: We provide a novel methodology for estimating time-varying weights in linear prediction pools, which we call dynamic pools, and use it to investigate the relative forecasting performance of dynamic stochastic general equilibrium (DSGE) models, with and without financial frictions, for output growth and inflation in the period 1992 to 2011. We find strong evidence of time variation in the pool’s weights, reflecting the fact that the DSGE model with financial frictions produces superior forecasts in periods of financial distress but doesn’t perform as well in tranquil periods. The dynamic pool’s weights react in a timely fashion to changes in the environment, leading to real-time forecast improvements relative to other methods of density forecast combination, such as Bayesian model averaging, optimal (static) pools, and equal weights. We show how a policymaker dealing with model uncertainty could have used a dynamic pool to perform a counterfactual exercise (responding to the gap in labor market conditions) in the immediate aftermath of the Lehman crisis.
    Keywords: Bayesian estimation; DSGE models; financial frictions; forecasting; Great Recession; linear prediction pools
    JEL: C53 E31 E32 E37
    Date: 2014–10–01
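A minimal sketch of time-varying pool weights, assuming weights proportional to exponentially discounted predictive likelihoods; this simple recursion is a stand-in for illustration only, not the paper's latent-process dynamic pool:

```python
import numpy as np

def dynamic_weights(dens1, dens2, discount=0.9):
    """Time-varying pool weight on model 1, updated each period
    from discounted log predictive scores of the two models."""
    weights = []
    s1 = s2 = 0.0  # discounted cumulative log predictive scores
    for p1, p2 in zip(dens1, dens2):
        s1 = discount * s1 + np.log(p1)
        s2 = discount * s2 + np.log(p2)
        w = np.exp(s1) / (np.exp(s1) + np.exp(s2))
        weights.append(w)
    return np.array(weights)

# Illustrative predictive densities: model 1 fits poorly in the
# first half of the sample and well in the second half.
dens1 = [0.2] * 10 + [0.6] * 10
dens2 = [0.5] * 10 + [0.3] * 10
w = dynamic_weights(dens1, dens2)
```

Because past scores are discounted, the weight moves toward the recently better model, mimicking the timely reaction the abstract describes for the dynamic pool.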
  4. By: Johannes Mayr; Dirk Ulbricht
    Abstract: The use of log-transformed data has become standard in macroeconomic forecasting with VAR models. However, its appropriateness in the context of out-of-sample forecasts has not yet been exposed to a thorough empirical investigation. With the aim of filling this void, a broad sample of VAR models is employed in a multi-country set-up and approximately 42 million pseudo-out-of-sample forecasts of GDP are evaluated. The results show that, on average, the knee-jerk log transformation of the data is at best harmless.
    Keywords: VAR-forecasting, Logarithmic transformation
    JEL: C52 C53
    Date: 2014
  5. By: Dimitrios Papastamos (Eurobank EFG Property Services S.A); George Matysiak (Master Management Group and Krakow University of Economics); Simon Stevenson (School of Real Estate & Planning, Henley Business School, University of Reading)
    Abstract: We compare and contrast the accuracy and uncertainty in forecasts of rents with those for a variety of macroeconomic series. The results show that, in general, forecasters tend to be marginally more accurate with macroeconomic series than with rents. In common across all of the series, forecasts tend to be smoothed, with forecasters under-estimating performance during economic booms and over-estimating it in recessions. We find that property forecasts are affected by economic uncertainty, as measured by disagreement across the macro-forecasters. Increased uncertainty leads to increased dispersion in the rental forecasts and a reduction in forecast accuracy.
    Date: 2014–05
  6. By: Contino, Christian; Gerlach, Richard H.
    Abstract: A Realised Volatility GARCH model is developed within a Bayesian framework for the purpose of forecasting Value at Risk and Conditional Value at Risk. Student-t and Skewed Student-t return distributions are combined with Gaussian and Student-t distributions in the measurement equation in a GARCH framework to forecast tail risk in eight international equity index markets over a four year period. Three Realised Volatility proxies are considered within this framework. Realised Volatility GARCH models show a marked improvement compared to ordinary GARCH for both Value at Risk and Conditional Value at Risk forecasting. This improvement is consistent across a variety of data, volatility model specifications and distributions, and demonstrates that Realised Volatility is superior when producing volatility forecasts. Realised Volatility models implementing a Skewed Student-t distribution for returns in the GARCH equation are favoured.
    Keywords: Risk Management; Expected Shortfall; High-Frequency Data; CVaR; Value-at-Risk; GARCH; Realised Volatility
    Date: 2014–10–10
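As a hedged illustration of the tail-risk measures being forecast (not the authors' Realised GARCH model itself), the sketch below computes one-step Value at Risk and Expected Shortfall (CVaR) for a zero-mean Student-t return given a volatility forecast; the sigma, alpha and degrees-of-freedom values are illustrative inputs, and in the paper sigma would come from the Realised Volatility GARCH recursion.

```python
import numpy as np
from scipy import stats

def var_es_student_t(sigma, alpha=0.01, nu=8):
    """One-step VaR and ES for a zero-mean Student-t return with
    forecast volatility sigma, at lower-tail probability alpha."""
    # Rescale the standard t quantile so the return has variance sigma^2
    scale = np.sqrt((nu - 2) / nu)
    q = stats.t.ppf(alpha, df=nu)
    var = sigma * q * scale
    # Closed-form ES for the standard Student-t, then rescaled
    es_std = -stats.t.pdf(q, df=nu) * (nu + q**2) / ((nu - 1) * alpha)
    es = sigma * es_std * scale
    return var, es

var_99, es_99 = var_es_student_t(sigma=0.02)  # 1% tail, nu=8
```

ES is always more extreme (more negative) than VaR at the same level, which is why the paper forecasts both.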
  7. By: Giacomo Sbrana (NEOMA Business School); Andrea Silvestrini (Bank of Italy, Economic Research Department)
    Abstract: Exponential smoothing models are an important prediction tool in macroeconomics, finance and business. This paper presents the analytical forecasting properties of the random coefficient exponential smoothing model in the multiple source of error framework. The random coefficient state-space representation allows for switching between simple exponential smoothing and the local linear trend. Therefore it is possible to control, in a flexible manner, the randomly changing dynamic behaviour of the time series. The paper establishes the algebraic mapping between the state-space parameters and the implied reduced form ARIMA parameters. In addition, it shows that parametric mapping surmounts the difficulties that are likely to emerge in a direct estimation of the random coefficient state-space model. Finally, it presents an empirical application comparing the forecast accuracy of the suggested model vis-à-vis other benchmark models, both in the ARIMA and in the Exponential Smoothing class. Using time series on wholesalers’ inventories in the USA, the out-of-sample results show that the reduced form of the random coefficient exponential smoothing model tends to be superior to its competitors.
    Keywords: exponential smoothing, ARIMA, inventory, forecasting.
    Date: 2014–07
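For readers unfamiliar with the two regimes the model switches between, here is a minimal sketch of the simple exponential smoothing and local linear trend (Holt) recursions; the paper's random coefficient model switches stochastically between these two, which is not reproduced here, and the smoothing constants are illustrative.

```python
def ses_forecast(y, alpha=0.3):
    """Simple exponential smoothing: l_t = alpha*y_t + (1-alpha)*l_{t-1}.
    The h-step forecast is flat at the last smoothed level."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

def holt_forecast(y, h, alpha=0.3, beta=0.1):
    """Local linear trend (Holt) smoothing: a level plus a slope.
    The h-step forecast extrapolates the trend linearly."""
    level, trend = y[0], y[1] - y[0]
    for obs in y[1:]:
        prev = level
        level = alpha * obs + (1 - alpha) * (level + trend)
        trend = beta * (level - prev) + (1 - beta) * trend
    return level + h * trend
```

On a flat series the two coincide; on a trending series the local linear trend keeps extrapolating while simple exponential smoothing forecasts a constant, which is why switching between them matters for intermittent inventory dynamics.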
  8. By: Peussa, Aleksandr
    Abstract: The share of private consumption in gross domestic product is significant; therefore, private consumption has a great influence on economic growth, which makes it a major concept in economics. The purpose of the paper is to estimate and evaluate different forecasting models for private consumption. The first part of the paper focuses on aggregate consumption, with models estimated using yearly and quarterly data. The goal of the second part is to estimate and evaluate forecasting models for the components of private consumption, which can be divided by the duration principle or by product categories. Three competing statistical models for the components are presented in the second part, and the aim is to choose the best model using statistical criteria of model evaluation (R-squared, AIC, BIC).
    Keywords: Aggregate consumption, private consumption, economic forecasts, logistic regression
    JEL: C43 C52 C53 E21 E27 C82
    Date: 2014–10–14
  9. By: Aleksejs Melihovs
    Abstract: The paper is devoted to the natural population change forecast in Latvia for the time horizon until 2030. The motivation for this paper is twofold. First, population ageing is an obvious problem for the whole EU, with a tendency to worsen in the future; for EU11 countries, including Latvia, the situation is even more challenging. Second, historical population data have been revised based on the results of the last population census, which took place in Latvia in 2011; this correction provides a clearer picture of future tendencies in demographic indicators. The approach developed in 2007 by Hyndman and Ullah, which combines functional data analysis and principal components decomposition, is used for the natural population change forecasting. Although the applied approach is a technical one, it is useful for understanding what policy makers could face in 15–20 years under a no-policy-change and no-population-habits-change scenario. Understanding this could make it easier for policy makers to take the right decisions with a long-run perspective, helping the population and the economy to be well prepared for the problems associated with population ageing that will accumulate in the future. The model is used to forecast mortality rate schedules separately for males and females as well as fertility rate schedules. The main findings of the paper are the following. The total period fertility rate is forecasted to increase to about 1.6 by 2030. Life expectancy at birth is projected to increase for males and females by 4 and 3.4 years respectively. Nevertheless, the natural population decrease over 19 years will reach 200 thousand, including a decrease of about 190 thousand in the population aged 20–64, while the old-age dependency ratio will increase to 36.5%.
    Keywords: functional approach, fertility rates, mortality rates, population forecasting
    JEL: J11 C53 C14 C32 O11 O52
    Date: 2014–10–06
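A rough sketch in the spirit of the Hyndman–Ullah approach (a mean age profile plus principal components of log-mortality, with the component scores forecast by a time-series model); the synthetic data, the single component, and the random-walk-with-drift score forecast are all simplifying assumptions made here for illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic log-mortality surface (years x ages): a declining time
# index loading on an age pattern, Lee-Carter-style, plus noise.
years, ages = 30, 10
age_pattern = np.linspace(1.0, 0.2, ages)
kt = -0.05 * np.arange(years)            # declining mortality index
log_m = -3.0 + np.outer(kt, age_pattern) \
        + rng.normal(scale=0.01, size=(years, ages))

# Step 1: centre by the mean age profile, as in the functional approach
mean_profile = log_m.mean(axis=0)
resid = log_m - mean_profile

# Step 2: first principal component of the residual surface via SVD
U, S, Vt = np.linalg.svd(resid, full_matrices=False)
scores = U[:, 0] * S[0]   # time series of component scores
basis = Vt[0]             # age loading function

# Step 3: forecast the scores with a random walk with drift and
# rebuild the log-mortality schedules for the forecast horizon
drift = np.diff(scores).mean()
h = 5
future_scores = scores[-1] + drift * np.arange(1, h + 1)
forecast_log_m = mean_profile + np.outer(future_scores, basis)
```

The paper applies this machinery separately to male and female mortality schedules and to fertility schedules, then aggregates the results into the natural population change.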

This nep-for issue is ©2014 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.