nep-for New Economics Papers
on Forecasting
Issue of 2014‒04‒11
eight papers chosen by
Rob J Hyndman
Monash University

  1. Robust Approaches to Forecasting By Jennifer Castle; David Hendry; Michael P. Clements
  2. A Likelihood Ratio and Markov Chain Based Method to Evaluate Density Forecasting By Li, Yushu; Andersson, Jonas
  3. Forecasting Co-Volatilities via Factor Models with Asymmetry and Long Memory in Realized Covariance By Manabu Asai; Michael McAleer
  4. Outperforming IMF Forecasts by the Use of Leading Indicators By Katja Drechsel; S. Giesen; Axel Lindner
  5. Information in the yield curve: A Macro-Finance approach By Hans Dewachter; Leonardo Iania; Marco Lyrio
  6. Assessing Point Forecast Accuracy by Stochastic Divergence from Zero By Francis X. Diebold; Minchul Shin
  7. Bagging Exponential Smoothing Methods using STL Decomposition and Box-Cox Transformation By Christoph Bergmeir; Rob J Hyndman; Jose M Benitez
  8. Forecasting the Volatility of the Dow Jones Islamic Stock Market Index: Long Memory vs. Regime Switching By Adnen Ben Nasr; Thomas Lux; Ahdi N. Ajmi; Rangan Gupta

  1. By: Jennifer Castle; David Hendry; Michael P. Clements
    Abstract: We investigate alternative robust approaches to forecasting, using a new class of robust devices, contrasted with equilibrium correction models. Their forecasting properties are derived in the face of a range of likely empirical problems at the forecast origin, including measurement errors, impulses, omitted variables, unanticipated location shifts and incorrectly included variables that experience a shift. We derive the resulting forecast biases and error variances, and indicate when the methods are likely to perform well. The robust methods are applied to forecasting US GDP using autoregressive models, and also to autoregressive models with factors extracted from a large dataset of macroeconomic variables. We consider forecasting performance over the Great Recession, and over an earlier, more quiescent period.
    Keywords: Robust forecasts, Smoothed forecasting devices, Factor models, GDP forecasts, Location shifts
    JEL: C51 C53
    Date: 2014–01–30
    URL: http://d.repec.org/n?u=RePEc:oxf:wpaper:697&r=for
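    As a rough illustration of the contrast drawn here, the following Python sketch (on simulated data; it is not the authors' estimator) compares an AR(1) forecast with a simple differencing-based robust device after a location shift:

      # A minimal sketch contrasting an AR(1) forecast with a simple "robust
      # device" that differences away a recent location shift; illustrative
      # only, not the estimators studied in the paper.
      import numpy as np

      rng = np.random.default_rng(0)

      # Simulate an AR(1) around a mean that shifts near the end of the sample.
      T, phi = 200, 0.7
      mu = np.where(np.arange(T) < 180, 0.0, 5.0)   # location shift at t = 180
      y = np.empty(T)
      y[0] = mu[0] + rng.normal()
      for t in range(1, T):
          y[t] = mu[t] + phi * (y[t - 1] - mu[t - 1]) + rng.normal()

      train, test = y[:190], y[190:]

      # AR(1) fitted by OLS on the full training sample: its one-step forecast
      # tends to be pulled back toward the pre-shift mean.
      slope, intercept = np.polyfit(train[:-1], train[1:], 1)
      ar_fc = slope * train[-1] + intercept

      # Robust device: forecast with the last observation plus recent average
      # growth, so the location shift largely drops out.
      robust_fc = train[-1] + np.mean(np.diff(train[-10:]))

      print("AR(1) forecast: ", round(ar_fc, 2))
      print("Robust forecast:", round(robust_fc, 2))
      print("Actual value:   ", round(test[0], 2))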
  2. By: Li, Yushu (Dept. of Business and Management Science, Norwegian School of Economics); Andersson, Jonas (Dept. of Business and Management Science, Norwegian School of Economics)
    Abstract: In this paper, we propose a likelihood ratio and Markov chain based method to evaluate density forecasts. The method jointly evaluates the unconditional forecast distribution and the dependence of the outcomes. It extends the widely applied evaluation method for interval forecasts proposed by Christoffersen (1998), and is more refined than the purely contingency-table-based approach of Wallis (2003). We show that the method has very high power against both incorrect forecast distributions and incorrect dependence. Moreover, the straightforwardness and ease of application of this joint test give it considerable potential for further applications in both finance and economics.
    Keywords: Likelihood ratio test; Markov Chain; Density forecasting
    JEL: C14 C53 C61
    Date: 2014–03–25
    URL: http://d.repec.org/n?u=RePEc:hhs:nhhfms:2014_012&r=for
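    The flavour of the test can be sketched in a few lines: bin the probability integral transforms (PITs) of the outcomes into k states and run a likelihood-ratio test of serial independence against a first-order Markov chain, in the spirit of Christoffersen (1998). The Python below illustrates the idea only; it is not the authors' exact statistic:

      # Hedged sketch: LR test of serial independence of binned PITs against a
      # first-order Markov chain alternative (a k-state generalization of the
      # Christoffersen-type independence test).
      import numpy as np
      from scipy import stats

      def lr_independence_test(pit, k=4):
          """LR test: independent binned PITs vs first-order Markov chain."""
          states = np.minimum((pit * k).astype(int), k - 1)  # bin PITs into k states
          counts = np.zeros((k, k))
          for a, b in zip(states[:-1], states[1:]):
              counts[a, b] += 1
          row = counts.sum(axis=1, keepdims=True)
          pi_markov = np.divide(counts, row, out=np.zeros_like(counts), where=row > 0)
          pi_indep = counts.sum(axis=0) / counts.sum()       # marginal frequencies
          with np.errstate(divide="ignore", invalid="ignore"):
              ll_markov = np.nansum(counts * np.log(pi_markov))
              ll_indep = np.nansum(counts * np.log(pi_indep)[None, :])
          lr = 2 * (ll_markov - ll_indep)
          df = (k - 1) ** 2                                  # k(k-1) minus (k-1) params
          return lr, stats.chi2.sf(lr, df)

      rng = np.random.default_rng(1)
      pit_iid = rng.uniform(size=500)        # well-calibrated, independent PITs
      print(lr_independence_test(pit_iid))   # large p-value expected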
  3. By: Manabu Asai (Faculty of Economics, Soka University); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands, Department of Quantitative Economics, Complutense University of Madrid, and Institute of Economic Research, Kyoto University.)
    Abstract: Modelling covariance structures is known to suffer from the curse of dimensionality. In order to avoid this problem for forecasting, the authors propose a new factor multivariate stochastic volatility (fMSV) model for realized covariance measures that accommodates asymmetry and long memory. Using the basic structure of the fMSV model, the authors extend the dynamic correlation MSV model, the conditional/stochastic Wishart autoregressive models, the matrix-exponential MSV model, and the Cholesky MSV model. Empirical results for seven US stock returns indicate that the new fMSV models outperform existing dynamic conditional correlation models for forecasting future covariances. Among the new fMSV models, the Cholesky MSV model with long memory and asymmetry shows stable and better forecasting performance for one-day, five-day and ten-day horizons in the periods before, during and after the global financial crisis.
    Keywords: Dimension reduction; Factor Model; Multivariate Stochastic Volatility; Leverage Effects; Long Memory; Realized Volatility.
    JEL: C32 C53 C58 G17
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1405&r=for
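    The Cholesky route, in particular, admits a compact sketch: forecast the elements of the Cholesky factor of each realized covariance matrix and reconstruct, which keeps the forecast positive definite by construction. In the toy below, plain AR(1) dynamics stand in for the paper's long-memory, asymmetric fMSV specification:

      # Illustrative sketch of the Cholesky idea: forecast Cholesky-factor
      # elements rather than the covariance itself, so the reconstructed
      # forecast is positive semi-definite by construction. AR(1)-by-OLS
      # dynamics are a placeholder, not the paper's fMSV model.
      import numpy as np

      def forecast_rcov(rcov_series):
          """rcov_series: (T, n, n) array of realized covariance matrices."""
          T, n, _ = rcov_series.shape
          chols = np.array([np.linalg.cholesky(S) for S in rcov_series])
          idx = np.tril_indices(n)
          X = chols[:, idx[0], idx[1]]          # (T, n(n+1)/2) Cholesky elements
          fc = np.empty(X.shape[1])
          for j in range(X.shape[1]):           # AR(1) per element, by OLS
              a, b = np.polyfit(X[:-1, j], X[1:, j], 1)
              fc[j] = a * X[-1, j] + b
          L = np.zeros((n, n))
          L[idx] = fc
          return L @ L.T                        # PSD by construction

      # Demo on fake realized covariances: PSD noise around a fixed 3x3 target.
      rng = np.random.default_rng(2)
      base = np.array([[1.0, 0.3, 0.2], [0.3, 1.0, 0.4], [0.2, 0.4, 1.0]])
      rcovs = []
      for _ in range(100):
          A = 0.1 * rng.normal(size=(3, 3))
          rcovs.append(base + A @ A.T)          # PSD perturbation keeps each PD
      print(forecast_rcov(np.array(rcovs)))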
  4. By: Katja Drechsel; S. Giesen; Axel Lindner
    Abstract: This study analyzes the performance of the IMF World Economic Outlook forecasts for world output and for the aggregates of both the advanced economies and the emerging and developing economies. Focusing on the forecasts for the current and the next year, we examine whether IMF forecasts can be improved by using leading indicators with monthly updates. Using a real-time dataset for GDP and for the indicators, we find that some simple single-indicator forecasts, based on data available at higher frequency, can significantly outperform the IMF forecasts once the published Outlook is a few months old.
    Keywords: IMF WEO forecasts, leading indicators, real-time data
    JEL: C52 C53 E02 E32 E37 O19
    Date: 2014–04
    URL: http://d.repec.org/n?u=RePEc:iwh:dispap:4-14&r=for
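    A single-indicator "bridge" forecast of the kind compared with the WEO can be sketched as a one-regressor OLS nowcast. Everything below is simulated; the indicator and lag structure are placeholders, not the paper's specification:

      # Hedged sketch of a single-indicator bridge forecast: regress annual
      # GDP growth on a monthly indicator signal and forecast the final year.
      # All data are simulated; the indicator is a made-up PMI-style series.
      import numpy as np

      rng = np.random.default_rng(3)
      years = 20
      indicator = rng.normal(size=years)                 # annualized indicator signal
      gdp_growth = 2.0 + 0.8 * indicator + 0.3 * rng.normal(size=years)

      # Estimate the bridge equation on the first 19 years by OLS.
      X = np.column_stack([np.ones(years - 1), indicator[:-1]])
      coef, *_ = np.linalg.lstsq(X, gdp_growth[:-1], rcond=None)

      # Forecast the final year from the freshly observed indicator value.
      nowcast = coef[0] + coef[1] * indicator[-1]
      print(f"indicator-based forecast: {nowcast:.2f}, actual: {gdp_growth[-1]:.2f}")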
  5. By: Hans Dewachter (National Bank of Belgium, Research Department; Center for Economic Studies, University of Leuven; CESifo); Leonardo Iania (National Bank of Belgium, Research Department; Louvain School of Management (UCL)); Marco Lyrio (Insper Institute of Education and Research)
    Abstract: We use a macro-finance model, incorporating macroeconomic and financial factors, to study the term premium in the U.S. bond market. Estimating the model using Bayesian techniques, we find that a single factor explains most of the variation in bond risk premiums. Furthermore, the model-implied risk premiums account for up to 40% of the variability of one- and two-year excess returns. Using the model to decompose yield spreads into an expectations and a term premium component, we find that, although this decomposition does not seem important to forecast economic activity, it is crucial to forecast inflation for most forecasting horizons.
    Keywords: Macro-finance model, Yield curve, Expectations hypothesis
    JEL: E43 E44 E47
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:nbb:reswpp:201403-254&r=for
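    The decomposition itself is simple arithmetic: an n-period yield equals the average expected short rate over those n periods plus a term premium. A toy numerical example with a made-up expected-rate path:

      # Toy illustration of the yield decomposition: term premium equals the
      # observed yield minus the average model-implied expected short rate.
      # The numbers below are invented for illustration.
      import numpy as np

      expected_short_rates = np.array([1.0, 1.2, 1.4, 1.5])  # E[r] path, in %
      two_year_yield = 1.6                                    # observed yield, in %
      n = 2
      expectations_component = expected_short_rates[:n].mean()
      term_premium = two_year_yield - expectations_component
      print(f"expectations: {expectations_component:.2f}%, "
            f"term premium: {term_premium:.2f}%")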
  6. By: Francis X. Diebold (Department of Economics, University of Pennsylvania); Minchul Shin (Department of Economics, University of Pennsylvania)
    Abstract: We propose and explore several related ways of reducing the reliance of point forecast accuracy evaluation on expected loss, E(L(e)), where e is the forecast error. Our central approach dispenses with the loss function entirely, instead using a "stochastic error divergence" (SED) accuracy measure based directly on the forecast-error c.d.f., F(e). We explore several variations on the basic theme; interestingly, all point to the primacy of absolute-error loss and its generalizations.
    Keywords: Forecast accuracy, forecast evaluation, absolute-error loss, quadratic loss, squared-error loss
    JEL: C53
    Date: 2014–03–20
    URL: http://d.repec.org/n?u=RePEc:pen:papers:14-011&r=for
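    The central idea admits a compact illustration: take the L1 divergence between the empirical forecast-error c.d.f. and the unit step at zero, the c.d.f. of a perfect forecast. Algebraically this divergence equals the mean absolute error, consistent with the primacy of absolute-error loss; the simulation below (an illustration, not the paper's full framework) checks the equality numerically:

      # Sketch of an SED-style accuracy measure: L1 distance between the
      # empirical error c.d.f. F(e) and the step function 1{e >= 0}. This
      # distance equals E|e|, so it should match the mean absolute error.
      import numpy as np

      rng = np.random.default_rng(4)
      errors = rng.normal(loc=0.2, scale=1.0, size=10_000)  # simulated errors

      grid = np.linspace(errors.min(), errors.max(), 100_000)
      F = np.searchsorted(np.sort(errors), grid, side="right") / errors.size
      step = (grid >= 0).astype(float)
      sed_l1 = np.abs(F - step).sum() * (grid[1] - grid[0])  # Riemann sum

      print(f"SED (L1): {sed_l1:.4f}  vs  MAE: {np.abs(errors).mean():.4f}")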
  7. By: Christoph Bergmeir; Rob J Hyndman; Jose M Benitez
    Abstract: Exponential smoothing is one of the most popular forecasting methods. We present a method for bootstrap aggregation (bagging) of exponential smoothing methods. The bagging uses a Box-Cox transformation followed by an STL decomposition to separate the time series into trend, seasonal part, and remainder. The remainder is then bootstrapped using a moving block bootstrap, and a new series is assembled using this bootstrapped remainder. On the bootstrapped series, an ensemble of exponential smoothing models is estimated, and the resulting point forecasts are averaged. We evaluate this new method on the M3 data set, showing that it consistently outperforms the original exponential smoothing models. On the monthly data, we achieve better results than any of the original M3 participants. We also perform statistical testing to assess the significance of the results. Using the MASE, our method is significantly better than all the M3 participants on the monthly data.
    Keywords: bagging, bootstrapping, exponential smoothing, STL decomposition.
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2014-11&r=for
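    The pipeline can be sketched with standard Python stand-ins (scipy for the Box-Cox transformation, statsmodels for STL and exponential smoothing); the block length, ensemble size and ETS specification below are illustrative choices, not the paper's:

      # Sketch of the bagged exponential smoothing pipeline: Box-Cox, STL,
      # moving block bootstrap of the remainder, ETS per bootstrap, average.
      # Tuning choices here are assumptions, not the paper's settings.
      import numpy as np
      import pandas as pd
      from scipy import special, stats
      from statsmodels.tsa.holtwinters import ExponentialSmoothing
      from statsmodels.tsa.seasonal import STL

      def moving_block_bootstrap(x, block_len, rng):
          """Resample a series by concatenating random contiguous blocks."""
          n = len(x)
          starts = rng.integers(0, n - block_len + 1, size=n // block_len + 2)
          return np.concatenate([x[s:s + block_len] for s in starts])[:n]

      def bagged_ets_forecast(y, period=12, n_boot=30, horizon=12, seed=0):
          rng = np.random.default_rng(seed)
          z, lam = stats.boxcox(y)                        # stabilise the variance
          parts = STL(pd.Series(z), period=period).fit()  # trend/seasonal/remainder
          forecasts = []
          for _ in range(n_boot):
              rem = moving_block_bootstrap(parts.resid.to_numpy(), 2 * period, rng)
              z_boot = parts.trend.to_numpy() + parts.seasonal.to_numpy() + rem
              y_boot = special.inv_boxcox(z_boot, lam)    # back-transform
              model = ExponentialSmoothing(y_boot, trend="add",
                                           seasonal="add", seasonal_periods=period)
              forecasts.append(model.fit().forecast(horizon))
          return np.mean(forecasts, axis=0)               # average point forecasts

      # Demo on a simulated monthly series with trend and seasonality.
      rng = np.random.default_rng(5)
      t = np.arange(120)
      y = 50 + 0.3 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, 120)
      print(bagged_ets_forecast(y, horizon=6))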
  8. By: Adnen Ben Nasr (Laboratoire BESTMOD, ISG de Tunis, Universite de Tunis, Tunisia); Thomas Lux (Department of Economics, University of Kiel, Germany and Banco de Espana Chair in Computational Economics, University Jaume I, Castellon, Spain); Ahdi N. Ajmi (College of Science and Humanities in Slayel, Salman bin Abdulaziz University, Kingdom of Saudi Arabia); Rangan Gupta (Department of Economics, University of Pretoria)
    Abstract: The financial crisis has fueled interest in alternatives to traditional asset classes that might be less affected by large market gyrations and, thus, provide for a less volatile development of a portfolio. One attempt at selecting stocks that are less prone to extreme risks is adherence to Islamic Sharia rules. In this light, we investigate the statistical properties of the Dow Jones Islamic Market (DJIM) index and explore its volatility dynamics using a number of up-to-date statistical models allowing for long memory and regime-switching dynamics. We find that the DJIM shares all stylized facts of traditional asset classes, and estimation results and forecasting performance for various volatility models are also in line with prevalent findings in the literature. Overall, the relatively new Markov-switching multifractal model performs best under the majority of time horizons and loss criteria. Long-memory GARCH-type models always improve upon the short-memory GARCH specification, and additionally allowing for regime changes can further improve their performance.
    Keywords: Islamic finance, volatility dynamics, long memory, multifractals
    JEL: G15 G17 G23
    Date: 2014–03
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:201412&r=for
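    In the spirit of the comparison, the sketch below fits a short-memory GARCH(1,1) and a long-memory FIGARCH with the third-party arch package on simulated returns (the Markov-switching multifractal model is not sketched):

      # Hedged sketch: compare volatility forecasts from a short-memory
      # GARCH(1,1) and a long-memory FIGARCH, fitted with the `arch` package
      # on simulated (not DJIM) returns.
      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(6)
      returns = rng.standard_t(df=5, size=2000)   # fake daily returns, fat tails

      for vol in ("GARCH", "FIGARCH"):
          res = arch_model(returns, vol=vol, p=1, q=1).fit(disp="off")
          fc = res.forecast(horizon=5)
          print(vol, "5-day variance path:",
                np.round(fc.variance.iloc[-1].values, 3))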

This nep-for issue is ©2014 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.