nep-for New Economics Papers
on Forecasting
Issue of 2010‒10‒02
ten papers chosen by
Rob J Hyndman
Monash University

  1. On the forecasting accuracy of multivariate GARCH models By LAURENT, Sébastien; ROMBOUTS, Jeroen V. K.; VIOLANTE, Francesco
  2. Consensus Forecasts and Inefficient Information Aggregation By Christopher W. Crowe
  3. Nowcasting the Global Economy By James Rossiter
  4. Forecasting Monetary Policy Rules in South Africa By Ivan Paya; Ruthira Naraidoo
  5. Forecasting the Path of U.S. CO2 Emissions Using State-Level Information By Maximillian Auffhammer; Ralf Steinhauser
  6. Forecasting Key Macroeconomic Variables of the South African Economy: A Small Open Economy New Keynesian DSGE-VAR Model By Rangan Gupta; Rudi Steinbach
  7. Model Selection and Testing of Conditional and Stochastic Volatility Models By Massimiliano Caporin; Michael McAleer
  8. Efficient Evaluation of Multidimensional Time-Varying Density Forecasts with an Application to Risk Management By Evarist Stoja; Arnold Polanski
  9. Time series analysis for financial market meltdowns By Young Shin Kim; Rachev, Svetlozar T.; Bianchi, Michele Leonardo; Mitov, Ivan; Fabozzi, Frank J.
  10. Aggregation of exponential smoothing processes with an application to portfolio risk evaluation By SBRANA, Giacomo; SILVESTRINI, Andrea

  1. By: LAURENT, Sébastien (Maastricht University, The Netherlands; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium); ROMBOUTS, Jeroen V. K. (HEC Montréal, CIRANO, CIRPEE; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium); VIOLANTE, Francesco (Université de Namur, CeReFim, B-5000 Namur, Belgium; Université catholique de Louvain, CORE, B-1348 Louvain-la-Neuve, Belgium)
    Abstract: This paper addresses the question of the selection of multivariate GARCH models in terms of variance matrix forecasting accuracy with a particular focus on relatively large-scale problems. We consider 10 assets from NYSE and NASDAQ and compare 125 model-based one-step-ahead conditional variance forecasts over a period of 10 years using the model confidence set (MCS) and the Superior Predictive Ability (SPA) tests. Model performances are evaluated using four statistical loss functions which account for different types and degrees of asymmetry with respect to over/under predictions. When considering the full sample, MCS results are strongly driven by short periods of high market instability during which multivariate GARCH models appear to be inaccurate. Over relatively unstable periods, i.e. the dot-com bubble, the set of superior models is composed of more sophisticated specifications such as orthogonal and dynamic conditional correlation (DCC), both with leverage effect in the conditional variances. However, unlike the DCC models, our results show that the orthogonal specifications tend to underestimate the conditional variance. Over calm periods, a simple assumption like constant conditional correlation and symmetry in the conditional variances cannot be rejected. Finally, during the 2007-2008 financial crisis, accounting for non-stationarity in the conditional variance process generates superior forecasts. The SPA test suggests that, independently from the period, the best models do not provide significantly better forecasts than the DCC model of Engle (2002) with leverage in the conditional variances of the returns.
    Keywords: variance matrix, forecasting, multivariate GARCH, loss function, model confidence set, superior predictive ability
    JEL: C10 C32 C51 C52 C53 G10
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010025&r=for
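To illustrate the kind of asymmetric loss functions used in the abstract above to rank variance forecasts, the sketch below compares a symmetric squared-error loss with the QLIKE loss, a standard asymmetric choice in this literature. The paper does not disclose its exact four losses; the function names here are illustrative.

```python
import math

def qlike(forecast, realized):
    """QLIKE loss: penalises under-prediction of the variance
    more heavily than over-prediction of the same magnitude."""
    r = realized / forecast
    return r - math.log(r) - 1.0

def mse(forecast, realized):
    """Symmetric squared-error loss on the variance itself."""
    return (realized - forecast) ** 2

def average_loss(forecasts, realizations, loss):
    """Mean loss over the evaluation sample; under a given loss,
    the model with the smallest average would be preferred."""
    return sum(loss(f, r) for f, r in zip(forecasts, realizations)) / len(forecasts)
```

For example, with realized variance 1.0, a forecast of 0.5 (under-prediction) incurs a larger QLIKE loss than a forecast of 2.0 (over-prediction), while MSE treats both errors identically.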
  2. By: Christopher W. Crowe
    Abstract: Consensus forecasts are inefficient, over-weighting older information already in the public domain at the expense of new private information, when individual forecasters have different information sets. Using a cross-country panel of growth forecasts and new methodological insights, this paper finds that: consensus forecasts are inefficient as predicted; this is not due to individual forecaster irrationality; forecasters appear unaware of this inefficiency; and a simple adjustment reduces forecast errors by 5 percent. Similar results are found using US nominal GDP forecasts. The paper also discusses the result’s implications for users of forecaster surveys and for the literature on information aggregation.
    Keywords: Cross country analysis, Economic forecasting, Economic growth, Gross domestic product
    Date: 2010–07–30
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:10/178&r=for
  3. By: James Rossiter
    Abstract: Forecasts of global economic activity and inflation are important inputs when conducting monetary policy in small open economies such as Canada. As part of the Bank of Canada's broad agenda to augment its short-term forecasting tools, the author constructs simple mixed-frequency forecasting equations for quarterly global output, imports, and inflation using the monthly global Purchasing Managers Index (PMI). When compared against two benchmark models, the results show that the PMIs are useful for forecasting developments in the global economy. As the forecasts are updated throughout the quarter with the monthly release of the PMI, forecasting performance generally improves. An analysis of the forecasts over the period of the Great Recession (in particular, 2008Q4 to 2009Q2) shows that, while models that include the "soft" PMI indicators did not fully capture the sharp deterioration in the global economy, they nevertheless improved the forecasts relative to the benchmark models. This finding highlights the usefulness of such indicators for short-term forecasting.
    Keywords: Economic models; International topics
    JEL: E37 F47
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:bca:bocadp:10-12&r=for
  4. By: Ivan Paya; Ruthira Naraidoo
    Abstract: This paper is the first one to: (i) provide in-sample estimates of linear and nonlinear Taylor rules augmented with an indicator of financial stability for the case of South Africa; (ii) analyse the ability of linear and nonlinear monetary policy rule specifications, as well as nonparametric and semiparametric models, in forecasting the nominal interest rate setting that describes the South African Reserve Bank (SARB) policy decisions. Our results indicate, first, that asset prices are taken into account when setting interest rates; second, the existence of nonlinearities in the monetary policy rule; and third, that forecasts constructed from combinations of all models perform particularly well and that there are gains from semiparametric models in forecasting the interest rates as the forecasting horizon lengthens.
    Keywords: Taylor rules, nonlinearity, nonparametric, semiparametric, forecasting
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:lan:wpaper:006841&r=for
  5. By: Maximillian Auffhammer; Ralf Steinhauser
    Abstract: In this paper we compare the most common reduced form models used for emissions forecasting, point out shortcomings and suggest improvements. Using a U.S. state level panel data set of CO2 emissions we test the performance of existing models against a large universe of potential reduced form models. Our preferred measure of model performance is the squared out-of-sample prediction error of aggregate CO2 emissions. We find that leading models in the literature, as well as models selected based on an emissions per capita loss measure or different in-sample selection criteria, perform significantly worse compared to the best model chosen based directly on the out-of-sample loss measure defined over aggregate emissions. Unlike the existing literature, the tests of model superiority employed here account for model search or ‘data snooping’ involved in identifying a preferred model. Forecasts from our best performing model for the United States are 100 million tons of carbon lower than existing scenarios predict.
    JEL: Q43 C53
    Date: 2010–08
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2010-526&r=for
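The selection criterion described in the abstract above, choosing the model that minimises out-of-sample squared prediction error, can be sketched as follows. This is a generic illustration with hypothetical names, and it omits the paper's correction for data snooping across the model search.

```python
def out_of_sample_sse(predict, train, test):
    """Sum of squared one-step-ahead prediction errors on a holdout
    sample. `predict` maps the history seen so far to a forecast."""
    history = list(train)
    sse = 0.0
    for actual in test:
        forecast = predict(history)
        sse += (actual - forecast) ** 2
        history.append(actual)  # expand the information set each step
    return sse

def select_model(models, train, test):
    """Pick the model (by name) with the smallest out-of-sample loss."""
    return min(models, key=lambda name: out_of_sample_sse(models[name], train, test))
```

On a trending series such as train = [1, 2, 3], test = [4, 5, 6], a last-value predictor beats a historical-mean predictor under this criterion, which is the kind of horse race the out-of-sample loss adjudicates.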
  6. By: Rangan Gupta (Department of Economics, University of Pretoria); Rudi Steinbach (South African Reserve Bank, Pretoria)
    Abstract: The paper develops a Small Open Economy New Keynesian DSGE-VAR (SOENKDSGE-VAR) model of the South African economy, characterised by incomplete pass-through of exchange rate changes, external habit formation, partial indexation of domestic prices and wages to past inflation, and staggered price and wage setting. The model is estimated using Bayesian techniques on data for South Africa and the United States (US) from the period 1990Q1 to 2003Q2, and then used to forecast output growth, inflation and a measure of the nominal short-term interest rate for one- to eight-quarters-ahead over an out-of-sample horizon of 2003Q3 to 2008Q4. The forecast performance of the SOENKDSGE-VAR model is then compared with an independently estimated DSGE model, the classical VAR and BVAR models, with the latter being estimated based on six alternative priors, namely, Non-Informative and Informative Natural Conjugate priors, the Minnesota prior, Independent Normal-Wishart prior, Stochastic Search Variable Selection (SSVS) prior on VAR coefficients and SSVS prior on both VAR coefficients and error covariance. Overall, we can draw the following conclusions: First, barring the BVAR model based on the SSVS prior on both VAR coefficients and the error covariance, the SOENKDSGE-VAR model is found to perform competitively, if not better than, all the other VAR models for most of the one- to eight-quarters-ahead forecasts. Second, there is no significant gain in forecasting performance by moving to a DSGE-VAR framework when compared to an independently estimated SOENKDSGE model. Finally, there is overwhelming evidence that the BVAR model based on the SSVS prior on both VAR coefficients and the error covariance is the best-suited model in forecasting the three variables of interest.
    Keywords: Bayesian Methods; Macroeconomic Forecasting; New Keynesian DSGE; Small Open Economy; Vector Autoregressions
    JEL: C11 C53 E37
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:pre:wpaper:201019&r=for
  7. By: Massimiliano Caporin (Department of Economics and Management "Marco Fanno", University of Padova); Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, and Institute of Economic Research, Kyoto University)
    Abstract: This paper focuses on the selection and comparison of alternative non-nested volatility models. We review the traditional in-sample methods commonly applied in the volatility framework, namely diagnostic checking procedures, information criteria, and conditions for the existence of moments and asymptotic theory, as well as the out-of-sample model selection approaches, such as mean squared error and Model Confidence Set approaches. The paper develops some innovative loss functions which are based on Value-at-Risk forecasts. Finally, we present an empirical application based on simple univariate volatility models, namely GARCH, GJR, EGARCH, and Stochastic Volatility that are widely used to capture asymmetry and leverage.
    Keywords: Volatility model selection, volatility model comparison, non-nested models, model confidence set, Value-at-Risk forecasts, asymmetry, leverage
    JEL: C11 C22 C52
    Date: 2010–09
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:724&r=for
  8. By: Evarist Stoja; Arnold Polanski
    Abstract: We propose two simple evaluation methods for time-varying density forecasts of continuous higher-dimensional random variables. Both methods are based on the probability integral transformation for unidimensional forecasts. The first method tests multinormal densities and relies on the rotation of the coordinate system. The advantage of the second method is not only its applicability to any continuous distribution but also the evaluation of forecast accuracy in specific regions of its domain, as defined by the user’s interest. We show that the latter property is particularly useful for evaluating a multidimensional generalization of the Value at Risk. In simulations and in an empirical study, we examine the performance of both tests.
    Keywords: Multivariate Density Forecast Evaluation, Probability Integral Transformation, Multidimensional Value at Risk, Monte Carlo Simulations
    JEL: C52 C53
    Date: 2009–12
    URL: http://d.repec.org/n?u=RePEc:bri:uobdis:09/617&r=for
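The probability integral transformation underlying both methods in the abstract above is simple to state in the unidimensional case: if the forecast CDF F is correct, the transformed observations u_t = F(y_t) are i.i.d. Uniform(0,1). A minimal sketch of this building block (the function names are illustrative, not the paper's):

```python
import math
import random

def norm_cdf(x, mu=0.0, sigma=1.0):
    """Normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def pit(observations, cdf):
    """Probability integral transform: u_t = F_t(y_t).
    Under a correct density forecast, the u_t are i.i.d. Uniform(0,1)."""
    return [cdf(y) for y in observations]

# Simulate data whose true density matches the forecast density N(0,1):
random.seed(0)
sample = [random.gauss(0.0, 1.0) for _ in range(10_000)]
u = pit(sample, norm_cdf)
mean_u = sum(u) / len(u)  # should be close to 0.5 for uniform PITs
```

Formal tests would then check the uniformity and independence of the u_t; the paper's contribution is extending this idea to higher-dimensional forecasts, e.g. via rotation of the coordinate system.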
  9. By: Young Shin Kim; Rachev, Svetlozar T.; Bianchi, Michele Leonardo; Mitov, Ivan; Fabozzi, Frank J.
    Abstract: There appears to be a consensus that the recent instability in global financial markets may be attributable in part to the failure of financial modeling. More specifically, current risk models have failed to properly assess the risks associated with large adverse stock price behavior. In this paper, we first discuss the limitations of classical time series models for forecasting financial market meltdowns. Then we set forth a framework capable of forecasting both extreme events and highly volatile markets. Based on the empirical evidence presented in this paper, our framework offers an improvement over prevailing models for evaluating stock market risk exposure during distressed market periods.
    Keywords: ARMA-GARCH model, α-stable distribution, tempered stable distribution, value-at-risk (VaR), average value-at-risk (AVaR)
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:zbw:kitwps:2&r=for
  10. By: SBRANA, Giacomo (Université de Strasbourg, BETA, F-67085 Strasbourg, France); SILVESTRINI, Andrea (Bank of Italy, Economics, Research and International Relations Area, Economic and Financial Statistics Department, I-00184 Roma, Italy)
    Abstract: In this paper we propose a unified framework to analyse contemporaneous and temporal aggregation of exponential smoothing (EWMA) models. Focusing on a vector IMA(1,1) model, we obtain a closed form representation for the parameters of the contemporaneously and temporally aggregated process as a function of the parameters of the original one. In the framework of EWMA estimates of volatility, we present an application dealing with Value-at-Risk (VaR) prediction at different sampling frequencies for an equally weighted portfolio composed of multiple indices. We apply the aggregation results by inferring the decay factor in the portfolio volatility equation from the estimated vector IMA(1,1) model of squared returns. Empirical results show that VaR predictions delivered using this suggested approach are at least as accurate as those obtained by applying the standard univariate RiskMetrics™ methodology.
    Keywords: contemporaneous and temporal aggregation, EWMA, volatility, Value-at-Risk
    JEL: C10 C32 C43
    Date: 2010–07–01
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2010039&r=for
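The univariate RiskMetrics benchmark mentioned in the abstract above is an EWMA recursion on squared returns with a fixed decay factor (0.94 for daily data), from which a VaR forecast follows under conditional normality. A minimal sketch of that benchmark, not of the paper's aggregation result:

```python
import math

def ewma_variance(returns, lam=0.94):
    """RiskMetrics-style EWMA variance recursion:
        sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2
    with lam = 0.94, the standard daily decay factor. The recursion is
    initialised at the first squared return for simplicity."""
    sigma2 = returns[0] ** 2
    path = [sigma2]
    for r in returns[:-1]:
        sigma2 = lam * sigma2 + (1.0 - lam) * r * r
        path.append(sigma2)
    return path

def var_forecast(sigma2, z=1.645):
    """One-step Value-at-Risk under conditional normality;
    z = 1.645 is roughly the one-sided 95% normal quantile."""
    return z * math.sqrt(sigma2)
```

The paper's point is that, for a portfolio, the decay factor need not be imposed: it can be inferred from a vector IMA(1,1) model of the component squared returns via the aggregation mapping, with the EWMA above as the comparison benchmark.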

This nep-for issue is ©2010 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.