nep-for New Economics Papers
on Forecasting
Issue of 2011‒11‒14
twelve papers chosen by
Rob J Hyndman
Monash University

  1. Forecasting GDP growth in times of crisis: private sector forecasts versus statistical models By Jasper de Winter
  2. Evaluating density forecasts: model combination strategies versus the RBNZ By Chris McDonald; Leif Anders Thorsrud
  3. An Application of the Disequilibrium Adjustment Framework to Small Area Forecasting and Impact Analysis By Geoffrey Hewings; Jae Hong Kim
  4. SAFE: An early warning system for systemic banking risk By Mikhail V. Oet; Ryan Eiben; Timothy Bianco; Dieter Gramlich; Stephen J. Ong; Jing Wang
  5. Nowcasting GDP in real-time: A density combination approach By Knut Are Aastveit; Karsten R. Gerdrup; Anne Sofie Jore; Leif Anders Thorsrud
  6. Estimating and Forecasting with a Dynamic Spatial Panel Data Model By Badi H. Baltagi; Bernard Fingleton; Alain Pirotte
  7. Prediction intervals in conditionally heteroscedastic time series with stochastic components By Espasa, Antoni; Pellegrini, Santiago; Ruiz, Esther
  8. The Rise and Fall of S&P500 Variance Futures By Chia-Lin Chang; Juan-Ángel Jiménez-Martín; Michael McAleer; Teodosio Pérez-Amaral
  9. Capital flows and Japanese asset volatility By Christopher J. Neely; Brett W. Fawley
  10. Bayesian Estimation of Generalized Hyperbolic Skewed Student GARCH Models By Philippe J. Deschamps
  11. Privileged information exacerbates market volatility By Gabriel Desgranges; Stéphane Gauthier
  12. Predicting failure in the commercial banking industry By Tatom, John

  1. By: Jasper de Winter
    Abstract: This paper examines the accuracy of short-run forecasts of Dutch GDP growth by several linear statistical models and private sector analysts. We focus on the financial crisis of 2008-2009 and the dot-com recession of 2001-2002. The dynamic factor model turns out to be the best model. Its forecast accuracy during the crisis deteriorates much less than that of the other linear models, and hardly at all when backcasting and nowcasting. Moreover, the dynamic factor model beats the private sector forecasters at nowcasting. This finding suggests that adding judgement to a mechanical model may not improve short-term forecasting performance.
    Keywords: Nowcasting; Professional Forecasters; Factor Model; Forecasting
    JEL: E52 C53 C33
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:320&r=for
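    The dynamic-factor approach compared in this paper can be illustrated in miniature: extract a common factor from many monthly indicators by principal components, then bridge it to quarterly GDP growth. A sketch in Python with simulated data standing in for the Dutch indicator panel (illustrative only; the paper estimates a genuine dynamic factor model):

      import numpy as np

      rng = np.random.default_rng(0)
      T, N = 120, 30                          # months, monthly indicators
      f = np.zeros(T)                         # common factor, AR(1)
      for t in range(1, T):
          f[t] = 0.8 * f[t - 1] + rng.normal()
      X = np.outer(f, rng.uniform(0.5, 1.5, N)) + rng.normal(size=(T, N))

      # step 1: principal-components estimate of the common factor
      Z = (X - X.mean(0)) / X.std(0)
      _, _, Vt = np.linalg.svd(Z, full_matrices=False)
      fhat = Z @ Vt[0]                        # first PC, up to sign/scale

      # step 2: bridge equation at quarterly frequency (3-month averages),
      # fit on earlier quarters, then nowcast the held-out final quarter
      fq_hat = fhat.reshape(-1, 3).mean(1)
      gdp = 0.5 * f.reshape(-1, 3).mean(1) + 0.3 * rng.normal(size=T // 3)
      A = np.column_stack([np.ones(gdp.size - 1), fq_hat[:-1]])
      b, *_ = np.linalg.lstsq(A, gdp[:-1], rcond=None)
      print(f"nowcast {b[0] + b[1] * fq_hat[-1]:+.2f} vs realised {gdp[-1]:+.2f}")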
  2. By: Chris McDonald; Leif Anders Thorsrud (Reserve Bank of New Zealand)
    Abstract: Forecasting the future path of the economy is essential for good monetary policy decisions. The recent financial crisis has highlighted the importance of tail events, and that assessing the central projection is not enough. The whole range of outcomes should be forecasted, evaluated and accounted for when making monetary policy decisions. As such, we construct density forecasts using the historical performance of the Reserve Bank of New Zealand's (RBNZ) published point forecasts. We compare these implied RBNZ densities to similarly constructed densities from a suite of empirical models. In particular, we compare the implied RBNZ densities to combinations of density forecasts from the models. Our results reveal that the combined densities are comparable in performance and sometimes better than the implied RBNZ densities across many different horizons and variables. We also find that the combination strategies typically perform better than relying on the best model in real-time, that is, the selection strategy.
    JEL: C52 C53 E52
    Date: 2011–08
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2011/03&r=for
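    The combination and evaluation machinery behind this comparison can be sketched compactly. A minimal Python illustration, assuming Gaussian individual densities and a linear opinion pool whose weights are built from past log scores (not the exact recursion used in the paper):

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)
      y = rng.normal(0.0, 1.0, 60)                   # realised outcomes
      # two models' one-step density forecasts (mean, sd) per period
      means = np.stack([0.3 * np.ones(60), np.zeros(60)])
      sds = np.stack([np.ones(60), 1.3 * np.ones(60)])

      logscore = norm.logpdf(y, means, sds)          # shape (2, 60)

      # weight model i at time t by exp(cumulative log score up to t-1)
      cum = np.cumsum(logscore, axis=1)
      w = np.full_like(logscore, 0.5)
      w[:, 1:] = np.exp(cum[:, :-1] - cum[:, :-1].max(0))
      w[:, 1:] /= w[:, 1:].sum(0)

      # linear opinion pool: the combined density is a weighted mixture
      pool = (w * norm.pdf(y, means, sds)).sum(0)
      for name, ls in [("model 1", logscore[0].mean()),
                       ("model 2", logscore[1].mean()),
                       ("pool   ", np.log(pool).mean())]:
          print(f"mean log score, {name}: {ls:.3f}")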
  3. By: Geoffrey Hewings; Jae Hong Kim
    Abstract: The disequilibrium adjustment framework, pioneered by Carlino & Mills (1987) and further extended by Boarnet (1994a), has been widely adopted in regional and intra-regional studies for 1) determining whether jobs follow people, people follow jobs, or both; 2) examining the determinants of growth or location decisions; and 3) investigating spread versus backwash effects. Beyond these traditional uses, this chapter proposes using the model for small area population and employment forecasting and impact analysis. An application using data for the Chicago metropolitan area reveals that the framework, which captures spatial population-employment interaction and adjustment processes, can be a powerful small area forecasting and impact analysis tool when combined with a regional economic forecasting method. In particular, the spatial econometric specification of the model facilitates the integration of horizontal (across spatial units) as well as vertical (over the hierarchy; macro and sub-regional) dimensions into the analysis of change. This study also discusses some theoretical issues and methodological challenges in this type of application.
    Keywords: Small-area forecasting, Spatial adjustment, Econometric input-output model
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:wiw:wiwrsa:ersa11p1839&r=for
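    For readers new to the framework, its core is a pair of partial-adjustment equations in which population and employment each chase equilibrium levels that depend on the other; a stylized version in LaTeX (notation mine, not the chapter's):

      \Delta P_t = \lambda_P \,(P_t^{*} - P_{t-1}), \qquad
      \Delta E_t = \lambda_E \,(E_t^{*} - E_{t-1}), \qquad
      P_t^{*} = f(E_t, X_t^{P}), \quad E_t^{*} = g(P_t, X_t^{E}),

    where 0 < \lambda_P, \lambda_E \le 1 are adjustment speeds. "Jobs follow people" corresponds to population entering g, "people follow jobs" to employment entering f, and the spatial (Boarnet-style) extension replaces own-area values with spatially lagged ones.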
  4. By: Mikhail V. Oet; Ryan Eiben; Timothy Bianco; Dieter Gramlich; Stephen J. Ong; Jing Wang
    Abstract: This paper builds on existing microprudential and macroprudential early warning systems (EWSs) to develop a new, hybrid class of models for systemic risk, incorporating the structural characteristics of the financial system and a feedback amplification mechanism. The models explain financial stress using both public and proprietary supervisory data from systemically important institutions, regressing institutional imbalances using an optimal lag method. The Systemic Assessment of Financial Environment (SAFE) EWS monitors microprudential information from the largest bank holding companies to anticipate the buildup of macroeconomic stresses in the financial markets. To mitigate inherent uncertainty, SAFE develops a set of medium-term forecasting specifications that gives policymakers enough time to take ex-ante policy action and a set of short-term forecasting specifications for verification and adjustment of supervisory actions. This paper highlights the application of these models to stress testing, scenario analysis, and policy.
    Keywords: Systemic risk ; Liquidity (Economics)
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwp:1129&r=for
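    The paper's "optimal lag" device, picking for each imbalance series the lag that best explains financial stress, can be sketched as an information-criterion search. A toy Python version with simulated series (the SAFE specifications themselves are far richer):

      import numpy as np

      rng = np.random.default_rng(2)
      T, true_lag = 200, 3
      imb = rng.normal(size=T)                    # institutional imbalance
      stress = np.r_[np.zeros(true_lag),
                     0.8 * imb[:-true_lag]] + 0.5 * rng.normal(size=T)

      def aic_for_lag(k, max_lag=8):
          """AIC of the OLS fit stress_t = a + b * imb_{t-k} + e_t."""
          y = stress[max_lag:]
          x = imb[max_lag - k:T - k]
          X = np.column_stack([np.ones_like(x), x])
          b, *_ = np.linalg.lstsq(X, y, rcond=None)
          resid = y - X @ b
          n = y.size
          return n * np.log(resid @ resid / n) + 2 * X.shape[1]

      best = min(range(1, 9), key=aic_for_lag)
      print("AIC-optimal lag:", best)             # should recover the true lag, 3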
  5. By: Knut Are Aastveit (Norges Bank (Central Bank of Norway)); Karsten R. Gerdrup (Norges Bank (Central Bank of Norway)); Anne Sofie Jore (Norges Bank (Central Bank of Norway)); Leif Anders Thorsrud (BI Norwegian Business School and Norges Bank (Central Bank of Norway))
    Abstract: In this paper we use U.S. real-time vintage data and produce combined density nowcasts for quarterly GDP growth from a system of three commonly used model classes. The density nowcasts are combined in two steps. First, a wide selection of individual models within each model class is combined separately. Then, the nowcasts from the three model classes are combined into a single predictive density. We update the density nowcast for every new data release throughout the quarter, and highlight the importance of new information for the evaluation period 1990Q2-2010Q3. Our results show that the logarithmic score of the predictive densities for U.S. GDP increases almost monotonically as new information arrives during the quarter. While the best-performing model class changes during the quarter, the density nowcasts from our combination framework always perform well, both in terms of logarithmic scores and calibration tests. The density combination approach is superior to a simple model selection strategy and also performs better in terms of point forecast evaluation than standard point forecast combinations.
    Keywords: Density combination; Forecast densities; Forecast evaluation; Monetary policy; Nowcasting; Real-time data
    JEL: C32 C52 E37 E52
    Date: 2011–09–28
    URL: http://d.repec.org/n?u=RePEc:bno:worpap:2011_11&r=for
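    The calibration tests mentioned here are typically based on probability integral transforms (PITs): a well-calibrated density forecast produces uniform PITs. A minimal check in Python, assuming Gaussian forecast densities (illustrative, not the paper's test battery):

      import numpy as np
      from scipy.stats import norm, kstest

      rng = np.random.default_rng(3)
      y = rng.normal(0.0, 1.0, 300)      # realised outcomes (simulated)

      # PIT: evaluate each forecast CDF at the realisation
      pit_good = norm.cdf(y, loc=0.0, scale=1.0)   # calibrated density
      pit_wide = norm.cdf(y, loc=0.0, scale=2.0)   # over-dispersed density

      # uniform PITs indicate calibration; the KS test should not reject
      print(f"calibrated : p = {kstest(pit_good, 'uniform').pvalue:.3f}")
      print(f"over-wide  : p = {kstest(pit_wide, 'uniform').pvalue:.3g}")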
  6. By: Badi H. Baltagi; Bernard Fingleton; Alain Pirotte
    Abstract: This paper focuses on the estimation and predictive performance of several estimators for the dynamic and autoregressive spatial lag panel data model with spatially correlated disturbances. In the spirit of Arellano and Bond (1991) and Mutl (2006), a dynamic spatial GMM estimator is proposed based on Kapoor, Kelejian and Prucha (2007) for the Spatial AutoRegressive (SAR) error model. The main idea is to mix non-spatial and spatial instruments to obtain consistent estimates of the parameters. Then, a linear predictor of this spatial dynamic model is derived. Using Monte Carlo simulations, we compare the performance of the GMM spatial estimator to that of spatial and non-spatial estimators and illustrate our approach with an application to new economic geography.
    Keywords: Panel data, spatial lag, error components, linear predictor, GMM, spatial autocorrelation
    JEL: C33
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:cep:sercdp:0095&r=for
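    The spatial building block here is the lag Wy formed from a row-normalized weight matrix W; in the SAR reduced form, y = (I - rho*W)^(-1)(X*beta + eps). A small numpy sketch (the paper's GMM estimator, which mixes spatial and non-spatial instruments, is not reproduced):

      import numpy as np

      rng = np.random.default_rng(4)
      n = 6
      # symmetric 0/1 neighbour matrix with an empty diagonal
      W = (rng.uniform(size=(n, n)) < 0.4).astype(float)
      W = np.triu(W, 1)
      W = W + W.T
      W = W / np.maximum(W.sum(1, keepdims=True), 1.0)  # row-normalize

      rho, beta = 0.5, 1.0
      x = rng.normal(size=n)
      eps = 0.1 * rng.normal(size=n)

      # SAR reduced form: y = (I - rho W)^(-1) (x beta + eps)
      y = np.linalg.solve(np.eye(n) - rho * W, beta * x + eps)
      print("spatial lag Wy:", (W @ y).round(2))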
  7. By: Espasa, Antoni; Pellegrini, Santiago; Ruiz, Esther
    Abstract: Differencing is a very popular stationary transformation for series with stochastic trends. Moreover, when the differenced series is heteroscedastic, authors commonly model it using an ARMA-GARCH model. The corresponding ARIMA-GARCH model is then used to forecast future values of the original series. However, the heteroscedasticity observed in the stationary transformation can be generated by the transitory and/or the long-run component of the original data. In the former case, the shocks to the variance are transitory and the prediction intervals should converge to homoscedastic intervals as the prediction horizon increases. We show that, in this case, the prediction intervals constructed from the ARIMA-GARCH models can be inadequate because they never converge to homoscedastic intervals. All of the results are illustrated using simulated and real time series with stochastic levels.
    Keywords: ARIMA-GARCH models; Local level model; Nonlinear time series; State space models; Unobserved component models
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ner:carlos:info:hdl:10016/12257&r=for
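    The convergence point turns on the GARCH(1,1) multi-step variance forecast, which mean-reverts geometrically to the unconditional variance omega/(1 - alpha - beta). A quick numerical illustration of the recursion (simplified; the paper's unobserved-components analysis goes well beyond this):

      import numpy as np

      omega, alpha, beta = 0.1, 0.10, 0.85
      sigma2_uncond = omega / (1 - alpha - beta)   # = 2.0
      sigma2 = 5.0                                 # current (high) variance

      widths = []
      for h in range(1, 21):
          # h-step forecast: sigma2_{t+h|t} = omega + (alpha+beta) sigma2_{t+h-1|t}
          sigma2 = omega + (alpha + beta) * sigma2
          widths.append(2 * 1.96 * np.sqrt(sigma2))  # 95% interval width

      print("width at h=1 :", round(widths[0], 3))
      print("width at h=20:", round(widths[-1], 3))
      print("homoscedastic:", round(2 * 1.96 * np.sqrt(sigma2_uncond), 3))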
  8. By: Chia-Lin Chang (Department of Applied Economics Department of Finance National Chung Hsing University Taichung, Taiwan); Juan-Ángel Jiménez-Martín (Department of Quantitative Economics Complutense University of Madrid); Michael McAleer (Erasmus University Rotterdam, Tinbergen Institute, The Netherlands, Complutense University of Madrid, and Institute of Economic Research, Kyoto University); Teodosio Pérez-Amaral (Department of Quantitative Economics Complutense University of Madrid)
    Abstract: Modelling, monitoring and forecasting volatility are indispensable to sensible portfolio risk management. The volatility of an asset or composite index can be traded by using volatility derivatives, such as volatility and variance swaps, options and futures. The most popular volatility index is VIX, which is a key measure of market expectations of volatility, and hence also an important barometer of investor sentiment and market volatility. Investors interpret the VIX cash index as a “fear” index, and VIX options and VIX futures as derivatives of the “fear” index. VIX is based on S&P500 call and put options over a wide range of strike prices, and hence is not model based. Speculators can trade on volatility risk with VIX derivatives, with views on whether volatility will increase or decrease in the future, while hedgers can use volatility derivatives to avoid exposure to volatility risk. VIX and its options and futures derivatives have been widely analysed in recent years. An alternative volatility derivative to VIX is the S&P500 variance futures, which is an expectation of the variance of the S&P500 cash index. Variance futures are futures contracts written on realized variance, or standardized variance swaps. The S&P500 variance futures are not model based, so the assumptions underlying the index do not seem to have been clearly understood. As variance futures are typically thinly traded, their returns and volatility are not easy to model accurately using a variety of model specifications. This paper analyses the volatility in S&P500 3-month variance futures before, during and after the GFC, as well as for the full data period, using each of three alternative conditional volatility models and three densities, in order to determine whether exposure to risk can be incorporated into a financial portfolio without taking positions on the S&P500 index itself.
    Keywords: Risk management, financial derivatives, futures, options, swaps, 3-month variance futures, 12-month variance futures, risk exposure, volatility.
    JEL: C22 G32
    Date: 2011–11
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:795&r=for
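    An exercise of this shape, one volatility model under several error densities, can be set up with the Python arch package (the package choice is this sketch's assumption; the paper's models, densities and data differ):

      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(5)
      returns = rng.standard_t(df=6, size=1000)   # toy fat-tailed returns

      # fit GARCH(1,1) under three error densities and compare fit
      for dist in ("normal", "t", "skewt"):
          res = arch_model(returns, vol="Garch", p=1, q=1, dist=dist).fit(disp="off")
          print(f"{dist:>6}: loglik = {res.loglikelihood:8.1f}  BIC = {res.bic:8.1f}")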
  9. By: Christopher J. Neely; Brett W. Fawley
    Abstract: Characterizing asset price volatility is an important goal for financial economists. The literature has shown that variables that proxy for the information arrival process can help explain and/or forecast volatility. Unfortunately, however, obtaining good measures of volume and/or order flow is expensive or difficult in decentralized markets such as foreign exchange. We investigate the extent to which Japanese capital flows—which are released weekly—reflect information arrival that improves foreign exchange and equity volatility forecasts. We find that capital flows can help explain transitory shocks to GARCH volatility. Transactions by Japanese residents in foreign bond markets have the most explanatory power among capital flows, and that power is much greater in the second subsample.
    Keywords: Capital movements ; Foreign exchange ; Japan
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2011-034&r=for
  10. By: Philippe J. Deschamps (Department of Quantitative Economics)
    Abstract: Efficient posterior simulators for two GARCH models with generalized hyperbolic disturbances are presented. The first model, GHt-GARCH, is a threshold GARCH with a skewed and heavy-tailed error distribution; in this model, the latent variables that account for skewness and heavy tails are identically and independently distributed. The second model, ODLV-GARCH, is formulated in terms of observation-driven latent variables; it automatically incorporates a risk premium effect. Both models nest the ordinary threshold t-GARCH as a limiting case. The GHt-GARCH and ODLV-GARCH models are compared with each other and with the threshold t-GARCH using five publicly available asset return data sets, by means of Bayes factors, information criteria, and classical forecast evaluation tools. The GHt-GARCH and ODLV-GARCH models both strongly dominate the threshold t-GARCH, and the Bayes factors generally favor GHt-GARCH over ODLV-GARCH. A Markov switching extension of GHt-GARCH is also presented. This extension is found to be an empirical improvement over the single-regime model for one of the five data sets.
    Keywords: Autoregressive conditional heteroskedasticity; Markov chain Monte Carlo; bridge sampling; heavy-tailed skewed distributions; generalized hyperbolic distribution; generalized inverse Gaussian distribution
    JEL: C11 C16 C53
    Date: 2011–10–28
    URL: http://d.repec.org/n?u=RePEc:fri:dqewps:wp0016&r=for
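    The model comparisons rest on Bayes factors, ratios of marginal likelihoods that the bridge sampler estimates; in standard notation (a textbook definition, not specific to this paper):

      BF_{12} \;=\; \frac{p(y \mid M_1)}{p(y \mid M_2)}, \qquad
      p(y \mid M_i) \;=\; \int p(y \mid \theta_i, M_i)\, p(\theta_i \mid M_i)\, d\theta_i ,

    so BF_{12} > 1 favours M_1 (here, GHt-GARCH over ODLV-GARCH); on the usual Kass-Raftery scale, 2 \ln BF_{12} above 10 counts as very strong evidence.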
  11. By: Gabriel Desgranges (THEMA - Université de Cergy-Pontoise); Stéphane Gauthier (Centre d'Economie de la Sorbonne - Paris School of Economics)
    Abstract: We study how asymmetric information affects market volatility in a linear setup where the outcome is determined by forecasts about this same outcome. The unique rational expectations equilibrium will be stable when it is the only rationalizable solution. It has been established in the literature that stability is obtained when the sensitivity of the outcome to agents' forecasts is less than 1, provided that this sensitivity is common knowledge. Relaxing this common knowledge assumption, instability is obtained when the proportion of agents who a priori know the sensitivity is large, and the uninformed agents believe it is possible that the sensitivity is greater than 1.
    Keywords: Asymmetric information, common knowledge, eductive learning, rational expectations, rationalizability, volatility.
    JEL: C62 D82 D84
    Date: 2011–10
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:11061&r=for
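    The stability threshold has a one-line illustration: if the outcome is y = a E[y] + u, each round of eductive reasoning rescales the bound on rationalizable forecasts by |a|, which contracts only when |a| < 1. A toy Python check (notation mine, not the paper's):

      def belief_bound(a: float, b0: float = 1.0, rounds: int = 20) -> float:
          """Bound on rationalizable forecasts after iterating E_k = a * E_{k-1}."""
          b = b0
          for _ in range(rounds):
              b = abs(a) * b   # each round of reasoning rescales the bound
          return b

      print(belief_bound(0.8))   # ~0.012: beliefs converge, equilibrium stable
      print(belief_bound(1.2))   # ~38.3 : beliefs explode, instability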
  12. By: Tatom, John
    Abstract: The ability to predict bank failure has become much more important since the mortgage foreclosure crisis began in 2007. The model proposed in this study uses proxies for the regulatory standards embodied in the so-called CAMELS rating system, as well as several local or national economic variables, and is robust enough to forecast bank failure for the entire commercial banking industry in the United States. The model is able to predict failure (survival) accurately for commercial banks during both the Savings and Loan crisis and the mortgage foreclosure crisis. Other important results include the insignificance of several factors proposed in the literature, including total assets, the real price of energy, the currency ratio and the interest rate spread.
    Keywords: bank failure; banking crises; CAMELS ratings
    JEL: G18 G33 G21
    Date: 2011–08–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:34608&r=for
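    A failure model of this kind is essentially a binary-response regression of a failure indicator on CAMELS-style proxies. A minimal sketch with statsmodels and simulated ratios (hypothetical variables; the study's data and specification are richer):

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n = 2000
      capital = rng.normal(0.10, 0.03, n)    # capital/assets (C in CAMELS)
      npl = rng.normal(0.03, 0.02, n)        # nonperforming loans (A)
      roa = rng.normal(0.01, 0.01, n)        # return on assets (E)

      # failure risk falls with capital and earnings, rises with bad loans
      logit_p = 1 - 40 * capital + 60 * npl - 80 * roa
      fail = rng.uniform(size=n) < 1 / (1 + np.exp(-logit_p))

      X = sm.add_constant(np.column_stack([capital, npl, roa]))
      res = sm.Logit(fail.astype(float), X).fit(disp=0)
      print(res.params.round(2))             # should recover the signs above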

This nep-for issue is ©2011 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.