nep-for New Economics Papers
on Forecasting
Issue of 2013‒05‒22
ten papers chosen by
Rob J Hyndman
Monash University

  1. Forecast Evaluations for Multiple Time Series: A Generalized Theil Decomposition By Wolfgang Polasek
  2. Evaluating Predictive Densities of U.S. Output Growth and Inflation in a Large Macroeconomic Data Set By Barbara Rossi; Tatevik Sehkposyan
  3. Exchange Rate Predictability By Barbara Rossi
  4. Bayesian Forecasting with a Factor-Augmented Vector Autoregressive DSGE model By Stelios D. Bekiros; Alessia Paccagnini
  5. Conditional Predictive Density Evaluation in the Presence of Instabilities By Barbara Rossi; Tatevik Sehkposyan
  6. Inflation fan charts, monetary policy and skew normal distribution By Wojciech Charemza; Carlos Diaz Vela; Svetlana Makarova
  7. Mortgage Default during the U.S. Mortgage Crisis By Thomas Schelkle
  8. Does Inflation Walk on Unstable Paths? Rational Sunspots and Drifting Parameters By Paolo Bonomolo; Guido Ascari
  9. Adaptive functional linear regression. By Comte, Fabienne
  10. Using Bagidis in nonparametric functional data analysis: Predicting from curves with sharp local features. By Timmermans, Catherine

  1. By: Wolfgang Polasek (Institute of Advanced Studies, Austria)
    Abstract: The mean square error (MSE) compares point forecasts, or a location parameter of the forecasting distribution, with actual observations under the quadratic loss criterion. This paper shows how the Theil decomposition of the MSE into bias, variance and noise components, originally proposed for univariate time series, can be used to evaluate and compare multiple time series forecasts. For multivariate time series, the ordinary and the alternative Theil decompositions are applied to decompose the MSE matrix. As an alternative, we propose the average predictive ordinate criterion (APOC), which evaluates the ordinates of the predictive distribution for comparing forecasts of volatile time series. The multivariate Theil decomposition for the MSE and the APOC criterion are used to compare and evaluate 3-dimensional VAR-GARCH-M time series forecasts for stock indices and exchange rates.
    Keywords: Forecast comparisons, average predictive ordinate criterion APOC, MSE matrix and multivariate predictions, multivariate and alternative Theil decomposition
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:23_13&r=for
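For readers who want the mechanics behind this abstract: the univariate Theil decomposition that the paper generalizes splits the MSE into a bias, a variance and a noise (covariance) component. A minimal sketch in Python, on illustrative simulated data (not the paper's code or data):

```python
import numpy as np

def theil_decomposition(forecast, actual):
    """Split MSE into bias, variance and noise parts:
    MSE = (m_f - m_a)^2 + (s_f - s_a)^2 + 2*(1 - r)*s_f*s_a."""
    f, a = np.asarray(forecast, float), np.asarray(actual, float)
    mse = np.mean((f - a) ** 2)
    bias = (f.mean() - a.mean()) ** 2
    s_f, s_a = f.std(), a.std()            # population std (ddof=0)
    r = np.corrcoef(f, a)[0, 1]
    variance = (s_f - s_a) ** 2
    noise = 2.0 * (1.0 - r) * s_f * s_a
    return mse, bias, variance, noise

rng = np.random.default_rng(0)
a = rng.normal(size=200)
f = 0.8 * a + 0.1 + 0.3 * rng.normal(size=200)
mse, bias, var, noise = theil_decomposition(f, a)
```

With ddof=0 statistics the three components sum to the MSE exactly; that identity is what the paper extends to the MSE matrix of a multivariate forecast.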
  2. By: Barbara Rossi; Tatevik Sehkposyan
    Abstract: We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or worsen point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation.
    Keywords: predictive density evaluation, structural change, output growth forecasts, inflation forecasts
    JEL: C22 C52 C53
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:bge:wpaper:689&r=for
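The core device behind this kind of density evaluation is the probability integral transform (PIT): if the normal predictive density is correctly specified, the PITs of the realizations are uniform on (0, 1). A hedged illustration on simulated data (the authors' actual tests are more involved than this):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated "realizations" with fat tails, evaluated against a normal
# predictive density N(0, 1) (illustrative data only).
y = rng.standard_t(df=5, size=300)
pit = stats.norm.cdf(y, loc=0.0, scale=1.0)   # PITs under the assumed normal

# Under a correctly specified density the PITs are U(0,1);
# a Kolmogorov-Smirnov test checks that uniformity.
ks_stat, p_value = stats.kstest(pit, "uniform")
print(f"KS statistic: {ks_stat:.3f}, p-value: {p_value:.3f}")
```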
  3. By: Barbara Rossi
    Abstract: The main goal of this article is to answer the question: “Does anything forecast exchange rates, and if so, which variables?” It is well known that exchange rate fluctuations are very difficult to predict using economic models, and that a random walk forecasts exchange rates better than any economic model (the Meese and Rogoff puzzle). However, the recent literature has identified a series of fundamentals and methodologies that claim to resolve the puzzle. This article provides a critical review of the recent literature on exchange rate forecasting and illustrates the newly proposed methodologies and fundamentals in an up-to-date, thorough empirical analysis. Overall, our analysis of the literature and the data suggests that the answer to the question “Are exchange rates predictable?” is “It depends”: on the choice of predictor, forecast horizon, sample period, model, and forecast evaluation method. Predictability is most apparent when one or more of the following hold: the predictors are Taylor rule fundamentals or net foreign assets, the model is linear, and a small number of parameters are estimated. The toughest benchmark is the random walk without drift.
    Keywords: exchange rates, forecasting, instability, forecast evaluation
    JEL: F3 C5
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:bge:wpaper:690&r=for
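The benchmark at the heart of the Meese and Rogoff puzzle is trivial to state in code, which is part of why it is so hard to beat. A sketch on simulated data (illustrative only; the drift version here uses a simple recursive mean of past changes):

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated log exchange rate (illustrative, not real data): a random walk.
n = 400
s = np.cumsum(rng.normal(scale=0.02, size=n))

actual = s[1:]
# Random walk without drift: tomorrow's forecast is today's rate.
rw = s[:-1]
# Random walk with drift: add the average past change, estimated recursively.
drift = np.array([np.diff(s[:i + 1]).mean() if i > 0 else 0.0
                  for i in range(n - 1)])
rw_drift = s[:-1] + drift

rmse = lambda f: np.sqrt(np.mean((actual - f) ** 2))
print(f"RW RMSE: {rmse(rw):.5f}, RW-with-drift RMSE: {rmse(rw_drift):.5f}")
```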
  4. By: Stelios D. Bekiros (Department of Economics, European University Institute (EUI) and Rimini Centre for Economic Analysis (RCEA), Italy); Alessia Paccagnini (Department of Economics, Università degli Studi di Milano-Bicocca, Italy)
    Abstract: In this paper we employ advanced Bayesian methods to estimate dynamic stochastic general equilibrium (DSGE) models. Although policymakers and practitioners are particularly interested in DSGE models, these are typically too stylized to be taken directly to the data and often yield weak prediction results. Very recently, hybrid models have become popular for dealing with some of the DSGE model misspecifications. Major advances in Bayesian estimation methodology could allow these models to outperform well-known time series models and to deal effectively with more complex real-world problems as richer sources of data become available. This study includes a comparative evaluation of the out-of-sample predictive performance of many different specifications of estimated DSGE models and various classes of VAR models, using datasets from the US economy. Simple and hybrid DSGE models, such as the DSGE-VAR, are implemented and tested against standard, Bayesian and factor-augmented VARs. In this study we focus on a factor-augmented DSGE model estimated using Bayesian approaches. The investigated period spans 1960:Q4 to 2010:Q4 for real GDP, the harmonized CPI and the nominal short-term interest rate. We produce forecasts for the out-of-sample testing period 1997:Q1–2010:Q4. This comparative validation can be useful for monetary policy analysis and macro-forecasting with advanced Bayesian methods.
    Keywords: Bayesian estimation, Forecasting, Metropolis-Hastings, Markov chain Monte Carlo, Marginal data density, Factor Augmented DSGE
    JEL: C11 C15 C32
    Date: 2013–04
    URL: http://d.repec.org/n?u=RePEc:rim:rimwps:22_13&r=for
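The factor-augmented idea itself can be sketched compactly: extract a few principal-component factors from a large panel, then run a small VAR on the variables of interest plus the factors. The sketch below is a generic two-step FAVAR on simulated data, not the paper's Bayesian estimator (which relies on Metropolis-Hastings/MCMC):

```python
import numpy as np

rng = np.random.default_rng(5)

# Large macro panel (T observations x N series), illustrative data.
T, N, k = 200, 40, 3
factors_true = rng.normal(size=(T, k))
X = factors_true @ rng.normal(size=(k, N)) + 0.5 * rng.normal(size=(T, N))

# Step 1: extract factors by principal components (SVD of standardized panel).
Z = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
F = U[:, :k] * S[:k]                     # estimated factors

# Step 2: one-lag VAR on [observable of interest, factors] via OLS.
y = np.column_stack([X[:, 0], F])        # e.g. GDP growth plus factors
Y, Ylag = y[1:], y[:-1]
Ylag1 = np.column_stack([np.ones(T - 1), Ylag])
B = np.linalg.lstsq(Ylag1, Y, rcond=None)[0]

# One-step-ahead forecast from the last observation.
forecast = np.concatenate([[1.0], y[-1]]) @ B
```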
  5. By: Barbara Rossi; Tatevik Sehkposyan
    Abstract: We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov and Cramer-von Mises-type tests for the correct specification of predictive densities robust to dynamic mis-specification. The novelty is that the tests can detect mis-specification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting mis-specification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies.
    Keywords: predictive density, dynamic mis-specification, instability, structural change, forecast evaluation
    JEL: C22 C52 C53
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:bge:wpaper:688&r=for
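The instability problem the paper targets can be seen in a toy example: mis-specification confined to a fraction of the sample is diluted in a full-sample test but shows up in subsample statistics. A sketch with simulated PITs (the authors' tests are more refined than this rolling heuristic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# PITs that are uniform in the first half but mis-specified (beta-shaped)
# in the second half -- instability confined to part of the sample.
pit = np.concatenate([rng.uniform(size=150), rng.beta(2.0, 5.0, size=150)])

# A full-sample KS statistic dilutes the local break; rolling-window
# statistics make the instability visible.
full_stat = stats.kstest(pit, "uniform").statistic
window = 100
rolling = [stats.kstest(pit[i:i + window], "uniform").statistic
           for i in range(0, len(pit) - window + 1, 25)]
print(f"full-sample KS: {full_stat:.3f}, max rolling KS: {max(rolling):.3f}")
```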
  6. By: Wojciech Charemza; Carlos Diaz Vela; Svetlana Makarova
    Abstract: Issues related to the classification, interpretation and estimation of inflationary uncertainties are addressed in the context of their application to constructing probability forecasts of inflation. It is shown that confusion in defining uncertainties leads to potential misunderstandings of such forecasts. The principal source of this confusion is ignoring the effect of feedback, from the policy action undertaken on the basis of inflation forecasts, onto the uncertainties. To resolve this problem a new class of skew normal distributions (weighted skew normal, WSN) is proposed and its properties derived. It is shown that the parameters of the WSN distribution can be interpreted in relation to the strength and symmetry of monetary policy. The WSN has been fitted to empirical distributions of multi-step forecast errors of inflation for 34 countries, alongside other distributions already existing in the literature. The estimation method minimizes a distance criterion between the empirical and theoretical distributions. Results lead to some constructive conclusions regarding the strength and asymmetry of monetary policy and confirm the applicability of the WSN to producing probabilistic forecasts of inflation.
    Keywords: inflation forecasting; uncertainty; monetary policy; non-normality
    JEL: C54 E37 E52
    Date: 2013–05
    URL: http://d.repec.org/n?u=RePEc:lec:leecon:13/06&r=for
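The WSN family is the paper's own construction, but the minimum-distance fitting step can be illustrated with scipy's standard skew normal standing in for it (an assumption for illustration; the KS distance below is one possible criterion, not necessarily the authors'):

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(4)

# Asymmetric "forecast errors" (illustrative data, not the 34-country sample).
errors = stats.skewnorm.rvs(a=4.0, loc=-0.5, scale=1.2,
                            size=500, random_state=rng)

# Minimum-distance estimation: choose (shape, loc, scale) minimizing the
# KS distance between the empirical and theoretical distributions.
def ks_distance(params):
    a, loc, scale = params
    if scale <= 0:
        return np.inf
    return stats.kstest(errors, stats.skewnorm(a, loc, scale).cdf).statistic

res = optimize.minimize(ks_distance, x0=[1.0, 0.0, 1.0], method="Nelder-Mead")
a_hat, loc_hat, scale_hat = res.x
print(f"shape={a_hat:.2f}, loc={loc_hat:.2f}, scale={scale_hat:.2f}")
```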
  7. By: Thomas Schelkle (London School of Economics)
    Abstract: This paper asks which theories of mortgage default are quantitatively consistent with observations in the United States during 2002-2010. Theoretical models are simulated for the observed time-series of aggregate house prices. Their predictions are then compared to actual default rates on prime fixed-rate mortgages. An out-of-sample test discriminates between estimated reduced forms of the two most prominent theories. The test reveals that the double-trigger hypothesis, attributing mortgage default to the joint occurrence of negative equity and a life event like unemployment, outperforms a frictionless option-theoretic default model. Based on this finding, a structural partial-equilibrium model with liquidity constraints and idiosyncratic unemployment shocks is presented to provide micro-foundations for the double-trigger hypothesis. In this model borrowers with negative equity are more likely to default when they are unemployed and have low liquid wealth. The model explains most of the observed strong rise in mortgage default rates. A policy implication of the model is that subsidizing homeowners can mitigate a mortgage crisis at a lower cost than bailing out lenders.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:red:sed012:751&r=for
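The two default theories being compared reduce to different trigger rules, which can be stated in a few lines (thresholds and names below are hypothetical, purely to fix ideas):

```python
def double_trigger_default(home_equity, unemployed, liquid_wealth,
                           wealth_floor=0.1):
    # Default requires BOTH negative equity and a liquidity event
    # (job loss with little liquid wealth to keep paying the mortgage).
    return home_equity < 0 and unemployed and liquid_wealth < wealth_floor

def frictionless_option_default(home_equity, threshold=-0.2):
    # Pure option-theoretic rule: walk away once equity is sufficiently
    # underwater, regardless of income or liquidity.
    return home_equity < threshold
```

Under the double-trigger rule, negative equity alone does not cause default; that joint requirement is what the paper finds fits the 2002-2010 data better.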
  8. By: Paolo Bonomolo (University of Pavia); Guido Ascari (Universita degli Studi di Pavia)
    Abstract: We propose a generalization of the rational expectations (RE) hypothesis: as in the original approach by Muth (1961), the case of multiple solutions is the natural case, and expectations are formed by randomizing across the infinite RE solutions. We call our approach "rational sunspots". The infinite solutions differ in the way agents form their expectations, or more precisely in the way agents weight past data to make forecasts. It follows that our approach naturally yields drifting parameters and stochastic volatility. It also allows for the possibility of temporary explosive paths. Moreover, a simple method to distinguish between determinacy and indeterminacy is based on the normality of the likelihood.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:red:sed012:743&r=for
  9. By: Comte, Fabienne
    Abstract: We consider the estimation of the slope function in functional linear regression, where scalar responses are modeled in dependence of random functions. Cardot and Johannes [J. Multivariate Anal. 101 (2010) 395–408] have shown that a thresholded projection estimator can attain, up to a constant, minimax rates of convergence in a general framework which covers the prediction problem with respect to the mean squared prediction error as well as the estimation of the slope function and its derivatives. This estimation procedure, however, requires an optimal choice of a tuning parameter with regard to certain characteristics of the slope function and the covariance operator associated with the functional regressor. As this information is usually inaccessible in practice, we investigate a fully data-driven choice of the tuning parameter which combines model selection and Lepski’s method. It is inspired by the recent work of Goldenshluger and Lepski [Ann. Statist. 39 (2011) 1608–1632]. The tuning parameter is selected as the minimizer of a stochastic penalized contrast function imitating Lepski’s method among a random collection of admissible values. This choice of the tuning parameter depends only on the data, and we show that within the general framework the resulting data-driven thresholded projection estimator can attain minimax rates up to a constant over a variety of classes of slope functions and covariance operators. The results are illustrated considering different configurations which cover in particular the prediction problem as well as the estimation of the slope and its derivatives. A simulation study shows the reasonable performance of the fully data-driven estimation procedure.
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ner:louvai:info:hdl:2078.1/127327&r=for
  10. By: Timmermans, Catherine
    Abstract: Our goal is to predict a scalar value or a group membership from the discretized observation of curves with sharp local features that might vary both vertically and horizontally. To this aim, we propose to combine the nonparametric functional regression estimator developed by Ferraty and Vieu (2006) [18] with the Bagidis semimetric developed by Timmermans and von Sachs (submitted for publication) [36], with a view to efficiently measuring dissimilarities between curves with sharp patterns. This combination proves powerful. Under quite general conditions, we first obtain an asymptotic expansion for the small ball probability, indicating that Bagidis induces a fractal topology on the functional space. We then provide the rate of convergence of the nonparametric regression estimator in this case, as a function of the parameters of the Bagidis semimetric. We propose to optimize those parameters using a cross-validation procedure, and show the optimality of the selected vector. This last result has a larger scope and concerns the optimization of any vector parameter characterizing a semimetric used in this context. The performance of our methodology is assessed on simulated and real data examples. Results are shown to be superior to those obtained using competing semimetrics as soon as the variations of the significant sharp patterns in the curves have a horizontal component.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:ner:louvai:info:hdl:2078.1/118369&r=for
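The Ferraty-Vieu estimator that Bagidis plugs into is a kernel-weighted average of responses under a semimetric. A minimal sketch with a plain L2 semimetric standing in for Bagidis (which is not reproduced here), on toy sine curves:

```python
import numpy as np

def nw_functional_regression(curves, responses, new_curve, semimetric, h):
    """Ferraty-Vieu style kernel estimator: a weighted average of the
    responses, with weights decaying in the semimetric distance."""
    d = np.array([semimetric(c, new_curve) for c in curves])
    w = np.exp(-(d / h) ** 2)    # Gaussian-type kernel (one common choice)
    return float(np.sum(w * responses) / np.sum(w))

# A plain L2 semimetric; Bagidis would replace this distance.
l2 = lambda f, g: np.sqrt(np.mean((f - g) ** 2))

grid = np.linspace(0.0, 1.0, 50)
curves = np.array([a * np.sin(2 * np.pi * grid) for a in (1.0, 2.0, 3.0, 4.0)])
responses = np.array([1.0, 2.0, 3.0, 4.0])
pred = nw_functional_regression(curves, responses,
                                2.5 * np.sin(2 * np.pi * grid), l2, h=0.5)
```

By symmetry of the toy design, the prediction for the amplitude-2.5 curve sits midway between the responses of its two nearest neighbors.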

This nep-for issue is ©2013 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.