
New Economics Papers on Forecasting 
By:  Wolfgang Polasek (Institute of Advanced Studies, Austria) 
Abstract:  The mean square error (MSE) compares point forecasts, or a location parameter of the forecasting distribution, with actual observations under a quadratic loss criterion. This paper shows how the Theil decomposition of the MSE into bias, variance and noise components, originally proposed for univariate time series, can be used to evaluate and compare multiple time series forecasts. For multivariate time series, both the ordinary and the alternative Theil decomposition are applied to decompose the MSE matrix. As an alternative we propose the average predictive ordinate criterion (APOC), which evaluates the ordinates of the predictive distribution for comparing forecasts of volatile time series. The multivariate Theil decomposition of the MSE and the APOC criterion are used to compare and evaluate 3-dimensional VAR-GARCH-M time series forecasts for stock indices and exchange rates. 
Keywords:  Forecast comparisons, average predictive ordinate criterion (APOC), MSE matrix and multivariate predictions, multivariate and alternative Theil decomposition 
Date:  2013–05 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:23_13&r=for 
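The univariate building block of the paper's approach can be sketched in a few lines: Theil's classical split of the MSE into a bias, a variance and a noise (covariance) component. This is the standard textbook decomposition, not the paper's multivariate MSE-matrix version:

```python
import numpy as np

def theil_decomposition(forecast, actual):
    """Theil's decomposition of the mean squared error:
    MSE = (f_bar - a_bar)^2 + (s_f - s_a)^2 + 2*(1 - r)*s_f*s_a,
    i.e. bias + variance + noise, where r is the forecast/actual
    correlation and s_f, s_a are (population) standard deviations."""
    f, a = np.asarray(forecast, float), np.asarray(actual, float)
    mse = np.mean((f - a) ** 2)
    s_f, s_a = f.std(), a.std()
    r = np.corrcoef(f, a)[0, 1]
    bias = (f.mean() - a.mean()) ** 2       # systematic level error
    variance = (s_f - s_a) ** 2             # mismatch in variability
    noise = 2.0 * (1.0 - r) * s_f * s_a     # imperfect correlation
    return mse, bias, variance, noise
```

With population standard deviations the three components sum to the MSE exactly, which makes the split a convenient diagnostic for where a forecast loses accuracy.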
By:  Barbara Rossi; Tatevik Sekhposyan 
Abstract:  We evaluate conditional predictive densities for U.S. output growth and inflation using a number of commonly used forecasting models that rely on a large number of macroeconomic predictors. More specifically, we evaluate how well conditional predictive densities based on the commonly used normality assumption fit actual realizations out-of-sample. Our focus on predictive densities acknowledges the possibility that, although some predictors can improve or worsen point forecasts, they might have the opposite effect on higher moments. We find that normality is rejected for most models in some dimension according to at least one of the tests we use. Interestingly, however, combinations of predictive densities appear to be correctly approximated by a normal density: the simple, equal average when predicting output growth and the Bayesian model average when predicting inflation. 
Keywords:  predictive density evaluation, structural change, output growth forecasts, inflation forecasts 
JEL:  C22 C52 C53 
Date:  2013–02 
URL:  http://d.repec.org/n?u=RePEc:bge:wpaper:689&r=for 
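The kind of check the abstract describes can be illustrated with probability integral transforms (PITs): if the normal predictive densities are correctly specified, the PITs of the realizations are i.i.d. Uniform(0,1), which can be tested with a Kolmogorov-Smirnov statistic. This is a generic stand-in, not the authors' exact test battery:

```python
import numpy as np
from scipy import stats

def pit_normal(realized, mu, sigma):
    """PITs of realized values under Normal(mu_t, sigma_t) predictive
    densities; correct specification implies Uniform(0, 1) PITs."""
    return stats.norm.cdf(realized, loc=mu, scale=sigma)

def ks_uniformity(pits):
    """Kolmogorov-Smirnov test of the PITs against Uniform(0, 1)."""
    result = stats.kstest(pits, "uniform")
    return result.statistic, result.pvalue
```

A density with, say, an overstated variance pushes the PITs toward 0.5 and inflates the KS statistic even if the point forecasts are unchanged.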
By:  Barbara Rossi 
Abstract:  The main goal of this article is to provide an answer to the question: "Does anything forecast exchange rates, and if so, which variables?" It is well known that exchange rate fluctuations are very difficult to predict using economic models, and that a random walk forecasts exchange rates better than any economic model (the Meese and Rogoff puzzle). However, the recent literature has identified a series of fundamentals/methodologies that claim to have resolved the puzzle. This article provides a critical review of the recent literature on exchange rate forecasting and illustrates the new methodologies and fundamentals that have been recently proposed in an up-to-date, thorough empirical analysis. Overall, our analysis of the literature and the data suggests that the answer to the question: "Are exchange rates predictable?" is, "It depends" – on the choice of predictor, forecast horizon, sample period, model, and forecast evaluation method. Predictability is most apparent when one or more of the following hold: the predictors are Taylor-rule fundamentals or net foreign assets, the model is linear, and a small number of parameters is estimated. The toughest benchmark is the random walk without drift. 
Keywords:  exchange rates, forecasting, instability, forecast evaluation 
JEL:  F3 C5 
Date:  2013–02 
URL:  http://d.repec.org/n?u=RePEc:bge:wpaper:690&r=for 
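The Meese-Rogoff benchmark the abstract refers to amounts to a rolling out-of-sample horse race between a fundamentals-based linear model and the driftless random walk, whose forecast of the exchange-rate change is simply zero. A minimal sketch (function name, window length and OLS setup are illustrative, not the article's design):

```python
import numpy as np

def oos_rmse_vs_random_walk(rate, predictor, window=60):
    """Rolling out-of-sample comparison: a linear model forecasting the
    one-step exchange-rate change from a fundamental, versus the
    driftless random walk (whose forecast change is zero)."""
    d = np.diff(rate)                        # realized one-step changes
    e_model, e_rw = [], []
    for t in range(window, len(d)):
        X = predictor[t - window:t]
        y = d[t - window:t]
        beta = np.polyfit(X, y, 1)           # OLS on the rolling window
        e_model.append(d[t] - np.polyval(beta, predictor[t]))
        e_rw.append(d[t])                    # RW error = realized change
    rmse = lambda e: float(np.sqrt(np.mean(np.square(e))))
    return rmse(e_model), rmse(e_rw)
```

On actual exchange-rate data the sobering typical outcome is the reverse of the simulated test below: the random walk is hard to beat.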
By:  Stelios D. Bekiros (Department of Economics, European University Institute (EUI) and Rimini Centre for Economic Analysis (RCEA), Italy); Alessia Paccagnini (Department of Economics, Università degli Studi di Milano-Bicocca, Italy) 
Abstract:  In this paper we employ advanced Bayesian methods in estimating dynamic stochastic general equilibrium (DSGE) models. Although policymakers and practitioners are particularly interested in DSGE models, these are typically too stylized to be taken directly to the data and often yield weak prediction results. Very recently, hybrid models have become popular for dealing with some of the DSGE model misspecifications. Major advances in Bayesian estimation methodology could allow these models to outperform well-known time series models and effectively deal with more complex real-world problems as richer sources of data become available. This study includes a comparative evaluation of the out-of-sample predictive performance of many different specifications of estimated DSGE models and various classes of VAR models, using datasets from the US economy. Simple and hybrid DSGE models, such as the DSGE-VAR, are implemented and tested against standard, Bayesian and factor-augmented VARs. In this study we focus on a factor-augmented DSGE model that is estimated using Bayesian approaches. The investigated period spans 1960:Q4 to 2010:Q4 for real GDP, the harmonized CPI and the nominal short-term interest rate. We produce their forecasts for the out-of-sample testing period 1997:Q1–2010:Q4. This comparative validation can be useful to monetary policy analysis and macro-forecasting with the use of advanced Bayesian methods. 
Keywords:  Bayesian estimation, Forecasting, Metropolis-Hastings, Markov chain Monte Carlo, Marginal data density, Factor Augmented DSGE 
JEL:  C11 C15 C32 
Date:  2013–04 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:22_13&r=for 
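The Metropolis-Hastings machinery named in the keywords is the workhorse behind Bayesian DSGE estimation. A textbook random-walk Metropolis-Hastings sampler, shown here as a generic sketch rather than the authors' implementation, looks like this:

```python
import numpy as np

def rw_metropolis(log_post, theta0, n_draws=5000, step=0.5, seed=0):
    """Random-walk Metropolis-Hastings: propose theta' = theta + step*eps
    with Gaussian eps, accept with probability
    min(1, exp(log_post(theta') - log_post(theta)))."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    draws = np.empty((n_draws, theta.size))
    accepted = 0
    for i in range(n_draws):
        prop = theta + step * rng.standard_normal(theta.size)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta, accepted = prop, accepted + 1
        draws[i] = theta
    return draws, accepted / n_draws
```

In DSGE applications `log_post` would be the log prior plus a Kalman-filter likelihood; here any log posterior kernel will do, and the step size is tuned for a reasonable acceptance rate.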
By:  Barbara Rossi; Tatevik Sekhposyan 
Abstract:  We propose new methods for evaluating predictive densities. The methods include Kolmogorov-Smirnov- and Cramér-von Mises-type tests for the correct specification of predictive densities robust to dynamic misspecification. The novelty is that the tests can detect misspecification in the predictive densities even if it appears only over a fraction of the sample, due to the presence of instabilities. Our results indicate that our tests are well sized and have good power in detecting misspecification in predictive densities, even when it is time-varying. An application to density forecasts of the Survey of Professional Forecasters demonstrates the usefulness of the proposed methodologies. 
Keywords:  predictive density, dynamic misspecification, instability, structural change, forecast evaluation 
JEL:  C22 C52 C53 
Date:  2013–02 
URL:  http://d.repec.org/n?u=RePEc:bge:wpaper:688&r=for 
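The abstract's key point is that misspecification confined to a fraction of the sample should not be averaged away by a full-sample test. A crude heuristic analogue (not the authors' statistic) makes the idea concrete: compute the KS statistic of the PITs on every rolling subsample and report the maximum:

```python
import numpy as np
from scipy import stats

def max_subsample_ks(pits, window=50):
    """Heuristic stability-robust density check: the KS statistic of the
    PITs against Uniform(0, 1) is computed on every rolling subsample
    and the maximum reported, so misspecification confined to part of
    the sample still registers."""
    pits = np.asarray(pits, float)
    ks = [stats.kstest(pits[s:s + window], "uniform").statistic
          for s in range(0, len(pits) - window + 1)]
    return max(ks)
```

A full-sample KS test on the contaminated series in the test below would look much less alarming, since the bad stretch is diluted by the correctly specified remainder.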
By:  Wojciech Charemza; Carlos Diaz Vela; Svetlana Makarova 
Abstract:  Issues related to the classification, interpretation and estimation of inflationary uncertainties are addressed in the context of their application to constructing probability forecasts of inflation. It is shown that confusion in defining uncertainties leads to potential misunderstandings of such forecasts. The principal source of such confusion is ignoring the feedback from policy action undertaken on the basis of inflation forecasts onto uncertainties. In order to resolve this problem a new class of skew normal distributions (weighted skew normal, WSN) has been proposed and its properties derived. It is shown that the parameters of the WSN distribution can be interpreted in relation to the strength and symmetry of monetary policy. It has been fitted to empirical distributions of multi-step inflation forecast errors for 34 countries, alongside other distributions already existing in the literature. The estimation method applied uses a minimum distance criterion between the empirical and theoretical distributions. Results lead to some constructive conclusions regarding the strength and asymmetry of monetary policy and confirm the applicability of the WSN to producing probabilistic forecasts of inflation. 
Keywords:  inflation forecasting; uncertainty; monetary policy; non-normality 
JEL:  C54 E37 E52 
Date:  2013–05 
URL:  http://d.repec.org/n?u=RePEc:lec:leecon:13/06&r=for 
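The minimum-distance estimation step described in the abstract can be illustrated with scipy's standard skew-normal as a stand-in for the paper's weighted skew normal (WSN), which scipy does not implement: fit the shape, location and scale by minimizing the Kolmogorov distance between the empirical and theoretical CDFs.

```python
import numpy as np
from scipy import stats, optimize

def min_distance_skewnorm(errors):
    """Fit a (standard, not the paper's weighted) skew-normal to forecast
    errors by minimizing the Kolmogorov distance between the empirical
    CDF and the theoretical CDF."""
    x = np.sort(np.asarray(errors, float))
    ecdf = (np.arange(1, x.size + 1) - 0.5) / x.size  # midpoint empirical CDF

    def distance(params):
        a, loc, scale = params
        if scale <= 0:
            return np.inf                   # keep the scale positive
        return np.max(np.abs(stats.skewnorm.cdf(x, a, loc=loc, scale=scale) - ecdf))

    res = optimize.minimize(distance, x0=[0.0, float(x.mean()), float(x.std())],
                            method="Nelder-Mead")
    return res.x, res.fun                   # (shape, loc, scale), distance
```

Other distances (Cramér-von Mises, Anderson-Darling) slot into `distance` unchanged; the paper's own criterion may differ.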
By:  Thomas Schelkle (London School of Economics) 
Abstract:  This paper asks which theories of mortgage default are quantitatively consistent with observations in the United States during 2002–2010. Theoretical models are simulated for the observed time series of aggregate house prices. Their predictions are then compared to actual default rates on prime fixed-rate mortgages. An out-of-sample test discriminates between estimated reduced forms of the two most prominent theories. The test reveals that the double-trigger hypothesis, which attributes mortgage default to the joint occurrence of negative equity and a life event like unemployment, outperforms a frictionless option-theoretic default model. Based on this finding, a structural partial-equilibrium model with liquidity constraints and idiosyncratic unemployment shocks is presented to provide microfoundations for the double-trigger hypothesis. In this model borrowers with negative equity are more likely to default when they are unemployed and have low liquid wealth. The model explains most of the observed strong rise in mortgage default rates. A policy implication of the model is that subsidizing homeowners can mitigate a mortgage crisis at a lower cost than bailing out lenders. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:red:sed012:751&r=for 
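The double-trigger rule at the heart of the paper can be stated in a few lines; the wealth threshold and variable names here are illustrative, not the paper's calibration:

```python
import numpy as np

def default_rate(equity, unemployed, liquid_wealth, wealth_floor=0.05):
    """Double-trigger rule: a borrower defaults only when home equity is
    negative AND a life event hits (here: unemployment combined with
    little liquid wealth). The frictionless option-theoretic rule would
    condition on negative equity alone and so fires more often."""
    double_trigger = (equity < 0) & unemployed & (liquid_wealth < wealth_floor)
    return double_trigger.mean()
```

Comparing this rate to the share of borrowers with negative equity is exactly the wedge the paper's out-of-sample test exploits.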
By:  Paolo Bonomolo (University of Pavia); Guido Ascari (Università degli Studi di Pavia) 
Abstract:  We propose a generalization of the rational expectations (RE) hypothesis: as in the original approach by Muth (1961), the case of multiple solutions is the natural case, and expectations are formed by randomizing across the infinite RE solutions. We call our approach "rational sunspots". The infinite solutions differ in the way agents form their expectations, or more precisely in the way agents weight past data to make forecasts. It follows that our approach naturally yields drifting parameters and stochastic volatility. It also allows for the possibility of temporary explosive paths. Moreover, a simple method to distinguish between determinacy and indeterminacy is based on the normality of the likelihood. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:red:sed012:743&r=for 
By:  Comte, Fabienne 
Abstract:  We consider the estimation of the slope function in functional linear regression, where scalar responses are modeled in dependence of random functions. Cardot and Johannes [J. Multivariate Anal. 101 (2010) 395–408] have shown that a thresholded projection estimator can attain minimax rates of convergence up to a constant in a general framework which allows us to cover the prediction problem with respect to the mean squared prediction error as well as the estimation of the slope function and its derivatives. This estimation procedure, however, requires an optimal choice of a tuning parameter with regard to certain characteristics of the slope function and the covariance operator associated with the functional regressor. As this information is usually inaccessible in practice, we investigate a fully data-driven choice of the tuning parameter which combines model selection and Lepski's method. It is inspired by the recent work of Goldenshluger and Lepski [Ann. Statist. 39 (2011) 1608–1632]. The tuning parameter is selected as the minimizer of a stochastic penalized contrast function imitating Lepski's method among a random collection of admissible values. This choice of the tuning parameter depends only on the data and we show that within the general framework the resulting data-driven thresholded projection estimator can attain minimax rates up to a constant over a variety of classes of slope functions and covariance operators. The results are illustrated considering different configurations which cover in particular the prediction problem as well as the estimation of the slope and its derivatives. A simulation study shows the reasonable performance of the fully data-driven estimation procedure. 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:ner:louvai:info:hdl:2078.1/127327&r=for 
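A simplified version of the thresholded projection estimator helps fix ideas: project onto eigenfunctions of the empirical covariance operator and keep only components whose eigenvalue exceeds a fixed threshold. The paper's contribution is the data-driven, Lepski-type choice of that tuning parameter, which this sketch deliberately does not reproduce:

```python
import numpy as np

def slope_projection_estimator(X, y, grid, threshold=1e-2):
    """Projection estimator for the slope in functional linear regression
    Y_i = integral beta(t) X_i(t) dt + eps_i, on a uniform grid.
    Components of the empirical covariance operator with eigenvalue
    below `threshold` are discarded."""
    n, p = X.shape
    dt = grid[1] - grid[0]
    Xc = X - X.mean(0)
    yc = y - y.mean()
    cov = (Xc.T @ Xc) / n * dt              # discretized covariance operator
    vals, vecs = np.linalg.eigh(cov)
    vals, vecs = vals[::-1], vecs[:, ::-1]  # descending eigenvalues
    vecs = vecs / np.sqrt(dt)               # L2-normalized eigenfunctions
    beta_hat = np.zeros(p)
    for j in range(p):
        if vals[j] <= threshold:
            break                           # truncate the expansion here
        scores = Xc @ vecs[:, j] * dt       # <X_i, phi_j>
        b_j = (scores @ yc) / (n * vals[j])
        beta_hat += b_j * vecs[:, j]
    return beta_hat
```

Raising the threshold trades variance for bias; the cited works show how to resolve that trade-off from the data alone.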
By:  Timmermans, Catherine 
Abstract:  Our goal is to predict a scalar value or a group membership from the discretized observation of curves with sharp local features that might vary both vertically and horizontally. To this aim, we propose to combine the use of the nonparametric functional regression estimator developed by Ferraty and Vieu (2006) [18] with the Bagidis semimetric developed by Timmermans and von Sachs (submitted for publication) [36], in order to efficiently measure dissimilarities between curves with sharp patterns. This combination proves powerful. Under quite general conditions, we first obtain an asymptotic expansion for the small ball probability indicating that Bagidis induces a fractal topology on the functional space. We then provide the rate of convergence of the nonparametric regression estimator in this case, as a function of the parameters of the Bagidis semimetric. We propose to optimize those parameters using a cross-validation procedure, and show the optimality of the selected vector. This last result has a larger scope and concerns the optimization of any vector parameter characterizing a semimetric used in this context. The performance of our methodology is assessed on simulated and real data examples. Results are shown to be superior to those obtained using competing semimetrics as soon as the variations of the significant sharp patterns in the curves have a horizontal component. 
Date:  2013 
URL:  http://d.repec.org/n?u=RePEc:ner:louvai:info:hdl:2078.1/118369&r=for 