Forecasting
http://lists.repec.org/mailman/listinfo/nep-for
2014-11-22, edited by Rob J Hyndman

The Role of Economic Policy Uncertainty in Forecasting US Inflation Using a VARFIMA Model
http://d.repec.org/n?u=RePEc:pre:wpaper:201460&r=for
We compare inflation forecasts from a vector fractionally integrated autoregressive moving average (VARFIMA) model against standard forecasting models. U.S. inflation forecasts improve when controlling for persistence and economic policy uncertainty (EPU). Importantly, the VARFIMA model, comprising inflation and EPU, outperforms commonly used inflation forecast models.
Mehmet Balcilar, Rangan Gupta, Charl Jooste
2014-10
Keywords: Inflation, long-range dependency, economic policy uncertainty

GDPNow: A Model for GDP "Nowcasting"
http://d.repec.org/n?u=RePEc:fip:fedawp:2014-07&r=for
This paper documents GDPNow, a "nowcasting" model for gross domestic product (GDP) growth that synthesizes the "bridge equation" approach relating GDP subcomponents to monthly source data with the factor model approach used by Giannone, Reichlin, and Small (2008). The GDPNow model forecasts GDP growth by aggregating 13 subcomponents that make up GDP with the chain-weighting methodology used by the U.S. Bureau of Economic Analysis. Using current vintage data, out-of-sample GDPNow model forecasts are found to be more accurate than a number of statistical benchmarks since 2000. Using real-time data since the second half of 2011, GDPNow model forecasts are found to be only slightly inferior to consensus near-term GDP forecasts from Blue Chip Economic Indicators. The forecast error variance of GDP growth for each of the GDPNow model, Blue Chip, and the Federal Reserve staff's Green Book is decomposed as the sum of the forecast error covariances for the contributions to growth of the subcomponents of GDP. The decompositions show that "net exports" and "change in private inventories" are particularly difficult subcomponents to nowcast.
Higgins, Patrick C.
2014-07-01
Keywords: nowcasting; forecasting; macroeconometric forecasting

Evaluating a Structural Model Forecast: Decomposition Approach
http://d.repec.org/n?u=RePEc:cnb:rpnrpn:2014/02&r=for
Macroeconomic forecasters are often criticized for a lack of transparency when presenting their forecasts. To forestall such criticism, the transparency of the forecasting process should be enhanced by tracing and explaining the effects of data revisions and expert judgment updates on variations in the forecasts. This paper presents a forecast decomposition analysis framework designed to examine the differences between two forecasts generated by a linear structural model. The differences between the forecasts can be decomposed into the contributions of various forecast elements, such as the effect of new data or expert judgment. The framework allows us to evaluate the contributions of forecast assumptions when expert judgment is applied in the expected way. The simplest application of this framework examines alternative forecast scenarios with different forecast assumptions. Next, a one-period difference between the forecasts' initial periods is added to the examination. Finally, a replication of the Inflation Forecast Evaluation presented in Inflation Report III/2013 is created to illustrate the full capabilities of the decomposition framework.
Frantisek Brazdik, Zuzana Humplova, Frantisek Kopriva
2014-08
Keywords: Data revisions, DSGE models, forecasting, forecast revisions

Spotting the Danger Zone - Forecasting Financial Crises with Classification Tree Ensembles and Many Predictors
http://d.repec.org/n?u=RePEc:bon:bonedp:bgse01_2014&r=for
To improve the detection of the economic "danger zones" from which severe banking crises emanate, this paper introduces classification tree ensembles to the banking crisis forecasting literature. I show that their out-of-sample performance in forecasting binary banking crisis indicators surpasses current best-practice early warning systems based on logit models by a substantial margin. I obtain this result on the basis of one long-run (1870-2011) and two broad post-1970 macroeconomic panel datasets. In particular, I show that two marked improvements in forecasting performance result from the combination of many classification trees into an ensemble and from the use of many predictors.
Felix Ward
2014-10

Score Driven Exponentially Weighted Moving Average and Value-at-Risk Forecasting
http://d.repec.org/n?u=RePEc:dgr:uvatin:20140092&r=for
We present a simple new methodology to allow for time variation in volatilities using a recursive updating scheme similar to the familiar RiskMetrics approach. We update parameters using the score of the forecasting distribution rather than squared lagged observations. This allows the parameter dynamics to adapt automatically to any non-normal data features and robustifies the subsequent volatility estimates. Our new approach nests several earlier extensions of the exponentially weighted moving average (EWMA) scheme. It also easily handles extensions to dynamic higher-order moments or other choices of the preferred forecasting distribution. We apply our method to Value-at-Risk forecasting with Student's t distributions and a time-varying degrees-of-freedom parameter and show that the new method is competitive with or better than earlier methods for volatility forecasting of individual stock returns and exchange rates.
André Lucas, Xin Zhang
2014-07-22
Keywords: dynamic volatilities, time-varying higher-order moments, integrated generalized autoregressive score models, Exponentially Weighted Moving Average (EWMA), Value-at-Risk (VaR)

Density Forecast Evaluation in Unstable Environments
http://d.repec.org/n?u=RePEc:ucr:wpaper:201428&r=for
Gloria Gonzalez-Rivera, Yingying Sun
2014-08

Dynamic Model Averaging in Large Model Spaces Using Dynamic Occam's Window
http://d.repec.org/n?u=RePEc:arx:papers:1410.7799&r=for
Bayesian model averaging has become a widely used approach to accounting for uncertainty about the structural form of the model generating the data. When data arrive sequentially and the generating model can change over time, Dynamic Model Averaging (DMA) extends model averaging to deal with this situation. Often in macroeconomics, however, many candidate explanatory variables are available and the number of possible models becomes too large for DMA to be applied in its original form. We propose a new method for this situation which allows us to perform DMA without considering the whole model space, but using a subset of models and dynamically optimizing the choice of models at each point in time. This yields a dynamic form of Occam's window. We evaluate the method in the context of the problem of nowcasting GDP in the Euro area. We find that its forecasting performance compares well with that of other methods.
Luca Onorante, Adrian E. Raftery
2014-10
Keywords: Bayesian model averaging; Model uncertainty; Nowcasting; Occam's window
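The recursion behind this last abstract can be sketched as follows. This is a minimal illustration of the generic DMA update (a forgetting-factor prediction step, a Bayes update with each model's predictive likelihood, and a prune of dominated models), not the authors' exact dynamic Occam's window algorithm; the Gaussian candidate models, the forgetting factor `alpha`, and the window threshold are all illustrative assumptions.

```python
from statistics import NormalDist

def dma_step(weights, preds, y, alpha=0.95, window=0.01):
    """One DMA update: forgetting-factor prediction step, Bayes update with
    each model's predictive likelihood of y, then an Occam's-window prune
    dropping models whose weight is below `window` times the best weight."""
    # Prediction step: raising weights to alpha < 1 flattens them, letting
    # the model composition drift over time (forgetting-factor dynamics).
    w = {k: v ** alpha for k, v in weights.items()}
    total = sum(w.values())
    w = {k: v / total for k, v in w.items()}
    # Update step: multiply each weight by the model's Gaussian predictive
    # density at the new observation, then renormalize.
    w = {k: w[k] * NormalDist(*preds[k]).pdf(y) for k in w}
    total = sum(w.values())
    w = {k: v / total for k, v in w.items()}
    # Dynamic Occam's window: discard clearly dominated models so later
    # steps only evaluate the surviving subset of the model space.
    cutoff = window * max(w.values())
    w = {k: v for k, v in w.items() if v >= cutoff}
    total = sum(w.values())
    return {k: v / total for k, v in w.items()}

# Three hypothetical candidate models, each issuing a (mean, sd) forecast.
weights = {"M1": 1 / 3, "M2": 1 / 3, "M3": 1 / 3}
preds = {"M1": (0.0, 1.0), "M2": (2.0, 1.0), "M3": (10.0, 1.0)}
for y in [1.8, 2.1, 1.9]:  # observations close to M2's forecast
    weights = dma_step(weights, preds, y)
print(weights)  # M2 carries nearly all the weight; M3 was pruned early
```

In a nowcasting application the Gaussian densities would be replaced by each regression model's one-step-ahead predictive density, and the pruning is what keeps the procedure feasible when the full model space is too large to enumerate.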