nep-for New Economics Papers
on Forecasting
Issue of 2012‒03‒21
eighteen papers chosen by
Rob J Hyndman
Monash University

  1. Does the Box-Cox Transformation Help in Forecasting Macroeconomic Time Series? By Tommaso Proietti; Helmut Luetkepohl
  2. Constructing Optimal Density Forecasts from Point Forecast Combinations By Luiz Renato Regis de Oliveira Lima; Wagner Piazza Gaglianone
  3. DSGE model-based forecasting By Marco Del Negro; Frank Schorfheide
  4. Systemic Real and Financial Risks: Measurement, Forecasting, and Stress Testing By Gianni De Nicolò; Marcella Lucchetta
  5. The Forecasting Performance of an Estimated Medium Run Model By Tobias Kitlinski; Torsten Schmidt
  6. Forecasting world output: the rising importance of emerging economies By Alessandro Borin; Riccardo Cristadoro; Roberto Golinelli; Giuseppe Parigi
  7. A New Structural Break Model with Application to Canadian Inflation Forecasting By John M Maheu; Yong Song
  8. Hierarchical Shrinkage in Time-Varying Parameter Models By Miguel Belmonte; Gary Koop; Dimitris Korobilis
  9. Large time-varying parameter VARs By Gary Koop; Dimitris Korobilis
  10. A Foreign Activity Measure for Predicting Canadian Exports By Louis Morel
  11. A Model for Predicting Readmission Risk in New Zealand. By Rhema Vaithianathan; Nan Jiang; Toni Ashton
  12. A New Model of Trend Inflation By Joshua Chan; Gary Koop; Simon Potter
  13. Vector Autoregressive Models By Helmut Luetkepohl
  14. Bounded Rationality and Limited Datasets: Testable Implications, Identifiability, and Out-of-Sample Prediction By Geoffroy de Clippel; Kareen Rozen
  15. Posterior Predictive Analysis for Evaluating DSGE Models By Jon Faust; Abhishek Gupta
  16. Assessing Multiple Prior Models of Behaviour under Ambiguity By Ana Conte; John D. Hey
  17. Consistent Long-Term Yield Curve Prediction By Josef Teichmann; Mario V. Wüthrich
  18. Earnings Growth versus Measures of Income and Education for Predicting Mortality By Harriet Orcutt Duleep; David Jaeger

  1. By: Tommaso Proietti; Helmut Luetkepohl
    Abstract: The paper investigates whether transforming a time series leads to an improvement in forecasting accuracy. The class of transformations that is considered is the Box-Cox power transformation, which applies to series measured on a ratio scale. We propose a nonparametric approach for estimating the optimal transformation parameter based on the frequency domain estimation of the prediction error variance, and also conduct an extensive recursive forecast experiment on a large set of seasonal monthly macroeconomic time series related to industrial production and retail turnover. In about one fifth of the series considered the Box-Cox transformation produces forecasts significantly better than the untransformed data at the one-step-ahead horizon; in most of the cases the logarithmic transformation is the relevant one. As the forecast horizon increases, the evidence in favour of a transformation becomes less strong. Typically, the naïve predictor that just reverses the transformation leads to a lower mean square error than the optimal predictor at short forecast leads. We also discuss whether the preliminary in-sample frequency domain assessment provides reliable guidance as to which series should be transformed in order to improve predictive performance significantly.
    Keywords: Forecast comparisons, multi-step forecasting, rolling forecasts, nonparametric estimation of prediction error variance
    JEL: C22
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2011/29&r=for
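    As a rough illustration of the issue the abstract raises, the Python sketch below applies a Box-Cox transform, forecasts on the transformed scale, and contrasts the "naive" back-transformed predictor with the standard mean correction in the logarithmic case; the AR(1) model, the transformation parameter and the synthetic data are illustrative assumptions, not the authors' nonparametric frequency-domain procedure.
      import numpy as np
      from statsmodels.tsa.arima.model import ARIMA

      def boxcox(y, lam):
          # Box-Cox power transformation for a ratio-scale series
          return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

      def inv_boxcox(z, lam):
          return np.exp(z) if lam == 0 else (lam * z + 1.0) ** (1.0 / lam)

      rng = np.random.default_rng(0)
      y = np.exp(np.cumsum(rng.normal(0.01, 0.05, 200)))        # synthetic positive series

      lam = 0.0                                                 # logarithmic case as an example
      z = boxcox(y, lam)
      res = ARIMA(z, order=(1, 0, 0)).fit()                     # simple model on the transformed scale
      fc = res.get_forecast(steps=1)
      mu, var = fc.predicted_mean[0], fc.se_mean[0] ** 2

      naive = inv_boxcox(mu, lam)                               # just reverse the transformation
      adjusted = np.exp(mu + 0.5 * var) if lam == 0 else naive  # mean-corrected predictor (log case only)
      print(naive, adjusted)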
  2. By: Luiz Renato Regis de Oliveira Lima; Wagner Piazza Gaglianone
    Abstract: Decision makers often observe point forecasts of the same variable computed, for instance, by commercial banks, the IMF and the World Bank, but the econometric models used by such institutions are unknown. This paper shows how to use the information available in point forecasts to compute optimal density forecasts. Our idea builds upon the combination of point forecasts under general loss functions and unknown forecast error distributions. We use real-time data to forecast the density of future inflation in the U.S., and our results indicate that the proposed method materially improves the real-time accuracy of density forecasts vis-à-vis the ones obtained from the (unknown) individual econometric models.
    Keywords: forecast combination,quantile regression,density forecast
    JEL: C13 C14 C51
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ppg:ppgewp:5&r=for
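    As a rough sketch of turning several point forecasts into a density forecast, the following Python example fits one quantile regression per quantile level and combines (synthetic) individual forecasts; the quantile grid, data and forecaster set are illustrative assumptions, not the paper's procedure.
      import numpy as np
      import statsmodels.api as sm
      from statsmodels.regression.quantile_regression import QuantReg

      rng = np.random.default_rng(1)
      T = 120
      truth = rng.normal(2.0, 1.0, T)                                         # realized values (synthetic)
      forecasts = truth[:, None] + rng.normal(0.0, [0.6, 0.9, 1.2], (T, 3))   # three point forecasters

      X = sm.add_constant(forecasts)                                          # combination with an intercept
      x_new = X[-1]                                                           # latest set of point forecasts
      grid = np.arange(0.05, 1.0, 0.05)

      quantiles = []
      for q in grid:
          res = QuantReg(truth, X).fit(q=q)                                   # quantile-specific combination weights
          quantiles.append(res.params @ x_new)

      # sorting guards against quantile crossing in this simple setup
      print(np.round(np.sort(quantiles), 2))                                  # approximate forecast quantiles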
  3. By: Marco Del Negro; Frank Schorfheide
    Abstract: Dynamic stochastic general equilibrium (DSGE) models use modern macroeconomic theory to explain and predict comovements of aggregate time series over the business cycle and to perform policy analysis. We explain how to use DSGE models for all three purposes—forecasting, storytelling, and policy experiments—and review their forecasting record. We also provide our own real-time assessment of the forecasting performance of the Smets and Wouters (2007) model using data up to 2011, compare it with Blue Chip and Greenbook forecasts, and show how it changes as we augment the standard set of observables with external information from surveys (nowcasts, interest rate forecasts, and expectations for long-run inflation and output growth). We explore methods of generating forecasts in the presence of a zero-lower-bound constraint on nominal interest rates and conditional on counterfactual interest rate paths. Finally, we perform a postmortem of DSGE model forecasts of the Great Recession and show that forecasts from a version of the Smets-Wouters model augmented by financial frictions, and using spreads as an observable, compare well with Blue Chip forecasts.
    Keywords: Stochastic analysis ; Equilibrium (Economics) ; Time-series analysis ; Econometric models ; Monetary policy ; Economic forecasting ; Recessions
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fednsr:554&r=for
  4. By: Gianni De Nicolò; Marcella Lucchetta
    Abstract: This paper formulates a novel modeling framework that delivers: (a) forecasts of indicators of systemic real risk and systemic financial risk based on density forecasts of indicators of real activity and financial health; (b) stress-tests as measures of the dynamics of responses of systemic risk indicators to structural shocks identified by standard macroeconomic and banking theory. Using a large number of quarterly time series of the G-7 economies in 1980Q1-2010Q2, we show that the model exhibits significant out-of-sample forecasting power for tail real and financial risk realizations, and that stress testing provides useful early warnings of the build-up of real and financial vulnerabilities.
    Keywords: Economic indicators, Financial risk, Forecasting models, Group of seven, Time series
    Date: 2012–02–28
    URL: http://d.repec.org/n?u=RePEc:imf:imfwpa:12/58&r=for
  5. By: Tobias Kitlinski; Torsten Schmidt
    Abstract: In recent years DSGE models have increasingly attracted the attention of forecasters and have shown promising forecast performance at short horizons. We contribute to the existing literature by analyzing the forecasting power of a DSGE model with endogenous growth over the medium run. Instead of only calibrating the model, we combine calibration with Bayesian estimation. As forecasting benchmarks we take the Smets and Wouters (2007) model and a VAR model. The evaluation of the forecast errors shows that the Medium-Term model outperforms the Smets-Wouters model for some key macroeconomic variables in the medium run, and that its forecast performance is competitive with that of the VAR model. These results show that the forecasting ability of DSGE models also extends to the medium term.
    Keywords: Bayesian analysis; DSGE model; medium run; forecasting
    JEL: C32 C52 E32 E37
    Date: 2011–12
    URL: http://d.repec.org/n?u=RePEc:rwi:repape:0301&r=for
  6. By: Alessandro Borin (Bank of Italy); Riccardo Cristadoro (Bank of Italy); Roberto Golinelli (University of Bologna); Giuseppe Parigi (Bank of Italy)
    Abstract: Assessing the global economic outlook is a fundamentally important task of international financial institutions, governments and central banks. In this paper we focus on the consequences of the rapid growth of emerging markets for monitoring and forecasting the global outlook. Our main results are that (i) the rise of the emerging countries has sharply altered the correlation of growth rates among the main economic areas; (ii) this is clearly detectable in forecasting equations as a structural break occurring in the 1990s; (iii) hence, inferences on global developments based solely on the industrialized countries are highly unreliable; (iv) the otherwise cumbersome task of monitoring many – and less studied – countries can be tackled by resorting to very simple bridge models (BM); (v) BM performance is in line with that of the most widely quoted predictions (WEO, Consensus) both before and during the recent crisis; (vi) for some emerging economies, BMs would have provided even better forecasts during the recent crisis.
    Keywords: GDP forecast, emerging and Asian markets, bridge models, forecasting ability
    JEL: C22 C53 E37 F47
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_853_12&r=for
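    The bridge-model (BM) idea mentioned in the abstract can be sketched in a few lines of Python: monthly indicators are aggregated to quarterly frequency and a simple regression "bridges" them into a GDP growth forecast. The indicator, data and coefficients below are purely illustrative, not the authors' specifications.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(2)
      months = pd.period_range('2000-01', periods=120, freq='M')
      ip = pd.Series(rng.normal(0.2, 1.0, 120), index=months)           # monthly indicator (e.g. IP growth)
      ip_q = ip.groupby(ip.index.asfreq('Q')).mean()                    # aggregate to quarterly frequency
      gdp = 0.5 + 0.4 * ip_q + rng.normal(0.0, 0.3, len(ip_q))          # synthetic quarterly GDP growth

      X = sm.add_constant(ip_q.values)
      bridge = sm.OLS(gdp.values, X).fit()                              # the bridge equation

      latest = ip_q.values[-1]                                          # current-quarter indicator reading
      print(bridge.params[0] + bridge.params[1] * latest)               # bridge forecast of GDP growth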
  7. By: John M Maheu; Yong Song
    Abstract: This paper develops an efficient approach to model and forecast time-series data with an unknown number of change-points. Using a conjugate prior and conditional on time-invariant parameters, the predictive density and the posterior distribution of the change-points have closed forms. The conjugate prior is further modeled as hierarchical to exploit the information across regimes. This framework allows breaks in the variance, the regression coefficients or both. Regime duration can be modeled as a Poisson distribution. A new efficient Markov chain Monte Carlo sampler draws the parameters as one block from the posterior distribution. An application to the Canadian inflation time series shows the gains in forecasting precision that our model provides.
    Keywords: multiple change-points, regime duration, inflation targeting, predictive density, MCMC
    JEL: C5 C22 C01 C11 E37
    Date: 2012–03–13
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-448&r=for
  8. By: Miguel Belmonte (Department of Economics, University of Strathclyde); Gary Koop (Department of Economics, University of Strathclyde); Dimitris Korobilis (Department of Economics, University of Glasgow)
    Abstract: In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. Time-varying parameter models are parameter-rich and the time span of our data is relatively short, which motivates the use of shrinkage. In constant-coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. Here we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows the coefficient on each predictor to be (i) time-varying, (ii) constant over time, or (iii) shrunk to zero, and the econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
    Keywords: Forecasting; hierarchical prior; time-varying parameters; Bayesian Lasso
    JEL: C11 C52 E37 E47
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1137&r=for
  9. By: Gary Koop; Dimitris Korobilis
    Abstract: In this paper we develop methods for estimation and forecasting in large time-varying parameter vector autoregressive models (TVP-VARs). To overcome computational constraints with likelihood-based estimation of large systems, we rely on Kalman filter estimation with forgetting factors. We also draw on ideas from the dynamic model averaging literature and extend the TVP-VAR so that its dimension can change over time. A final extension lies in the development of a new method for estimating, in a time-varying manner, the parameter(s) of the shrinkage priors commonly used with large VARs. These extensions are operationalized through forgetting factor methods and are thus computationally simple. An empirical application involving forecasting inflation, real output, and interest rates demonstrates the feasibility and usefulness of our approach.
    Keywords: Bayesian VAR; forecasting; time-varying coefficients; state-space model
    JEL: C11 C52 E27 E37
    Date: 2012–01
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2012_04&r=for
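    The forgetting-factor device the abstract relies on can be illustrated, for a single time-varying parameter regression, with the short Python sketch below; the data, forgetting factor and measurement variance are illustrative assumptions, and a full TVP-VAR would apply the same filter equation by equation with additional machinery.
      import numpy as np

      def tvp_filter(y, X, lam=0.99, h=1.0):
          """Kalman filter for y_t = x_t' beta_t + e_t with random-walk beta_t,
          where a forgetting factor lam inflates the state covariance."""
          T, k = X.shape
          beta = np.zeros(k)
          P = 10.0 * np.eye(k)                       # diffuse initial state covariance
          betas, preds = np.zeros((T, k)), np.zeros(T)
          for t in range(T):
              P = P / lam                            # forgetting replaces the state noise covariance
              x = X[t]
              preds[t] = x @ beta                    # one-step-ahead prediction
              F = x @ P @ x + h                      # prediction error variance
              K = P @ x / F                          # Kalman gain
              beta = beta + K * (y[t] - preds[t])
              P = P - np.outer(K, x @ P)
              betas[t] = beta
          return betas, preds

      rng = np.random.default_rng(3)
      T = 200
      X = np.column_stack([np.ones(T), rng.normal(size=T)])
      drift = np.column_stack([np.linspace(0.0, 1.0, T), np.full(T, 0.5)])   # slowly drifting intercept
      y = (X * drift).sum(axis=1) + rng.normal(0.0, 0.5, T)
      betas, preds = tvp_filter(y, X, lam=0.99, h=0.25)
      print(betas[-1])                               # filtered coefficients at the end of the sample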
  10. By: Louis Morel
    Abstract: The author constructs a measure of foreign activity that takes into account the composition of foreign demand for Canadian exports. It has a number of interesting features. First, the foreign activity measure captures both the composition of demand in the United States (by including components of U.S. private final domestic demand) and economic activity outside of the United States. Second, its coefficients have been estimated over the sample period 1981–2009 controlling for the effect of changes in relative prices. Third, compared with the Bank’s previous U.S. activity index (introduced in the July 2009 Monetary Policy Report), the foreign activity measure provides some improvements for forecasting Canadian exports, especially at longer horizons. For instance, at eight quarters ahead, the gain in terms of forecast accuracy is as much as 22 per cent. Finally, the foreign activity measure helps to explain why Canadian exports dropped by 20 per cent during the global recession of 2008–09 and have only partially recovered since that time.
    Keywords: Balance of payments and components; Exchange rates; Recent economic and financial developments
    JEL: E00 F17
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:bca:bocadp:12-1&r=for
  11. By: Rhema Vaithianathan (Department of Economics, University of Auckland, Auckland, New Zealand.); Nan Jiang (Department of Economics, Auckland University of Technology, Auckland, New Zealand.); Toni Ashton (School of Population Health, University of Auckland, Auckland, New Zealand.)
    Abstract: Predictive risk models, which utilize routinely collected data to develop algorithms, are used in England to stratify patients according to their hospital admission risk. An individual’s risk score can be used as a basis for selecting patients for hospital avoidance programmes. This paper presents a brief empirical analysis of New Zealand hospital data to create a prediction algorithm and illustrates how a hospital avoidance business case can be developed using the model. A sample of 134,262 patients was analyzed using multivariate logistic regression; various socioeconomic factors and indicators of previous admissions were used to predict the probability that a patient is readmitted to hospital within the 12 months following discharge. The key factors for readmission prediction were age, sex, diagnosis of last admission, length of stay and cost-weight of previous admission. The prognostic strength of the algorithm was good, with a randomly selected patient with a future readmission being 71.2% more likely to receive a higher risk score than one who will not have a future admission.
    Keywords: Hospital readmission; Risk prediction; Prognostic strength.
    JEL: I10 C13 O22
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:aut:wpaper:201202&r=for
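    As a schematic illustration of this kind of readmission-risk algorithm, the Python sketch below fits a logistic regression on a few routinely collected predictors and scores its ranking ability; the variables and data are synthetic placeholders, not the New Zealand sample used in the paper.
      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(4)
      n = 5000
      df = pd.DataFrame({
          'age': rng.integers(18, 95, n),
          'female': rng.integers(0, 2, n),
          'length_of_stay': rng.gamma(2.0, 2.0, n),
          'cost_weight': rng.gamma(1.5, 1.0, n),
          'prior_admissions': rng.poisson(0.8, n),
      })
      # synthetic outcome: readmission within 12 months of discharge
      logit = -3.0 + 0.02 * df.age + 0.05 * df.length_of_stay + 0.4 * df.prior_admissions
      df['readmitted'] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit.to_numpy())))

      X_tr, X_te, y_tr, y_te = train_test_split(df.drop(columns='readmitted'),
                                                df['readmitted'], random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      risk = model.predict_proba(X_te)[:, 1]          # individual risk scores
      print(roc_auc_score(y_te, risk))                # chance a readmitted patient outranks a non-readmitted one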
  12. By: Joshua Chan (College of Business and Economics, Australian National University); Gary Koop (Department of Economics, University of Strathclyde); Simon Potter (Federal Reserve Bank of New York)
    Abstract: This paper introduces a new model of trend (or underlying) inflation. In contrast to many earlier approaches, which allow for trend inflation to evolve according to a random walk, ours is a bounded model which ensures that trend inflation is constrained to lie in an interval. The bounds of this interval can either be fixed or estimated from the data. Our model also allows for a time-varying degree of persistence in the transitory component of inflation. The bounds placed on trend inflation mean that standard econometric methods for estimating linear Gaussian state space models cannot be used and we develop a posterior simulation algorithm for estimating the bounded trend inflation model. In an empirical exercise with CPI inflation we find the model to work well, yielding more sensible measures of trend inflation and forecasting better than popular alternatives such as the unobserved components stochastic volatility model.
    Keywords: Constrained inflation, non-linear state space model, underlying inflation, inflation targeting, inflation forecasting, Bayesian
    JEL: E31 E37 C11 C53
    Date: 2012–02
    URL: http://d.repec.org/n?u=RePEc:str:wpaper:1202&r=for
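    The central mechanism the abstract describes, a trend that evolves like a random walk but is constrained to an interval, can be simulated in a few lines of Python with truncated-normal innovations; the bounds, innovation variance and horizon are illustrative assumptions rather than the paper's estimates.
      import numpy as np
      from scipy.stats import truncnorm

      rng = np.random.default_rng(5)
      a, b = 0.0, 5.0                        # bounds on trend inflation (per cent)
      sigma = 0.15                           # standard deviation of trend innovations
      T = 200

      tau = np.empty(T)
      tau[0] = 2.0                           # initial trend inside the interval
      for t in range(1, T):
          lo, hi = (a - tau[t - 1]) / sigma, (b - tau[t - 1]) / sigma
          tau[t] = tau[t - 1] + truncnorm.rvs(lo, hi, scale=sigma, random_state=rng)

      print(tau.min(), tau.max())            # the simulated trend respects the bounds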
  13. By: Helmut Luetkepohl
    Abstract: Multivariate simultaneous equations models were used extensively for macroeconometric analysis when Sims (1980) advocated vector autoregressive (VAR) models as alternatives. At that time longer and more frequently observed macroeconomic time series called for models which described the dynamic structure of the variables. VAR models lend themselves to this purpose. They typically treat all variables as a priori endogenous. Thereby they account for Sims’ critique that the exogeneity assumptions for some of the variables in simultaneous equations models are ad hoc and often not backed by fully developed theories. Restrictions, including exogeneity of some of the variables, may be imposed on VAR models based on statistical procedures. VAR models are natural tools for forecasting. Their setup is such that current values of a set of variables are partly explained by past values of the variables involved. They can also be used for economic analysis, however, because they describe the joint generation mechanism of the variables involved. Structural VAR analysis attempts to investigate structural economic hypotheses with the help of VAR models. Impulse response analysis, forecast error variance decompositions, historical decompositions and the analysis of forecast scenarios are the tools which have been proposed for disentangling the relations between the variables in a VAR model. Traditionally VAR models are designed for stationary variables without time trends. Trending behavior can be captured by including deterministic polynomial terms. In the 1980s the discovery of the importance of stochastic trends in economic variables and the development of the concept of cointegration by Granger (1981), Engle and Granger (1987), Johansen (1995) and others have shown that stochastic trends can also be captured by VAR models. If there are trends in some of the variables it may be desirable to separate the long-run relations from the short-run dynamics of the generation process of a set of variables. Vector error correction models offer a convenient framework for separating long-run and short-run components of the data generation process (DGP). In the present chapter levels VAR models are considered where cointegration relations are not modelled explicitly although they may be present. Specific issues related to trending variables will be mentioned occasionally throughout the chapter. The advantage of levels VAR models over vector error correction models is that they can also be used when the cointegration structure is unknown. Cointegration analysis and error correction models are discussed specifically in the next chapter.
    JEL: C32
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2011/30&r=for
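    The reduced-form VAR toolkit surveyed in the chapter (estimation, forecasting, impulse responses, variance decompositions) is available in standard software; the short Python sketch below runs it on synthetic data and is only meant to fix ideas, with the data-generating process and lag-selection choices being illustrative.
      import numpy as np
      import pandas as pd
      from statsmodels.tsa.api import VAR

      rng = np.random.default_rng(6)
      T = 200
      e = rng.normal(size=(T, 2))
      y = np.zeros((T, 2))
      A = np.array([[0.5, 0.1], [0.2, 0.4]])                  # bivariate VAR(1) data-generating process
      for t in range(1, T):
          y[t] = A @ y[t - 1] + e[t]
      data = pd.DataFrame(y, columns=['output', 'inflation'])

      res = VAR(data).fit(maxlags=4, ic='aic')                # lag order chosen by AIC
      print(res.forecast(data.values[-res.k_ar:], steps=8))   # eight-step-ahead forecasts
      irf = res.irf(10)                                       # impulse responses over 10 periods
      fevd = res.fevd(10)                                     # forecast error variance decomposition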
  14. By: Geoffroy de Clippel (Dept. of Economics, Brown University); Kareen Rozen (Cowles Foundation, Yale University)
    Abstract: Theories of bounded rationality are typically characterized over an exhaustive data set. This paper aims to operationalize some leading theories when the available data is limited, as is the case in most practical settings. How does one tell if observed choices are consistent with a theory of bounded rationality if the data is incomplete? What information can be identified about preferences? How can out-of-sample predictions be made? Our approach is contrasted with earlier attempts to examine bounded rationality theories on limited data, showing their notion of consistency is inappropriate for identifiability and out-of-sample prediction.
    Keywords: Bounded rationality, Limited datasets
    JEL: D01
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1853&r=for
  15. By: Jon Faust; Abhishek Gupta
    Abstract: While dynamic stochastic general equilibrium (DSGE) models for monetary policy analysis have come a long way, there is considerable difference of opinion over the role these models should play in the policy process. The paper develops three main points about assessing the value of these models. First, we document that DSGE models continue to have aspects of crude approximation and omission. This motivates the need for tools to reveal the strengths and weaknesses of the models--both to direct development efforts and to inform how best to use the current flawed models. Second, posterior predictive analysis provides a useful and economical tool for finding and communicating strengths and weaknesses. In particular, we adapt a form of discrepancy analysis as proposed by Gelman et al. (1996). Third, we provide a nonstandard defense of posterior predictive analysis in the DSGE context against long-standing objections. We use the iconic Smets-Wouters model for illustrative purposes, showing a number of heretofore unrecognized properties that may be important from a policymaking perspective.
    JEL: C52 E1 E32 E37
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:17906&r=for
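    The posterior predictive (discrepancy) analysis of Gelman et al. (1996) that the paper adapts can be illustrated with a toy example: draw parameters from the posterior, simulate replicated data, and compare a statistic of interest on the replications with its observed value. The Python sketch below uses a simple normal-mean model, not a DSGE model, and the discrepancy statistic is an arbitrary illustrative choice.
      import numpy as np

      rng = np.random.default_rng(7)
      y = rng.normal(1.0, 1.0, 100)                    # "observed" data
      n, sigma = len(y), 1.0

      # posterior for the mean under a flat prior with known sigma: N(ybar, sigma^2/n)
      post_draws = rng.normal(y.mean(), sigma / np.sqrt(n), 2000)

      def discrepancy(data):
          return np.abs(data).max()                    # illustrative statistic of interest

      exceed = 0
      for mu in post_draws:
          y_rep = rng.normal(mu, sigma, n)             # replicated dataset under the model
          exceed += discrepancy(y_rep) >= discrepancy(y)

      print(exceed / len(post_draws))                  # posterior predictive p-value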
  16. By: Ana Conte (University of Westminster, London, UK, and Max-Planck-Institute of Economics, Jena); John D. Hey (University of York, UK)
    Abstract: The recent spate of theoretical models of behaviour under ambiguity can be partitioned into two sets: those involving multiple priors (in which the probabilities of the various events are not known but probabilities can be attached to the various possible values for the probabilities) and those not involving multiple priors. This paper concentrates on the first set and provides an experimental investigation into recently proposed theories. Using an appropriate experimental interface, in which the probabilities on the various possibilities are explicitly stated, we examine the fitted and predictive power of the various theories. We first estimate subject by subject, and then we estimate and predict using a mixture model over the contending theories. The individual estimates suggest that 25% of our 149 subjects have behaviour consistent with Expected Utility, 54% with the Smooth Model (of Klibanoff et al., 2005), 12% with Rank Dependent Expected Utility and 9% with the Alpha Model (of Ghirardato et al., 2004); these figures are very close to the mixing proportions obtained from the mixture estimates. However, if we classify our subjects by the posterior probabilities (given all the evidence) of each of them being of the various types, we get 36%, 19%, 28% and 11% (for EU, Smooth, Rank Dependent and Alpha) using the estimates, and 36%, 19%, 33% and 16% using the predictions. Interestingly, the older models (EU and RD) seem to fare relatively better, suggesting that representing ambiguity through multiple priors is not perceived as the correct representation by subjects.
    Keywords: Alpha Model, Ambiguity, Expected Utility, Mixture Models, Rank Dependent Expected Utility, Smooth Model
    JEL: D81 C91 C23
    Date: 2012–01–13
    URL: http://d.repec.org/n?u=RePEc:jrp:jrpwrp:2011-068&r=for
  17. By: Josef Teichmann; Mario V. Wüthrich
    Abstract: We present an arbitrage-free non-parametric yield curve prediction model which takes the full (discretized) yield curve as its state variable. We believe that absence of arbitrage is an important model feature in the case of highly correlated data, as is the case for interest rates. Furthermore, the model structure allows a clear separation of the tasks of estimating the volatility structure and of calibrating market prices of risk. The empirical part includes tests of the modeling assumptions, back-testing and a comparison with the Vasiček short rate model.
    Date: 2012–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1203.2017&r=for
  18. By: Harriet Orcutt Duleep (Research Professor of Public Policy, Thomas Jefferson Program in Public Policy, The College of William & Mary); David Jaeger (Professor of Economics, The Graduate Center, City University of New York)
    Abstract: This paper begins an exploration to determine whether earnings growth, as a measure of the propensity to invest in human capital, is a valuable variable for predicting mortality. To ensure its robustness and general applicability to ongoing Social Security models, the usefulness of earnings growth as a predictor of mortality will be explored in multiple time periods. This paper begins that process by reporting preliminary results for an early time period using the 1973 CPS-SSA-IRS Exact Match file. In addition to presenting preliminary results, the paper also describes how the data challenges associated with the pre-1978 administrative records on earnings and mortality are met.
    Date: 2011–09
    URL: http://d.repec.org/n?u=RePEc:mrr:papers:wp257&r=for

This nep-for issue is ©2012 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.