nep-for New Economics Papers
on Forecasting
Issue of 2007‒02‒10
ten papers chosen by
Rob J Hyndman
Monash University

  1. How Far Can Forecasting Models Forecast? Forecast Content Horizons for Some Important Macroeconomic Variables By John W. Galbraith; Greg Tkacz
  2. A Panel Data Approach to Economic Forecasting: The Bias-Corrected Average Forecast By João Victor Issler; Luiz Renato Regis de Oliveira Lima
  3. Open economy DSGE-VAR forecasting and policy analysis - head to head with the RBNZ published forecasts By Kirdan Lees; Troy Matheson; Christie Smith
  4. Modeling foreign exchange rates with jumps By John M Maheu; Thomas H McCurdy
  5. Testing the New Keynesian Phillips curve through Vector Autoregressive models: Results from the Euro area By Fanelli, Luca
  6. Predicting recessions with leading indicators: An application on the Icelandic economy By Bruno Eklund
  7. Constants do not stay constant because variables are varying By Kattai, Rasmus
  8. The Taylor rule: can it be supported by the data? By Leon, Costas
  9. Nonparametric estimation of time-varying covariance matrix in a slowly changing vector random walk model By Feng, Yuanhua; Yu, Keming
  10. Nowcasting and predicting data revisions in real time using qualitative panel survey data By Troy Matheson; James Mitchell; Brian Silverstone

  1. By: John W. Galbraith; Greg Tkacz
    Abstract: For stationary transformations of variables, there exists a maximum horizon beyond which forecasts can provide no more information about the variable than is present in the unconditional mean. Meteorological forecasts, typically excepting only experimental or exploratory situations, are not reported beyond this horizon; by contrast, little generally accepted information about such maximum horizons is available for economic variables. The authors estimate such content horizons for a variety of economic variables, and compare these with the maximum horizons that they observe reported in a large sample of empirical economic forecasting studies. The authors find that many published studies provide forecasts exceeding, often by substantial margins, their estimates of the content horizon for the particular variable and frequency. The authors suggest some simple reporting practices for forecasts that could potentially bring greater transparency to the process of making and interpreting economic forecasts.
    Keywords: Econometric and statistical methods, Business fluctuations and cycles
    JEL: C53
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:07-1&r=for
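The notion of a content horizon can be made concrete for an AR(1) process, where the h-step forecast MSE is sigma^2(1 - phi^(2h))/(1 - phi^2) and the unconditional variance is sigma^2/(1 - phi^2), so their ratio is 1 - phi^(2h). A minimal sketch (the tolerance criterion is illustrative, not the authors' definition):

```python
def ar1_content_horizon(phi, tolerance=0.05):
    """First horizon h at which the AR(1) h-step forecast MSE comes within
    `tolerance` of the unconditional variance. Requires |phi| < 1.
    The ratio MSE(h) / unconditional variance equals 1 - phi**(2*h),
    so the forecast is "informative" while phi**(2*h) > tolerance."""
    h = 1
    while phi ** (2 * h) > tolerance:
        h += 1
    return h
```

A persistence of 0.5 exhausts its forecast content within a few periods, while 0.9 stays informative for roughly fifteen, which is the intuition behind comparing published forecast horizons with estimated content horizons.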
  2. By: João Victor Issler (EPGE/FGV); Luiz Renato Regis de Oliveira Lima (EPGE/FGV)
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:fgv:epgewp:642&r=for
  3. By: Kirdan Lees; Troy Matheson; Christie Smith (Reserve Bank of New Zealand)
    Abstract: We evaluate the performance of an open economy DSGE-VAR model for New Zealand along both forecasting and policy dimensions. We show that forecasts from a DSGE-VAR and a 'vanilla' DSGE model are competitive with, and in some dimensions superior to, the Reserve Bank of New Zealand's official forecasts. We also use the estimated DSGE-VAR structure to identify optimal policy rules that are consistent with the Reserve Bank's Policy Targets Agreement. Optimal policy rules under parameter uncertainty prove to be relatively similar to the certainty case. The optimal policies react aggressively to inflation and contain a large degree of interest rate smoothing, but place a low weight on responding to output or the change in the nominal exchange rate.
    JEL: C51 E52 F41
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2007/01&r=for
  4. By: John M Maheu; Thomas H McCurdy
    Abstract: We propose a new discrete-time model of returns in which jumps capture persistence in the conditional variance and higher-order moments. Jump arrival is governed by a heterogeneous Poisson process. The intensity is directed by a latent stochastic autoregressive process, while the jump-size distribution allows for conditional heteroskedasticity. Model evaluation focuses on the dynamics of the conditional distribution of returns using density and variance forecasts. Predictive likelihoods provide a period-by-period comparison of the performance of our heterogeneous jump model relative to conventional SV and GARCH models. Further, in contrast to previous studies on the importance of jumps, we utilize realized volatility to assess out-of-sample variance forecasts.
    Keywords: jump clustering, jump dynamics, MCMC, predictive likelihood, realized volatility, Bayesian model average
    JEL: C22 C11 G1
    Date: 2007–02–02
    URL: http://d.repec.org/n?u=RePEc:tor:tecipa:tecipa-279&r=for
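The kind of process the abstract describes can be sketched as Gaussian "diffusion" returns plus Poisson jumps whose log-intensity follows a latent AR(1). All parameter values below are illustrative assumptions, not the paper's estimates:

```python
import math
import random

def simulate_jump_returns(T=500, seed=1):
    """Simulate returns as Gaussian noise plus jumps whose arrival rate
    follows a latent AR(1) in logs (illustrative parameters only)."""
    rng = random.Random(seed)
    returns = []
    log_lam = math.log(0.1)               # start at the long-run log-intensity
    for _ in range(T):
        lam = math.exp(log_lam)
        # inverse-CDF draw from Poisson(lam) for the number of jumps
        u, k, p = rng.random(), 0, math.exp(-lam)
        cum = p
        while u > cum:
            k += 1
            p *= lam / k
            cum += p
        jumps = sum(rng.gauss(-0.5, 2.0) for _ in range(k))
        returns.append(rng.gauss(0.0, 1.0) + jumps)
        # latent AR(1) for the log jump intensity (persistence 0.9)
        log_lam = 0.1 * math.log(0.1) + 0.9 * log_lam + 0.3 * rng.gauss(0, 1)
    return returns
```

Because the intensity is persistent, jumps cluster in time, which is how the model captures persistence in conditional variance and higher-order moments.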
  5. By: Fanelli, Luca
    Abstract: This paper addresses the issue of testing the 'hybrid' New Keynesian Phillips Curve (NKPC) through Vector Autoregressive (VAR) systems and likelihood methods, giving special emphasis to the case where variables are non-stationary. The idea is to use a VAR for both the inflation rate and the explanatory variable(s) to approximate the dynamics of the system and derive testable restrictions. Attention is focused on the 'inexact' formulation of the NKPC. Empirical results over the period 1971-1998 show that the NKPC is far from being a 'good first approximation' of inflation dynamics in the Euro area.
    Keywords: Inflation dynamics; Forecast model; New Keynesian Phillips Curve; Forward-looking behavior; VAR expectations.
    JEL: C32 C52 E31
    Date: 2005–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:1617&r=for

  6. By: Bruno Eklund
    Abstract: This paper focuses on the Stock and Watson methodology to forecast the future state of the business cycle in the Icelandic economy. By selecting variables available on a monthly basis that mimic the cyclical behaviour of the quarterly GDP, coincident and leading variables are identified. A factor model is then specified based on the assumption that a single common unobservable element drives the cyclical evolution of many of the Icelandic macroeconomic variables. The model is cast into a state space form providing a simple framework both for estimation and for predicting the future recession and expansion patterns. Based on the bootstrap resampling technique, a simple approach to estimate recession and expansion probabilities is developed. This method is completely nonparametric compared to the semi-parametric approach used by Stock and Watson.
    Date: 2007–01
    URL: http://d.repec.org/n?u=RePEc:ice:wpaper:wp33_bruno&r=for
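The bootstrap idea can be illustrated in its simplest form: resample model residuals, add them to a point forecast of output growth, and report the share of draws falling below a threshold. This is a hedged sketch of the generic technique, not the paper's state-space procedure, and the zero-growth "recession" threshold is our assumption:

```python
import random

def recession_probability(residuals, point_forecast, n_boot=2000, seed=0):
    """Bootstrap sketch: resample residuals around a growth point forecast
    and report the fraction of bootstrap draws below zero (an illustrative
    recession criterion, not the paper's definition)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_boot):
        draw = point_forecast + rng.choice(residuals)
        if draw < 0:
            hits += 1
    return hits / n_boot
```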
  7. By: Kattai, Rasmus
    Abstract: This paper focuses on the dynamic properties of error correction models (ECM). It is shown that the absence of structural breaks in the cointegrating vector does not necessarily imply that all parameters of the dynamic specification of the ECM are time invariant. In some cases, depending on the data generating process of the regressors, the intercept has to be time varying in order to make the long run equilibrium of a dynamic model independent of the growth rates of the variables outside the sample period, i.e. to satisfy the dynamic homogeneity condition. This is found to be common when estimating ECMs on macroeconomic time series of converging countries. Dynamic homogeneity can be achieved by imposing a state dependent dynamic homogeneity restriction on the intercept. Applying the restriction is illustrated by an empirical example using Estonian data on real wages and labour productivity.
    Keywords: dynamic homogeneity, error correction models, forecasting
    JEL: C32 C51
    URL: http://d.repec.org/n?u=RePEc:eea:boewps:wp2007-01&r=for
  8. By: Leon, Costas
    Abstract: The Taylor equation is a simple monetary policy rule that determines the Central Bank’s policy rate as a function of inflation and output. A significant body of literature verifies the consistency of the Taylor rule with the data. However, recently there has been a growing literature regarding the validity of the estimated parameters due to the non-stationarity of the interest rate. In this paper I test the consistency of the Taylor rule with the Greek data for the period 1996-2004. It appears that the data do not support the Taylor rule in the sense that they do not form a cointegrated set of variables. Therefore, the estimated parameters should be considered fragile, and forecasts of the interest rate as a function of inflation and output should not be expected to be adequately consistent with the actual data.
    Keywords: Taylor rule; Monetary policy; Central bank; EMU; Greece.
    JEL: F41 E58
    Date: 2006–08–31
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:1650&r=for
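For reference, the rule being tested is, in Taylor's original (1993) parameterisation, with an equilibrium real rate and inflation target of 2 percent and response coefficients of 0.5:

```python
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0):
    """Taylor's (1993) benchmark rule: nominal policy rate as a function of
    inflation and the output gap, both in percent."""
    return r_star + inflation + 0.5 * (inflation - pi_star) + 0.5 * output_gap
```

With inflation on target and a closed output gap, the rule prescribes a 4 percent nominal rate; the paper's point is that whether such a relation holds in the Greek data is a cointegration question.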
  9. By: Feng, Yuanhua; Yu, Keming
    Abstract: A new multivariate random walk model with slowly changing parameters is introduced and investigated in detail. Nonparametric estimation of the local covariance matrix is proposed. The asymptotic distributions, including the asymptotic biases, variances and covariances of the proposed estimators, are obtained. The properties of a weighted sum of the individual nonparametric estimators are also studied in detail. The effect of carrying estimation errors from the difference series over to the integrated processes is derived. The practical relevance of the model and estimation is illustrated by an application to several foreign exchange rates.
    Keywords: Multivariate time series; slowly changing vector random walk; local covariance matrix; kernel estimation; asymptotic properties; forecasting.
    JEL: C32 G00 C14
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:1597&r=for
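A kernel-weighted local covariance of the sort the abstract describes can be sketched as follows, for a single pair of difference series. The Gaussian kernel and the function are a simplified illustration under our own assumptions, not the authors' estimator:

```python
import math

def local_covariance(dx, dy, t, bandwidth):
    """Kernel-weighted estimate of the covariance between two difference
    series at index t: observations near t get high weight, distant ones
    low weight, so the estimate tracks a slowly changing covariance."""
    n = len(dx)
    w = [math.exp(-0.5 * ((s - t) / bandwidth) ** 2) for s in range(n)]
    sw = sum(w)
    mx = sum(wi * xi for wi, xi in zip(w, dx)) / sw
    my = sum(wi * yi for wi, yi in zip(w, dy)) / sw
    return sum(wi * (xi - mx) * (yi - my)
               for wi, xi, yi in zip(w, dx, dy)) / sw
```

Applying this at every t, for every pair of series, yields a time-varying covariance matrix of the differences; the paper's contribution includes how the resulting estimation errors propagate to the integrated (random walk) processes.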
  10. By: Troy Matheson; James Mitchell; Brian Silverstone (Reserve Bank of New Zealand)
    Abstract: The qualitative responses that firms give to business survey questions regarding changes in their own output provide a real-time signal of official output changes. The most commonly used method to produce an aggregate quantitative indicator from business survey responses - the net balance, or diffusion index - has changed little in 40 years. It focuses on the proportion of survey respondents replying "up", "the same" or "down". This paper investigates whether an improved real-time signal of official output data changes can be derived from a recently advanced method for aggregating survey data from panel responses. It also considers the ability of survey data to anticipate revisions to official output data. We find, in a New Zealand application, that exploiting the panel dimension of qualitative survey data gives a better in-sample signal about official data than traditional methods. This is achieved by giving a higher weight to firms whose answers have a close link to official data than to those whose experiences correspond only weakly or not at all. Out-of-sample, it is less clear that the method of quantification matters, with simpler and more parsimonious methods hard to improve upon. It is clear, nevertheless, that survey data, exploited in some form, help to explain revisions to official data.
    JEL: C35 C42 C53 C80
    Date: 2007–02
    URL: http://d.repec.org/n?u=RePEc:nzb:nzbdps:2007/02&r=for
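The traditional net balance (diffusion index) the abstract refers to is simply the share of respondents answering "up" minus the share answering "down":

```python
def net_balance(up, same, down):
    """Net balance (diffusion index): percentage of respondents reporting
    'up' minus percentage reporting 'down', in percentage points."""
    total = up + same + down
    return 100.0 * (up - down) / total
```

The panel-based alternative studied in the paper instead weights individual firms by how closely their answers track official data, rather than treating every respondent equally as this index does.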

This nep-for issue is ©2007 by Rob J Hyndman. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.