
on Econometric Time Series 
By:  Leon Bettendorf (Faculty of Economics, Erasmus Universiteit Rotterdam); Stephanie van der Geest (Faculty of Economics, Erasmus Universiteit Rotterdam); Gerard Kuper (University of Groningen) 
Abstract:  This paper analyzes adjustments in Dutch retail gasoline prices. We estimate an error correction model on changes in the daily retail price for gasoline (taxes excluded) for the period 1996–2004, accounting for volatility clustering by estimating an EGARCH model. It turns out that the volatility process is asymmetric: an unexpected increase in the producer price has a larger effect on the variance of the producer price than an unexpected decrease. We do not find strong evidence for amount asymmetry. However, there is a faster reaction to upward changes in spot prices than to downward changes. This implies timing or pattern asymmetry. This asymmetry starts three days after the change in the spot price and lasts for four days. 
Keywords:  Asymmetry; Retail gasoline prices; Volatility 
JEL:  D43 E31 
Date:  2005–04–22 
URL:  http://d.repec.org/n?u=RePEc:dgr:uvatin:20050040&r=ets 
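As a rough illustration of the asymmetric EGARCH volatility the abstract describes, here is a minimal pure-Python sketch. All function names and parameter values are illustrative, not taken from the paper: with gamma > 0, a positive shock raises next-period log-variance more than an equally sized negative shock.

```python
import math
import random

def next_log_var(log_s2, z, omega=-0.1, beta=0.95, alpha=0.1, gamma=0.05):
    """One step of the EGARCH(1,1) log-variance recursion:
    log sigma^2_t = omega + beta*log sigma^2_{t-1}
                    + alpha*(|z_{t-1}| - E|z|) + gamma*z_{t-1}.
    gamma > 0 gives positive shocks a larger impact on variance."""
    e_abs_z = math.sqrt(2.0 / math.pi)  # E|z| for a standard normal z
    return omega + beta * log_s2 + alpha * (abs(z) - e_abs_z) + gamma * z

def simulate_egarch(n, omega=-0.1, beta=0.95, alpha=0.1, gamma=0.05, seed=0):
    """Simulate n returns r_t = sigma_t * z_t from the recursion above,
    starting the log-variance at its unconditional level."""
    rng = random.Random(seed)
    log_s2 = omega / (1.0 - beta)
    z_prev = 0.0
    returns = []
    for _ in range(n):
        log_s2 = next_log_var(log_s2, z_prev, omega, beta, alpha, gamma)
        z = rng.gauss(0.0, 1.0)
        returns.append(math.exp(0.5 * log_s2) * z)
        z_prev = z
    return returns

# with gamma > 0, a +2 shock raises next-period log-variance more than a -2 shock
up = next_log_var(0.0, 2.0)
down = next_log_var(0.0, -2.0)
```

The gap `up - down` equals `2 * gamma * |z|`, which is exactly the asymmetry term the EGARCH specification isolates.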
By:  Jesús Fernández-Villaverde; Juan Francisco Rubio-Ramírez; Manuel Santos 
Abstract:  This paper studies the econometrics of computed dynamic models. Since these models generally lack a closed-form solution, economists approximate the policy functions of the agents in the model with numerical methods. But this implies that, instead of the exact likelihood function, the researcher can evaluate only an approximated likelihood associated with the approximated policy function. What are the consequences for inference of the use of approximated likelihoods? First, we show that as the approximated policy function converges to the exact policy, the approximated likelihood also converges to the exact likelihood. Second, we prove that the approximated likelihood converges at the same rate as the approximated policy function. Third, we find that the error in the approximated likelihood gets compounded with the size of the sample. Fourth, we discuss convergence of Bayesian and classical estimates. We complete the paper with three applications to document the quantitative importance of our results. 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200427&r=ets 
By:  Peter Reinhard Hansen; Asger Lunde; James M. Nason 
Abstract:  The paper introduces the model confidence set (MCS) and applies it to the selection of forecasting models. An MCS is a set of models that is constructed so that it will contain the “best” forecasting model, given a level of confidence. Thus, an MCS is analogous to a confidence interval for a parameter. The MCS acknowledges the limitations of the data, so that uninformative data yield an MCS with many models, whereas informative data yield an MCS with only a few models. We revisit the empirical application in Stock and Watson (1999) and apply the MCS procedure to their set of inflation forecasts. In the first, pre-1984 subsample we obtain an MCS that contains only a few models, notably versions of the Solow-Gordon Phillips curve. On the other hand, the second, post-1984 subsample contains little information and results in a large MCS. Yet the random walk forecast is not contained in the MCS for either of the samples. This outcome shows that the random walk forecast is inferior to inflation forecasts based on Phillips curve-like relationships. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200507&r=ets 
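To make the elimination idea behind an MCS concrete, here is a heavily stylized sketch. The function name, data layout, and the use of a plain normal critical value are all illustrative: the actual procedure of Hansen, Lunde and Nason uses bootstrap-based equivalence tests, which this toy version only stands in for.

```python
import math

def naive_mcs(losses, crit=1.96):
    """Stylized model-confidence-set sketch.

    losses: dict model_name -> list of per-period forecast losses.
    Repeatedly drops the model whose mean loss most exceeds the
    cross-model average (by t-statistic) while that excess is
    significant; what remains is the set of models that cannot be
    distinguished from the best at the chosen confidence level."""
    surviving = dict(losses)
    while len(surviving) > 1:
        names = list(surviving)
        T = len(next(iter(surviving.values())))
        avg = [sum(surviving[m][t] for m in names) / len(names) for t in range(T)]
        tstats = {}
        for m in names:
            d = [surviving[m][t] - avg[t] for t in range(T)]
            mean = sum(d) / T
            var = sum((x - mean) ** 2 for x in d) / (T - 1)
            tstats[m] = mean / math.sqrt(var / T) if var > 0 else 0.0
        worst = max(names, key=lambda m: tstats[m])
        if tstats[worst] > crit:   # equivalence rejected: drop the worst model
            del surviving[worst]
        else:
            break                  # remaining models are statistically indistinguishable
    return set(surviving)

# two similar models survive together; a clearly worse one is eliminated
kept = naive_mcs({"a": [1.0, 1.1] * 25,
                  "b": [1.1, 1.0] * 25,
                  "bad": [3.0, 3.1] * 25})
```

Uninformative (noisy) losses would leave many models in `kept`, mirroring the paper's point that the MCS widens when the data cannot discriminate.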
By:  Jesús Fernández-Villaverde; Juan Francisco Rubio-Ramírez; Thomas Sargent 
Abstract:  The dynamics of a linear (or linearized) dynamic stochastic economic model can be expressed in terms of matrices (A, B, C, D) that define a state-space system. An associated state-space system (A, K, C, S) determines a vector autoregression (VAR) for observables available to an econometrician. We review circumstances in which the impulse response of the VAR resembles the impulse response associated with the economic model. We give four examples that illustrate a simple condition for checking whether the mapping from VAR shocks to economic shocks is invertible. The condition applies when there are equal numbers of VAR and economic shocks. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedawp:200509&r=ets 
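The simple condition the abstract mentions can be checked mechanically: with equal numbers of shocks and observables, economic shocks can be recovered from VAR innovations when all eigenvalues of A − B D⁻¹ C lie strictly inside the unit circle. Below is a minimal sketch for a two-state system with one shock and one observable; the helper name and the example matrices are illustrative.

```python
import cmath

def invertibility_check(A, B, C, D):
    """Check invertibility for the state-space system
        x' = A x + B w,   y = C x + D w
    with one shock and one observable (D a nonzero scalar):
    VAR innovations reveal the economic shocks iff every eigenvalue
    of M = A - B D^{-1} C is strictly inside the unit circle.

    A: 2x2 nested list, B: length-2 list, C: length-2 list, D: scalar."""
    M = [[A[i][j] - B[i] * C[j] / D for j in range(2)] for i in range(2)]
    # eigenvalues of a 2x2 matrix from trace and determinant
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    eigs = [(tr + disc) / 2.0, (tr - disc) / 2.0]
    return all(abs(lam) < 1.0 for lam in eigs), eigs

# illustrative systems: identical dynamics, different loading D of the shock on y
ok_big_d, _ = invertibility_check([[0.9, 0.0], [0.0, 0.5]], [1.0, 1.0], [1.0, 0.0], 1.0)
ok_small_d, _ = invertibility_check([[0.9, 0.0], [0.0, 0.5]], [1.0, 1.0], [1.0, 0.0], 0.1)
```

The second system fails the check because a small D makes B D⁻¹ C large, pushing an eigenvalue of M outside the unit circle.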
By:  Ben R. Craig; Joachim G. Keller 
Abstract:  We estimate the process underlying the pricing of American options by using higher-order lattices combined with a multigrid method. The paper also tests whether the risk-neutral densities recovered from American options provide a good forecasting tool. We use a nonparametric test of the densities that is based on the inverse probability functions and is modified to account for correlation across time between our random variables, which are uniform under the null hypothesis. We find that the densities based on the American option markets for foreign exchange do quite well for the forecasting period over which the options are thickly traded. Further, simple models fitted to the densities do about as well as more sophisticated models. 
Keywords:  Foreign exchange futures ; Options (Finance) ; Economic forecasting 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:0409&r=ets 
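The density test rests on the probability integral transform: pushing each realized value through its forecast CDF should give uniform draws when the density is correct. The sketch below shows the basic uniformity check with a Kolmogorov-Smirnov distance; names are illustrative, and the paper's correction for serial correlation across time is omitted.

```python
import math
import random

def pit_ks(realizations, cdfs):
    """Probability-integral-transform check: apply each forecast CDF to
    its realized value; under a correct density forecast the transforms
    are uniform on (0,1).  Returns the Kolmogorov-Smirnov distance of
    the transforms from the uniform CDF."""
    u = sorted(F(x) for F, x in zip(cdfs, realizations))
    n = len(u)
    return max(max(abs(u[i] - i / n), abs(u[i] - (i + 1) / n))
               for i in range(n))

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# toy check: correct forecasts give small KS, a shifted forecast a large one
rng = random.Random(1)
data = [rng.gauss(0.0, 1.0) for _ in range(500)]
ks_good = pit_ks(data, [norm_cdf] * 500)
ks_bad = pit_ks(data, [lambda x: norm_cdf(x - 1.0)] * 500)
```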
By:  N. Kundan Kishor; Evan F. Koenig 
Abstract:  Conventional VAR estimation and forecasting ignores the fact that economic data are often subject to revision many months or years after their initial release. This paper shows how VAR analysis can be modified to account for such revisions. The proposed approach assumes that government statistical releases are efficient with a finite lag. It takes no stand on whether earlier revisions are “noise” or “news.” The technique is illustrated using data on employment and the unemployment rate, real GDP and the unemployment rate, and real GDP and the GDP/consumption ratio. In each case, the proposed procedure outperforms conventional VAR analysis and the more restrictive methods for handling the data-revision problem that are found in the existing literature. 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:0501&r=ets 
By:  Tim Bollerslev; Michael Gibson; Hao Zhou 
Abstract:  This paper proposes a method for constructing a volatility risk premium, or investor risk aversion, index. The method is intuitive and simple to implement, relying on the sample moments of the recently popularized model-free realized and option-implied volatility measures. A small-scale Monte Carlo experiment suggests that the procedure works well in practice. Implementing the procedure with actual S&P 500 option-implied volatilities and high-frequency five-minute-based realized volatilities results in significant temporal dependencies in the estimated stochastic volatility risk premium, which we in turn relate to a set of underlying macro-finance state variables. We also find that the extracted volatility risk premium helps predict future stock market returns. 
Keywords:  Stochastic analysis ; Risk ; Uncertainty 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200456&r=ets 
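The two model-free ingredients the abstract names combine in a simple way: realized variance sums squared high-frequency returns, and the premium is the gap between option-implied (risk-neutral) variance and realized variance. A minimal sketch, with illustrative function names and toy numbers:

```python
def realized_variance(intraday_returns):
    """Realized variance for one day: the sum of squared
    high-frequency (e.g. five-minute) returns."""
    return sum(r * r for r in intraday_returns)

def volatility_risk_premium(implied_var, realized_var):
    """Volatility-risk-premium sketch in the spirit of the paper: the
    per-period gap between option-implied variance (a risk-neutral
    expectation) and realized variance proxies investor risk aversion."""
    return [iv - rv for iv, rv in zip(implied_var, realized_var)]

# toy day: two five-minute returns of 1% and -2%
rv = realized_variance([0.01, -0.02])
vrp = volatility_risk_premium([0.04], [rv])
```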
By:  Mark W. French 
Abstract:  The cycle in output and hours worked is not symmetric: it behaves differently around recessions than in expansions. Similarly, the trend in multifactor productivity (MFP) seems to pass through different regimes; there was an extended period of slow MFP growth from about 1973 through 1995, and faster growth thereafter. Typical linear models and linear filters such as the Kalman filter deal poorly with asymmetry and regime changes. This paper attempts to determine more accurately and quickly any shifts in trend MFP growth, using a nonlinear Kalman/Markov filter with a model of the unobserved components of output and hours. This hybrid model incorporates regimeswitching in the business cycle and in the trend growth of MFP. Estimation results are promising. The hybrid model and associated filter appear to be faster than the basic Kalman filter in detecting turning points in the smoothed conditional mean estimate of trend MFP growth; in addition, the hybrid model avoids some of the Kalman filter's biases in reconstructing historical business cycles and the MFP trend. 
Keywords:  Business cycles ; Econometric models ; Nonlinear theories 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200512&r=ets 
By:  Todd E. Clark; Michael W. McCracken 
Abstract:  This paper presents analytical, Monte Carlo, and empirical evidence on the effectiveness of combining recursive and rolling forecasts when linear predictive models are subject to structural change. We first provide a characterization of the bias-variance trade-off faced when choosing between either the recursive and rolling schemes or a scalar convex combination of the two. From that, we derive pointwise optimal, time-varying and data-dependent observation windows and combining weights designed to minimize mean square forecast error. We then proceed to consider other methods of forecast combination, including Bayesian methods that shrink the rolling forecast toward the recursive one, as well as Bayesian model averaging. Monte Carlo experiments and several empirical examples indicate that although the recursive scheme is often difficult to beat, when gains can be obtained, some form of shrinkage can often provide improvements in forecast accuracy relative to forecasts made using the recursive scheme or the rolling scheme with a fixed window width. 
Keywords:  Forecasting 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedkrw:rwp0410&r=ets 
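To fix ideas on the two schemes being combined, here is a minimal sketch using simple mean forecasts (the function name, window, and data are illustrative): the recursive scheme averages the whole sample, the rolling scheme only the last `window` observations, and the paper's scalar convex combination interpolates between them.

```python
def combined_forecast(y, window, weight):
    """One-step-ahead mean forecast under a convex combination of the
    recursive scheme (all past data) and the rolling scheme (last
    `window` observations).  weight = 1 gives pure recursive,
    weight = 0 pure rolling."""
    recursive = sum(y) / len(y)
    rolling = sum(y[-window:]) / window
    return weight * recursive + (1.0 - weight) * rolling

# after a structural break in the mean, the rolling forecast adapts faster,
# while the recursive forecast remains contaminated by pre-break data
y = [0.0] * 40 + [1.0] * 10           # mean shifts at observation 40
rec = combined_forecast(y, 10, 1.0)   # recursive: dragged toward the old mean
roll = combined_forecast(y, 10, 0.0)  # rolling: sees only post-break data
```

This is the bias-variance trade-off in miniature: the rolling forecast has less post-break bias but, with a short window, a noisier estimate.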
By:  Siddhartha Chib; Michael J. Dueker 
Abstract:  This article presents a non-Markovian regime-switching model in which the regime states depend on the sign of an autoregressive latent variable. The magnitude of the latent variable indexes the 'strength' of the state, or how deeply the system is embedded in the current regime. In this model, regimes have dynamics, not only persistence, so that one regime can gradually give way to another. In this framework, it is natural to allow the autoregressive latent variable to be endogenous, so that regimes are determined jointly with the observed data. We apply the model to GDP growth, as in Hamilton (1989), Albert and Chib (1993) and Filardo and Gordon (1998), to illustrate the relation of the regimes to NBER-dated recessions and the time-varying expected durations of regimes. The article makes use of the Metropolis-Hastings algorithm to make multi-move draws of the latent regime strength variable, where the extended Kalman filter provides a valid proposal density for the latent variable. 
Keywords:  Timeseries analysis ; Business cycles 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2004030&r=ets 
By:  Massimo Guidolin; Allan Timmermann 
Abstract:  This paper considers a variety of econometric models for the joint distribution of US stock and bond returns in the presence of regime-switching dynamics. While simple two- or three-state models capture the univariate dynamics in bond and stock returns, a more complicated four-state model with regimes characterized as crash, slow growth, bull and recovery states is required to capture their joint distribution. The transition probability matrix of this model has a very particular form. Exits from the crash state are almost always to the recovery state and occur with close to a 50 percent chance, suggesting a bounce-back effect from the crash to the recovery state. 
Keywords:  Timeseries analysis ; Stocks ; Bond market 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2005003&r=ets 
By:  Silvia Goncalves; Massimo Guidolin 
Abstract:  One key stylized fact in the empirical option pricing literature is the existence of an implied volatility surface (IVS). The usual approach consists of fitting a linear model linking the implied volatility to the time to maturity and the moneyness, for each cross section of options data. However, recent empirical evidence suggests that the parameters characterizing the IVS change over time. In this paper we study whether the resulting predictability patterns in the IVS coefficients may be exploited in practice. We propose a two-stage approach to modeling and forecasting the S&P 500 index options IVS. In the first stage we model the surface along the cross-sectional moneyness and time-to-maturity dimensions, similarly to Dumas et al. (1998). In the second stage we model the dynamics of the cross-sectional first-stage implied volatility surface coefficients by means of vector autoregression models. We find not only that the S&P 500 implied volatility surface can be successfully modeled, but also that its movements over time are highly predictable in a statistical sense. We then examine the economic significance of this statistical predictability with mixed findings. Whereas profitable delta-hedged positions can be set up that exploit the dynamics captured by the model under moderate transaction costs and when trading rules are selective in terms of expected gains from the trades, most of this profitability disappears when we increase the level of transaction costs and trade multiple contracts off wide segments of the IVS. This suggests that predictability of the time-varying S&P 500 implied volatility surface may not be inconsistent with market efficiency. 
Keywords:  Assets (Accounting) ; Prices 
Date:  2005 
URL:  http://d.repec.org/n?u=RePEc:fip:fedlwp:2005010&r=ets 
By:  Gary M. Koop; Simon M. Potter 
Abstract:  This paper develops a new approach to change-point modeling that allows for an unknown number of change points in the observed sample. Our model assumes that regime durations have a Poisson distribution. The model approximately nests the two most common approaches: the time-varying parameter model with a change point every period and the change-point model with a small number of regimes. We focus on the construction of reasonable hierarchical priors both for regime durations and for the parameters that characterize each regime. A Markov chain Monte Carlo posterior sampler is constructed to estimate a change-point model for conditional means and variances. We find that our techniques work well in an empirical exercise involving U.S. inflation and GDP growth. Empirical results suggest that the number of change points is larger than previously estimated in these series and the implied model is similar to a time-varying parameter model with stochastic volatility. 
Keywords:  Econometric models ; Timeseries analysis 
Date:  2004 
URL:  http://d.repec.org/n?u=RePEc:fip:fednsr:196&r=ets 
By:  Robert F. Engle (New York University, Stern School of Business, Finance Department); Giampiero M. Gallo (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti") 
Abstract:  Many ways exist to measure and model financial asset volatility. In principle, as the frequency of the data increases, the quality of forecasts should improve. Yet there is no consensus about a "true" or "best" measure of volatility. In this paper we propose to jointly consider absolute daily returns, the daily high-low range and daily realized volatility to develop a forecasting model based on their conditional dynamics. As all are non-negative series, we develop a multiplicative error model that is consistent and asymptotically normal under a wide range of specifications for the error density function. The estimation results show significant interactions between the indicators. We also show that one-month-ahead forecasts match well (both in and out of sample) the market-based volatility measure provided by an average of implied volatilities of index options, as measured by the VIX. 
Keywords:  volatility modeling, volatility forecasting, GARCH, VIX, high-low range, realized volatility. 
JEL:  C22 C32 C53 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2003_07&r=ets 
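The multiplicative error model treats a non-negative series (absolute return, high-low range, or realized volatility) as x_t = mu_t * eps_t with E[eps_t] = 1, where the conditional mean follows a GARCH-like recursion. A minimal univariate sketch (the paper's model is multivariate with cross-indicator interactions; names and parameters here are illustrative):

```python
def mem_filter(x, omega, alpha, beta):
    """Univariate multiplicative-error-model filter: fitted conditional
    means for a non-negative series under the recursion
        mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1},
    started at the unconditional mean omega / (1 - alpha - beta)."""
    mu = omega / (1.0 - alpha - beta)
    mus = []
    for xt in x:
        mus.append(mu)
        mu = omega + alpha * xt + beta * mu
    return mus

# at the unconditional mean, observing x = mu leaves the forecast unchanged
fitted = mem_filter([1.0, 2.0], omega=0.1, alpha=0.2, beta=0.7)
```

The joint model in the paper lets, say, lagged realized volatility enter the conditional mean of the range; this sketch shows only the single-indicator recursion.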
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti"); Simon J. Godsill (Cambridge University Engineering Department, Signal Processing Lab) 
Abstract:  In this paper we propose an online Bayesian filtering and smoothing method for time series models with heavy-tailed alpha-stable noise, with a particular focus on TVAR models. Alpha-stable processes have been shown in the past to be a good model for many naturally occurring noise sources. We first point out how a filter that fails to take the heavy-tailed character of the noise into account performs poorly, and then examine how an alpha-stable-based particle filter can be devised to overcome this problem. The filtering methodology is based on a scale-mixture-of-normals (SMiN) representation of the alpha-stable distribution, which allows efficient Rao-Blackwellised implementation within a conditionally Gaussian framework, and requires no direct evaluation of the alpha-stable density, which is in general unavailable in closed form. The methodology is shown to work well, outperforming traditional Gaussian methods on both simulated data and real audio data sets. 
Keywords:  Particle filters, Kalman filter, Alpha-stable distributions, Scale mixture of normals. 
Date:  2004–05–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_05&r=ets 
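A practical point behind such particle filters is that alpha-stable noise can be simulated without ever evaluating the (generally unavailable) density. A sketch of the standard Chambers-Mallows-Stuck transform for the symmetric case, needing only a uniform and an exponential variate (function name and parameter choices are illustrative):

```python
import math
import random

def sym_stable(alpha, rng):
    """Draw from a standard symmetric alpha-stable law (1 < alpha <= 2)
    via the Chambers-Mallows-Stuck transform:
        U ~ Uniform(-pi/2, pi/2),  W ~ Exponential(1),
        X = sin(alpha*U) / cos(U)^(1/alpha)
            * (cos(U - alpha*U) / W)^((1-alpha)/alpha).
    alpha = 2 recovers a Gaussian; smaller alpha gives heavier tails."""
    u = rng.uniform(-math.pi / 2.0, math.pi / 2.0)
    w = rng.expovariate(1.0)
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

rng = random.Random(42)
draws = [sym_stable(1.5, rng) for _ in range(2000)]
```

The heavy tails show up as occasional very large draws, which is exactly what a purely Gaussian filter mishandles.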
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti"); Giorgio Calzolari (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti") 
Abstract:  The alpha-stable family of distributions constitutes a generalization of the Gaussian distribution, allowing for asymmetry and thicker tails. Its practical usefulness is coupled with a marked theoretical appeal, as it stems from a generalized version of the central limit theorem in which the assumption of finite variance is replaced by a less restrictive assumption of suitably regular tail behavior. Estimation difficulties have, however, hindered its diffusion among practitioners. Since simulated values from alpha-stable distributions can be straightforwardly obtained, the indirect inference approach could prove useful in overcoming these estimation difficulties. In this paper we describe how to implement such a method using a skew-t distribution as an auxiliary model. The indirect inference approach is introduced in the setting of estimating the distribution parameters and then extended to linear time series models with alpha-stable disturbances. The performance of this estimation method is then assessed on simulated data. An application to time series models for the inflation rate concludes the paper. 
Keywords:  Indirect inference, Alpha-stable distributions, Heavy tails. 
Date:  2004–06–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_07&r=ets 
By:  Marco J. Lombardi (Università degli Studi di Firenze, Dipartimento di Statistica "G. Parenti") 
Abstract:  The alpha-stable family of distributions constitutes a generalization of the Gaussian distribution, allowing for asymmetry and thicker tails. Its practical usefulness is coupled with a marked theoretical appeal, given that it stems from a generalized version of the central limit theorem in which the assumption of finite variance is replaced by a less restrictive assumption of suitably regular tail behavior. The absence of a closed-form density and the associated estimation difficulties have, however, hindered its diffusion among practitioners. In this paper I introduce a novel approach for Bayesian inference on alpha-stable distributions that resorts to an FFT of the characteristic function in order to approximate the likelihood function; the posterior distributions of the parameters are then produced via a random-walk MCMC method. Contrary to the other MCMC schemes proposed in the literature, the proposed approach does not require auxiliary variables, and so it is less computationally expensive, especially when large sample sizes are involved. A simulation exercise highlights the empirical properties of the sampler; an application to audio noise data demonstrates how this estimation scheme performs in practice. 
Keywords:  Alpha-stable distributions, Infinite variance, MCMC. 
Date:  2004–09–01 
URL:  http://d.repec.org/n?u=RePEc:fir:econom:wp2004_11&r=ets 
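The likelihood approximation rests on inverting the characteristic function, which for a standard symmetric alpha-stable law is phi(t) = exp(-|t|^alpha). The sketch below uses a plain trapezoid rule for clarity, whereas the paper grids the inversion with an FFT for speed; the function name and truncation settings are illustrative. Two closed-form members of the family (Cauchy at alpha = 1, Gaussian at alpha = 2) provide sanity checks.

```python
import math

def sym_stable_pdf(x, alpha, t_max=50.0, n=20000):
    """Symmetric alpha-stable density by numerical inversion of the
    characteristic function phi(t) = exp(-|t|^alpha):
        f(x) = (1/pi) * integral_0^infinity cos(t*x) * exp(-t^alpha) dt,
    approximated with a trapezoid rule on [0, t_max]."""
    h = t_max / n
    # trapezoid endpoints: integrand is 1 at t = 0, ~0 at t = t_max
    total = 0.5 * (1.0 + math.cos(t_max * x) * math.exp(-t_max ** alpha))
    for i in range(1, n):
        t = i * h
        total += math.cos(t * x) * math.exp(-t ** alpha)
    return h * total / math.pi

# alpha = 1 is Cauchy: f(0) = 1/pi;  alpha = 2 is N(0, 2): f(0) = 1/(2*sqrt(pi))
cauchy0 = sym_stable_pdf(0.0, 1.0)
gauss0 = sym_stable_pdf(0.0, 2.0)
```

Evaluating this density on a grid of parameter values inside an MCMC loop is the expensive step that the paper's FFT formulation accelerates.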
By:  Jan G. De Gooijer; Rob J. Hyndman 
Abstract:  We review the past 25 years of time series research that has been published in journals managed by the International Institute of Forecasters (Journal of Forecasting 1982–1985; International Journal of Forecasting 1985–2005). During this period, over one third of all papers published in these journals concerned time series forecasting. We also review highly influential works on time series forecasting that have been published elsewhere during this period. Enormous progress has been made in many areas, but we find that there are a large number of topics in need of further development. We conclude with comments on possible future research directions in this field. 
Keywords:  Accuracy measures; ARCH model; ARIMA model; Combining; Count data; Densities; Exponential smoothing; Kalman Filter; Long memory; Multivariate; Neural nets; Nonlinearity; Prediction intervals; Regime switching models; Robustness; Seasonality; State space; Structural models; Transfer function; Univariate; VAR. 
JEL:  C53 C22 C32 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:200512&r=ets 
By:  Rob J. Hyndman; Anne B. Koehler 
Abstract:  We discuss and compare measures of accuracy of univariate time series forecasts. The methods used in the M-competition and the M3-competition, and many of the measures recommended by previous authors on this topic, are found to be inadequate, and many of them are degenerate in commonly occurring situations. Instead, we propose that the mean absolute scaled error become the standard measure for comparing forecast accuracy across multiple time series. 
Keywords:  Forecast accuracy, Forecast evaluation, Forecast error measures, M-competition, Mean absolute scaled error. 
JEL:  C53 C52 C22 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:200513&r=ets 
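The proposed measure scales out-of-sample forecast errors by the in-sample mean absolute error of the one-step naive (random-walk) forecast, so it stays well defined when actuals pass through zero and is comparable across series of different scales. A minimal sketch (the function name is illustrative; the formula follows the standard definition):

```python
def mase(actual, forecast, training):
    """Mean absolute scaled error: out-of-sample MAE divided by the
    in-sample MAE of the one-step naive forecast on the training data.
    MASE < 1 means beating the naive method on average."""
    naive_mae = (sum(abs(training[t] - training[t - 1])
                     for t in range(1, len(training)))
                 / (len(training) - 1))
    errors = [abs(a - f) for a, f in zip(actual, forecast)]
    return sum(errors) / len(errors) / naive_mae

# training data stepping by 1 each period gives naive MAE = 1,
# so the MASE here is just the out-of-sample MAE
score = mase(actual=[5.0, 6.0], forecast=[5.0, 5.0], training=[1.0, 2.0, 3.0, 4.0])
```

Unlike percentage errors, this stays finite when an actual value is zero, which is one of the degeneracies the paper criticizes in earlier measures.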
By:  George Kapetanios (Queen Mary, University of London); Elias Tzavalis (Queen Mary, University of London) 
Abstract:  This paper applies a new model of structural breaks developed by Kapetanios and Tzavalis (2004) to investigate whether there are structural changes in the mean-reversion parameter of US macroeconomic series. Ignoring such breaks may lead to spurious evidence of unit roots in the autoregressive parameters of economic series. Our model specifies both the timing and the size of breaks as stochastic. We apply the model to a variety of macroeconomic and financial series from the US. 
Keywords:  Structural breaks, State space model, Nonlinearity. 
JEL:  E32 C13 C22 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp537&r=ets 
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  The problem of structural change justifiably attracts considerable attention in econometrics. A number of different paradigms have been adopted, ranging from structural breaks, which are sudden and rare, to time-varying coefficient models, which exhibit structural change more frequently and continuously. This paper is concerned with parametric econometric models whose coefficients change deterministically and smoothly over time. In particular, we provide and discuss asymptotic tests of the null hypothesis of no structural change against the alternative of smooth deterministic structural change. However, the finite-sample performance of these tests is not good, as they over-reject significantly. To address this problem we propose and justify bootstrap-based tests, which perform well in an extensive Monte Carlo study. 
Keywords:  Structural change, Nonstationarity, Deterministic timevariation. 
JEL:  C10 C14 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp539&r=ets 
By:  George Kapetanios (Queen Mary, University of London) 
Abstract:  The problem of structural change justifiably attracts considerable attention in econometrics. A number of different paradigms have been adopted, ranging from structural breaks, which are sudden and rare, to time-varying coefficient models, which exhibit structural change more frequently and continuously. This paper is concerned with parametric econometric models whose coefficients change deterministically and smoothly over time. In particular, we provide a new estimator for unconditional time-varying variances in regression models. A small Monte Carlo study indicates that the method works reasonably well for moderately large sample sizes. 
Keywords:  Structural change, Nonstationarity, Deterministic timevariation. 
JEL:  C10 C14 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp540&r=ets 
By:  Gonzalo Camba-Mendez (European Central Bank); George Kapetanios (Queen Mary, University of London) 
Abstract:  Testing the rank of a matrix of estimated parameters is key in a large variety of econometric modelling scenarios. This paper describes general methods to test for the rank of a matrix, and provides details on a variety of modelling scenarios in the econometrics literature where these tests are required. 
Keywords:  Multiple time series, Model specification, Tests of rank. 
JEL:  C12 C15 C32 
Date:  2005–05 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp541&r=ets 
By:  Hashem Dezhbakhsh (Emory University); Daniel Levy (Bar-Ilan University) 
Abstract:  Although linearly interpolated series are often used in economics, little has been done to examine the effects of interpolation on time series properties and on statistical inference. We show that linear interpolation of a trend-stationary series superimposes a ‘periodic’ structure on the moments of the series. Using conventional time series methods to make inference about the interpolated series may therefore be invalid. Also, the interpolated series may exhibit more shock persistence than the original trend-stationary series. 
Keywords:  Linear Interpolation, Trend-Stationary Series, Shock Persistence, Periodic Properties of Time Series 
JEL:  C10 C22 C82 E37 
Date:  2005–05–15 
URL:  http://d.repec.org/n?u=RePEc:wpa:wuwpem:0505004&r=ets 
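The mechanism behind the periodic structure is easy to see in code: filling each low-frequency gap with equal linear steps makes the first differences of the interpolated series constant within every block, so second moments inherit a repeating within-period pattern. A minimal sketch (the function name and toy annual series are illustrative):

```python
def linear_interpolate(low_freq, k):
    """Spread a low-frequency series to frequency k by linear
    interpolation: each gap between consecutive low-frequency
    observations is filled with k equal steps."""
    out = []
    for a, b in zip(low_freq, low_freq[1:]):
        step = (b - a) / k
        out.extend(a + j * step for j in range(k))
    out.append(low_freq[-1])
    return out

# annual observations interpolated to quarterly frequency
annual = [100.0, 104.0, 102.0]
quarterly = linear_interpolate(annual, 4)
# first differences are constant within each year: a 'periodic' block structure
diffs = [y - x for x, y in zip(quarterly, quarterly[1:])]
```

Within the first year every quarterly change is 1.0 and within the second every change is -0.5; sample autocovariances of such a series repeat with the interpolation period, which is why conventional inference on it can mislead.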
By:  Torben G. Andersen (Kellogg School of Management, Northwestern University and NBER); Tim Bollerslev (Department of Economics, Duke University and NBER); Francis X. Diebold (Department of Economics, University of Pennsylvania and NBER) 
Abstract:  A rapidly growing literature has documented important improvements in volatility measurement and forecasting performance through the use of realized volatilities constructed from high-frequency returns coupled with relatively simple reduced-form time series modeling procedures. Building on recent theoretical results from Barndorff-Nielsen and Shephard (2003c,d) for related bipower variation measures involving the sum of high-frequency absolute returns, the present paper provides a practical framework for nonparametrically measuring the jump component in realized volatility measurements. Exploiting these ideas for a decade of high-frequency five-minute returns for the DM/$ exchange rate, the S&P 500 market index, and the 30-year U.S. Treasury bond yield, we find the jump component of the price process to be distinctly less persistent than the continuous sample path component. Explicitly including the jump measure as an additional explanatory variable in an easy-to-implement reduced-form model for realized volatility results in highly significant jump coefficient estimates at the daily, weekly and quarterly forecast horizons. 
Keywords:  Continuous-time methods; jumps; quadratic variation; realized volatility; bipower variation; high-frequency data; volatility forecasting; HAR-RV model 
JEL:  C1 G1 
Date:  2003–02–01 
URL:  http://d.repec.org/n?u=RePEc:pen:papers:03025&r=ets 
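The nonparametric jump measure falls out of two sums over a day's high-frequency returns: realized variance (squared returns) converges to total quadratic variation, continuous part plus jumps, while bipower variation (scaled products of adjacent absolute returns) is robust to jumps and converges to the continuous part only. A minimal sketch, with illustrative names and toy return series:

```python
import math

def realized_and_bipower(returns):
    """Realized variance, bipower variation, and the nonparametric jump
    measure max(RV - BV, 0) from one day of high-frequency returns.
    BV scales the sum of adjacent absolute-return products by
    mu1^{-2}, where mu1 = E|Z| = sqrt(2/pi) for standard normal Z."""
    rv = sum(r * r for r in returns)
    mu1 = math.sqrt(2.0 / math.pi)
    bv = (sum(abs(a) * abs(b) for a, b in zip(returns, returns[1:]))
          / (mu1 * mu1))
    return rv, bv, max(rv - bv, 0.0)

# a single large return (a jump) inflates RV but barely moves BV,
# because the jump enters BV only through products with small neighbors
smooth = [0.001] * 100
jumpy = smooth[:50] + [0.05] + smooth[50:]
_, _, jump_smooth = realized_and_bipower(smooth)
_, _, jump_jumpy = realized_and_bipower(jumpy)
```

Feeding the continuous and jump pieces separately into a reduced-form volatility regression is how the paper documents their very different persistence.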