nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒12‒05
eight papers chosen by
Yong Yin
SUNY at Buffalo

  1. Forecasting long memory time series under a break in persistence By Florian Heinen; Philipp Sibbertsen; Robinson Kruse
  2. Nearly Efficient Likelihood Ratio Tests for Seasonal Unit Roots By Michael Jansson; Morten Ørregaard Nielsen
  3. Jump-Robust Volatility Estimation using Nearest Neighbor Truncation By Torben G. Andersen; Dobrislav Dobrev; Ernst Schaumburg
  4. On Loss Functions and Ranking Forecasting Performances of Multivariate Volatility Models By Sébastien Laurent; Jeroen Rombouts; Francesco Violente
  5. Real Time Detection of Structural Breaks in GARCH Models By Zhongfang He; John M. Maheu
  6. Time-varying Multi-regime Models Fitting by Genetic Algorithms By Francesco Battaglia; Mattheos Protopapas
  7. Fractional Integration and Cointegration: Testing the Term Structure of Interest Rates By Marco R Barassi; Dayong Zhang
  8. Resilience of Volatility By Sergey S. Stepanov

  1. By: Florian Heinen (Leibniz University of Hannover); Philipp Sibbertsen (Leibniz University of Hannover); Robinson Kruse (Aarhus University and CREATES)
    Abstract: We consider the problem of forecasting time series with long memory when the memory parameter is subject to a structural break. By means of a large-scale Monte Carlo study we show that ignoring such a change in persistence leads to substantially reduced forecasting precision. The strength of this effect depends on whether the memory parameter is increasing or decreasing over time. A comparison of six forecasting strategies allows us to conclude that pre-testing for a change in persistence is highly recommended in our setting. In addition, we provide an empirical example which underlines the importance of our findings.
    Keywords: Long memory time series, Break in persistence, Structural change, Simulation, Forecasting competition
    JEL: C15 C22 C53
    Date: 2009–11–17
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-53&r=ets
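As an illustration of the setting, the sketch below (not the authors' code; the parameter values, sample size, and seed are arbitrary choices) simulates fractionally integrated noise for two values of the memory parameter d via the truncated MA(infinity) representation, mimicking the regimes on either side of a break in persistence. The much slower autocorrelation decay under the larger d is exactly the feature a forecaster misrepresents by ignoring the break.

```python
import numpy as np

def fi_coeffs(d, n):
    # MA coefficients of an I(d) process: psi_0 = 1, psi_k = psi_{k-1} * (k - 1 + d) / k
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return psi

def simulate_fi(d, n, rng):
    # Truncated MA(infinity) representation: y_t = sum_k psi_k * eps_{t-k}
    eps = rng.standard_normal(2 * n)
    psi = fi_coeffs(d, n)
    return np.array([psi @ eps[t + n:t:-1] for t in range(n)])

def acf(x, lag):
    # Sample autocorrelation at a given lag
    x = x - x.mean()
    return (x[:-lag] @ x[lag:]) / (x @ x)

rng = np.random.default_rng(42)
y_weak = simulate_fi(0.20, 2000, rng)    # low-persistence regime
y_strong = simulate_fi(0.45, 2000, rng)  # high-persistence regime
print(acf(y_weak, 20), acf(y_strong, 20))
```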
  2. By: Michael Jansson (UC Berkeley and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: In an important generalization of zero frequency autoregressive unit root tests, Hylleberg, Engle, Granger, and Yoo (1990) developed regression-based tests for unit roots at the seasonal frequencies in quarterly time series. We develop likelihood ratio tests for seasonal unit roots and show that these tests are "nearly efficient" in the sense of Elliott, Rothenberg, and Stock (1996), i.e., that their local asymptotic power functions are indistinguishable from the Gaussian power envelope. Currently available nearly efficient testing procedures for seasonal unit roots are regression-based and require the choice of a GLS detrending parameter, which our likelihood ratio tests do not.
    Keywords: Likelihood Ratio Test, Seasonal Unit Root Hypothesis
    JEL: C12 C22
    Date: 2009–11–24
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-55&r=ets
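The HEGY regression framework the abstract refers to can be sketched in a few lines. This is a bare-bones illustration of the auxiliary regressors for quarterly data only; deterministic terms, lag augmentation, and the non-standard critical values needed for actual inference are all omitted, and the simulated series and sample size are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
y = np.zeros(n)
eps = rng.standard_normal(n)
for t in range(4, n):
    y[t] = y[t - 4] + eps[t]        # seasonal random walk: unit roots at 0, pi, +/- pi/2

# HEGY auxiliary regressors, evaluated at the lags used in the test regression
y1 = y[3:-1] + y[2:-2] + y[1:-3] + y[0:-4]     # (1 + L + L^2 + L^3) y_{t-1}: zero frequency
y2 = -(y[3:-1] - y[2:-2] + y[1:-3] - y[0:-4])  # -(1 - L + L^2 - L^3) y_{t-1}: frequency pi
y3_1 = -(y[3:-1] - y[1:-3])                    # -(1 - L^2) y_{t-1}: frequency pi/2
y3_2 = -(y[2:-2] - y[0:-4])                    # -(1 - L^2) y_{t-2}
dy4 = y[4:] - y[:-4]                           # Delta_4 y_t, the dependent variable

X = np.column_stack([y1, y2, y3_1, y3_2])
pi_hat, *_ = np.linalg.lstsq(X, dy4, rcond=None)
print(pi_hat)   # all pi coefficients are close to zero under the seasonal unit root null
```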
  3. By: Torben G. Andersen (Northwestern Univ., NBER, CREATES); Dobrislav Dobrev (Federal Reserve Board of Governors); Ernst Schaumburg (Federal Reserve Bank of New York)
    Abstract: We propose two new jump-robust estimators of integrated variance based on high-frequency return observations. These MinRV and MedRV estimators provide an attractive alternative to the prevailing bipower and multipower variation measures. Specifically, the MedRV estimator has better theoretical efficiency properties than the tripower variation measure and displays better finite-sample robustness to both jumps and the occurrence of “zero” returns in the sample. Unlike the bipower variation measure, the new estimators allow for the development of an asymptotic limit theory in the presence of jumps. Finally, they retain the local nature associated with the low-order multipower variation measures. This proves essential for alleviating the finite-sample biases, arising from the pronounced intraday volatility pattern, that afflict alternative jump-robust estimators based on longer blocks of returns. An empirical investigation of the Dow Jones 30 stocks and an extensive simulation study corroborate the robustness and efficiency properties of the new estimators.
    Keywords: High-frequency data, Integrated variance, Finite activity jumps, Realized volatility, Jump robustness, Nearest neighbor truncation
    JEL: C14 C15 C22 C80 G10
    Date: 2009–10–31
    URL: http://d.repec.org/n?u=RePEc:aah:create:2009-52&r=ets
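A minimal implementation of the two estimators illustrates the jump robustness claimed in the abstract. The scaling constants are the ones usually quoted for MedRV and MinRV; the simulated sample (constant volatility, a single large jump, arbitrary seed) is a toy example, not the paper's empirical design.

```python
import numpy as np

def med_rv(r):
    # MedRV: scaled sum of squared medians of three adjacent absolute returns
    a = np.abs(np.asarray(r, dtype=float))
    n = a.size
    med = np.median(np.column_stack([a[:-2], a[1:-1], a[2:]]), axis=1)
    return (np.pi / (6 - 4 * np.sqrt(3) + np.pi)) * (n / (n - 2)) * np.sum(med ** 2)

def min_rv(r):
    # MinRV: scaled sum of squared minima of two adjacent absolute returns
    a = np.abs(np.asarray(r, dtype=float))
    n = a.size
    mn = np.minimum(a[:-1], a[1:])
    return (np.pi / (np.pi - 2)) * (n / (n - 1)) * np.sum(mn ** 2)

rng = np.random.default_rng(7)
n = 10000
iv = 0.04                                # integrated variance of the trading day
r = rng.standard_normal(n) * np.sqrt(iv / n)
r_jump = r.copy()
r_jump[n // 2] += 0.3                    # one large jump

rv = np.sum(r_jump ** 2)                 # plain realized variance absorbs the jump
print(rv, med_rv(r_jump), min_rv(r_jump))
```

Because a jump enters a median or minimum only next to ordinary-sized neighbors, it is truncated away, while realized variance picks up the full squared jump.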
  4. By: Sébastien Laurent; Jeroen Rombouts; Francesco Violente
    Abstract: A large number of parameterizations have been proposed to model conditional variance dynamics in a multivariate framework. However, little is known about the ranking of multivariate volatility models in terms of their forecasting ability. The ranking of multivariate volatility models is inherently problematic because it requires the use of a proxy for the unobservable volatility matrix, and this substitution may severely affect the ranking. We address this issue by investigating the properties of the ranking with respect to alternative statistical loss functions used to evaluate model performances. We provide conditions on the functional form of the loss function that ensure the proxy-based ranking to be consistent for the true one - i.e., the ranking that would be obtained if the true variance matrix were observable. We identify a large set of loss functions that yield a consistent ranking. In a simulation study, we sample data from a continuous time multivariate diffusion process and compare the ordering delivered by both consistent and inconsistent loss functions. We further discuss the sensitivity of the ranking to the quality of the proxy and the degree of similarity between models. An application to three foreign exchange rates, where we compare the forecasting performance of 16 multivariate GARCH specifications, is provided.
    Keywords: Volatility, multivariate GARCH, matrix norm, loss function, model confidence set
    Date: 2009–11–01
    URL: http://d.repec.org/n?u=RePEc:cir:cirwor:2009s-45&r=ets
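The consistency idea can be seen in a deliberately simplified univariate sketch (the paper's setting is multivariate, with matrix loss functions, and all numbers below are illustrative): when the proxy is conditionally unbiased, a squared-error loss evaluated against the noisy proxy still ranks a good and a bad variance forecast correctly on average, because the proxy noise enters both losses symmetrically.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100000
sigma2 = 1.0                              # true (unobservable) conditional variance
r = rng.standard_normal(n) * np.sqrt(sigma2)
proxy = r ** 2                            # conditionally unbiased but very noisy proxy

h_good, h_bad = 1.0, 1.3                  # two competing variance forecasts

# Squared-error loss is consistent: with an unbiased proxy, the expected
# proxy-based loss and the true loss are minimized by the same forecast,
# so the proxy preserves the ordering of the two models.
loss_good = np.mean((proxy - h_good) ** 2)
loss_bad = np.mean((proxy - h_bad) ** 2)
print(loss_good, loss_bad)
```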
  5. By: Zhongfang He; John M. Maheu
    Abstract: A sequential Monte Carlo method for estimating GARCH models subject to an unknown number of structural breaks is proposed. Particle filtering techniques allow for fast and efficient updates of posterior quantities and forecasts in real time. The method conveniently deals with the path dependence problem that arises in this type of model. The method is shown to perform well on simulated data. Applied to daily NASDAQ returns, the evidence favors a partial structural break specification, in which only the intercept of the conditional variance equation has breaks, over the full structural break specification, in which all parameters are subject to change. The empirical application underscores the importance of model assumptions when investigating breaks. A model with normal return innovations results in strong evidence of breaks, while more flexible return distributions such as t-innovations or a GARCH-jump mixture model still favor breaks but indicate much more uncertainty regarding their timing and impact.
    Keywords: Econometric and statistical methods; Financial markets
    JEL: C11 C15 C22 C53
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:bca:bocawp:09-31&r=ets
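The "partial structural break" specification favored in the NASDAQ application can be illustrated by simulating its data-generating process. The particle-filter estimation itself is beyond a short sketch; the parameter values, break point, and seed below are arbitrary choices, not taken from the paper.

```python
import numpy as np

def simulate_garch_break(n, omega1, omega2, alpha, beta, seed=0):
    # GARCH(1,1) with a one-time break in the intercept omega at t = n//2,
    # i.e. the partial structural break specification: alpha and beta unchanged.
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    h = omega1 / (1 - alpha - beta)       # start at the pre-break unconditional variance
    r_prev = 0.0
    for t in range(n):
        omega = omega1 if t < n // 2 else omega2
        h = omega + alpha * r_prev ** 2 + beta * h
        r[t] = np.sqrt(h) * rng.standard_normal()
        r_prev = r[t]
    return r

r = simulate_garch_break(6000, omega1=0.05, omega2=0.2, alpha=0.1, beta=0.8)
# Unconditional variance shifts from 0.05/0.1 = 0.5 to 0.2/0.1 = 2.0 at the break
print(np.var(r[:3000]), np.var(r[3000:]))
```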
  6. By: Francesco Battaglia; Mattheos Protopapas
    Abstract: Many time series exhibit both nonlinearity and nonstationarity. Though both features have often been taken into account separately, few attempts have been made to model them simultaneously. We consider threshold models, and present a general model allowing for different regimes both in time and in levels, where regime transitions may happen according to self-exciting, smoothly varying, or piecewise linear threshold modeling. Since fitting such a model involves the choice of a large number of structural parameters, we propose a procedure based on genetic algorithms, evaluating models by means of a generalized identification criterion. The performance of the proposed procedure is illustrated with a simulation study and applications to real data.
    Keywords: Nonlinear time series; Nonstationary time series; Threshold model
    Date: 2009–02–20
    URL: http://d.repec.org/n?u=RePEc:com:wpaper:009&r=ets
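A toy version of the idea: the paper's model class (regimes in both time and levels, with several transition mechanisms) is much richer, but even estimating the single threshold of a self-exciting two-regime AR(1) with a minimal genetic algorithm (elitist selection plus Gaussian mutation, no crossover) shows the mechanics. Everything below, including the AIC-based fitness, is an illustrative assumption, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a SETAR(2;1,1): the AR(1) coefficient switches when y_{t-1} crosses 0
n = 600
y = np.zeros(n)
for t in range(1, n):
    phi = 0.7 if y[t - 1] <= 0.0 else -0.5
    y[t] = phi * y[t - 1] + rng.standard_normal()

x, z = y[:-1], y[1:]                      # regressor y_{t-1} and target y_t

def neg_aic(c):
    # Fitness: negative AIC of a two-regime AR(1) split at threshold c
    ssr = 0.0
    for mask in (x <= c, x > c):
        if mask.sum() < 15:
            return -np.inf                # penalize near-empty regimes
        phi = (x[mask] @ z[mask]) / (x[mask] @ x[mask])
        ssr += np.sum((z[mask] - phi * x[mask]) ** 2)
    return -(len(z) * np.log(ssr / len(z)) + 2 * 3)   # 2 slopes + 1 threshold

# Minimal genetic algorithm over the threshold parameter
pop = rng.uniform(np.quantile(y, 0.15), np.quantile(y, 0.85), 30)
for generation in range(40):
    fit = np.array([neg_aic(c) for c in pop])
    elite = pop[np.argsort(fit)[-10:]]                           # keep the 10 fittest
    children = rng.choice(elite, 20) + rng.normal(0.0, 0.1, 20)  # mutate copies of the elite
    pop = np.concatenate([elite, children])

best = max(pop, key=neg_aic)
print(best)   # should be near the true threshold 0
```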
  7. By: Marco R Barassi; Dayong Zhang
    Abstract: The expectations hypothesis suggests that a long-run equilibrium exists in the term structure of interest rates. Two theoretical approaches, proposed by Campbell and Shiller (1987) and Hall et al. (1992), suggest that the spread between long-term and short-term interest rates should be a stationary I(0) process. However, an empirically non-stationary term spread, or a rejection of cointegration between long and short interest rates in the traditional sense, need not be taken as evidence against the simple theoretical model. It is likely that the dichotomy between I(1) and I(0) and/or integer orders of cointegration is too restrictive a framework for modeling the term structure. In this paper, we evaluate and apply some recent techniques for testing fractional integration and propose the use of a residual-based approach built on the Exact Local Whittle estimator. The method is then used to investigate the term structure in the UK and the US.
    Keywords: Term Structure, Long Memory, Fractional Integration, Fractional Cointegration, Local Whittle Estimation
    JEL: C22 E43
    Date: 2009–11
    URL: http://d.repec.org/n?u=RePEc:bir:birmec:09-17&r=ets
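For orientation, the plain local Whittle estimator (Robinson, 1995) can be written in a few lines. Note that the paper relies on the Exact Local Whittle variant, which differs in how the data are fractionally differenced before the objective is formed, so this sketch is a simplified relative, and the bandwidth rule and simulation settings are arbitrary choices.

```python
import numpy as np

def local_whittle_d(y, m):
    # Plain local Whittle estimate of d: minimize
    # R(d) = log( (1/m) sum_j lam_j^{2d} I_j ) - (2d/m) sum_j log(lam_j)
    # over the first m Fourier frequencies (grid search for simplicity).
    n = len(y)
    lam = 2 * np.pi * np.arange(1, m + 1) / n
    I = np.abs(np.fft.fft(y)[1:m + 1]) ** 2 / (2 * np.pi * n)   # periodogram
    grid = np.arange(-0.49, 1.0, 0.001)
    obj = [np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))
           for d in grid]
    return grid[np.argmin(obj)]

# Simulate an I(0.4) series via the truncated MA(infinity) representation
rng = np.random.default_rng(5)
n = 2000
psi = np.empty(n)
psi[0] = 1.0
for k in range(1, n):
    psi[k] = psi[k - 1] * (k - 1 + 0.4) / k
eps = rng.standard_normal(2 * n)
y = np.array([psi @ eps[t + n:t:-1] for t in range(n)])

d_hat = local_whittle_d(y, m=int(n ** 0.65))
print(d_hat)   # close to the true d = 0.4
```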
  8. By: Sergey S. Stepanov
    Abstract: The problem of non-stationarity in financial markets is discussed and related to the dynamic nature of price volatility. A new measure is proposed for estimating the current asset volatility. A simple and illustrative explanation is suggested for the emergence of significant serial autocorrelations in volatility and squared returns. It is shown that when non-stationarity is eliminated, the autocorrelations substantially diminish and become statistically insignificant. The causes of the non-Gaussian nature of the probability distribution of returns are considered. For both stock and currency market data samples, it is shown that removing the non-stationary component substantially reduces the kurtosis of the distribution, bringing it closer to the Gaussian one. A statistical criterion is proposed for controlling the degree of smoothing of the empirical values of volatility. The hypothesis of a smooth, non-stochastic nature of volatility is put forward, and possible causes of volatility shifts are discussed.
    Date: 2009–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0911.5048&r=ets
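The central claim, that removing a slowly varying non-stationary volatility component pulls the kurtosis of returns back toward the Gaussian value, can be reproduced qualitatively on synthetic data. The sketch below is our construction, not the author's measure: it standardizes simulated returns by a rolling volatility estimate and compares kurtosis before and after.

```python
import numpy as np

def kurtosis(x):
    # Sample kurtosis; equals 3 for a Gaussian in large samples
    x = x - x.mean()
    return np.mean(x ** 4) / np.mean(x ** 2) ** 2

rng = np.random.default_rng(11)
n = 20000
t = np.arange(n)
sigma = 1.0 + 0.8 * np.sin(2 * np.pi * t / 2000)   # slowly drifting volatility
r = sigma * rng.standard_normal(n)                  # returns: Gaussian given sigma_t

# Remove the non-stationary component: standardize by a rolling volatility estimate
w = 200
local_var = np.convolve(r ** 2, np.ones(w) / w, mode="same")
z = (r / np.sqrt(local_var))[w:-w]                  # trim the biased edges

print(kurtosis(r), kurtosis(z))   # raw returns are fat-tailed; standardized are near 3
```

The raw returns are a variance mixture of Gaussians and therefore leptokurtic; once the drifting volatility is divided out, the excess kurtosis largely disappears, matching the pattern the abstract reports for stock and currency data.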

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.