nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008–05–31
four papers chosen by
Yong Yin
SUNY at Buffalo

  1. General to specific modelling of exchange rate volatility: a forecast evaluation By Luc Bauwens; Genaro Sucarrat
  2. Optimal Bandwidth Choice for Interval Estimation in GMM Regression By Yixiao Sun; Peter C.B. Phillips
  3. Is Double Trouble? How to Combine Cointegration Tests By Bayer Christian; Hanck Christoph
  4. Comment on "Regression with slowly varying regressors and nonlinear trends" by P.C.B. Phillips By Mynbaev, Kairat

  1. By: Luc Bauwens; Genaro Sucarrat
    Abstract: The general-to-specific (GETS) methodology is widely employed in the modelling of economic series, but less so in financial volatility modelling due to computational complexity when many explanatory variables are involved. This study proposes a simple way of avoiding this problem when the conditional mean can appropriately be restricted to zero, and undertakes an out-of-sample forecast evaluation of the methodology applied to the modelling of weekly exchange rate volatility. Our findings suggest that GETS specifications perform comparatively well in both ex post and ex ante forecasting as long as sufficient care is taken with respect to functional form and with respect to how the conditioning information is used. Also, our forecast comparison provides an example of a discrete-time explanatory model being more accurate than realised volatility ex post in one-step forecasting.
    Keywords: Exchange rate volatility, General to specific, Forecasting
    JEL: C53 F31
    Date: 2008–04
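As a rough illustration of the general-to-specific idea (not the authors' algorithm, which also attends to functional form and to how the conditioning information is used), the sketch below runs backward elimination on a synthetic log-volatility regression; the variable names, synthetic data, and the 1.96 cutoff are all illustrative assumptions:

```python
import numpy as np

def gets_select(y, X, names, t_crit=1.96):
    """Backward elimination: repeatedly drop the least significant
    regressor until every remaining |t|-statistic exceeds t_crit."""
    keep = list(range(X.shape[1]))
    while len(keep) > 1:
        Xk = X[:, keep]
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        resid = y - Xk @ beta
        n, k = Xk.shape
        sigma2 = resid @ resid / (n - k)
        cov = sigma2 * np.linalg.inv(Xk.T @ Xk)
        tstats = np.abs(beta / np.sqrt(np.diag(cov)))
        j = int(np.argmin(tstats))
        if tstats[j] >= t_crit:
            break
        del keep[j]                       # drop the weakest regressor
    return [names[i] for i in keep]

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])
# synthetic log-volatility driven only by the constant and x1
logvol = 0.5 + 0.8 * X[:, 1] + 0.3 * rng.standard_normal(n)
kept = gets_select(logvol, X, ["const", "x1", "x2", "x3"])
print(kept)
```

With this data-generating process the search should retain the constant and x1 while discarding the two irrelevant regressors.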
  2. By: Yixiao Sun (University of California, San Diego); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: In time series regression with nonparametrically autocorrelated errors, it is now standard empirical practice to construct confidence intervals for regression coefficients on the basis of nonparametrically studentized t-statistics. The standard error used in the studentization is typically estimated by a kernel method that involves some smoothing process over the sample autocovariances. The underlying parameter (M) that controls this tuning process is a bandwidth or truncation lag and it plays a key role in the finite sample properties of tests and the actual coverage properties of the associated confidence intervals. The present paper develops a bandwidth choice rule for M that optimizes the coverage accuracy of interval estimators in the context of linear GMM regression. The optimal bandwidth balances the asymptotic variance with the asymptotic bias of the robust standard error estimator. This approach contrasts with the conventional bandwidth choice rule for nonparametric estimation where the focus is the nonparametric quantity itself and the choice rule balances asymptotic variance with squared asymptotic bias. It turns out that the optimal bandwidth for interval estimation has a different expansion rate and is typically substantially larger than the optimal bandwidth for point estimation of the standard errors. The new approach to bandwidth choice calls for refined asymptotic measurement of the coverage probabilities, which are provided by means of an Edgeworth expansion of the finite sample distribution of the nonparametrically studentized t-statistic. This asymptotic expansion extends earlier work and is of independent interest. A simple plug-in procedure for implementing this optimal bandwidth is suggested and simulations confirm that the new plug-in procedure works well in finite samples. Issues of interval length and false coverage probability are also considered, leading to a secondary approach to bandwidth selection with similar properties.
    Keywords: Asymptotic expansion, Bias, Confidence interval, Coverage probability, Edgeworth expansion, Lag kernel, Long run variance, Optimal bandwidth, Spectrum
    JEL: C13 C14 C22 C51
    Date: 2008–05
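To see where the bandwidth M enters, the sketch below computes a standard Bartlett-kernel long-run variance estimate, the key ingredient of the robust standard error discussed in the abstract. The paper's contribution is a coverage-optimal rule for choosing M, which this sketch does not implement; the AR(1) test data are an illustrative assumption:

```python
import numpy as np

def bartlett_lrv(u, M):
    """Bartlett-kernel estimate of the long-run variance of u,
    truncated at lag M -- the tuning parameter the paper optimizes."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()
    n = len(u)
    lrv = u @ u / n                        # lag-0 autocovariance
    for j in range(1, M + 1):
        w = 1.0 - j / (M + 1.0)            # Bartlett weight
        gamma_j = u[j:] @ u[:-j] / n       # sample autocovariance, lag j
        lrv += 2.0 * w * gamma_j
    return lrv

rng = np.random.default_rng(1)
# AR(1) errors with rho = 0.5: true long-run variance = 1/(1-0.5)^2 = 4
e = rng.standard_normal(20000)
u = np.empty_like(e)
u[0] = e[0]
for t in range(1, len(e)):
    u[t] = 0.5 * u[t - 1] + e[t]
print(bartlett_lrv(u, M=5), bartlett_lrv(u, M=50))
```

A small M truncates away autocovariance mass and biases the estimate downward; a larger M reduces that bias at the cost of variance, which is the trade-off the optimal rule balances.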
  3. By: Bayer Christian; Hanck Christoph (METEOR)
    Abstract: This paper suggests a combination procedure that exploits the imperfect correlation of cointegration tests to develop a more powerful meta test. To exemplify, we combine the Engle and Granger (1987) and Johansen (1988) tests. Either of these underlying tests can be more powerful than the other, depending on the nature of the data-generating process. The new meta test is at least as powerful as the more powerful of the underlying tests, irrespective of the nature of the data-generating process. At the same time, our new meta test avoids the arbitrary decision of which test to use when single test results conflict. Moreover, it avoids the size distortion inherent in separately applying multiple tests for cointegration to the same data set. We apply our test to 143 data sets from published cointegration studies. There, in one third of all cases single tests give conflicting results, whereas our meta test provides an unambiguous test decision.
    Keywords: Economics
    Date: 2008
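A minimal sketch of the combination idea, assuming a Fisher-type statistic over the two underlying tests' p-values. The p-values below are hypothetical, and the actual Bayer–Hanck procedure uses critical values that account for the dependence between the tests, so the usual chi-square reference distribution does not apply:

```python
import math

def fisher_combine(p_values):
    """Fisher-type combination statistic: large values argue for
    rejecting the no-cointegration null. Because the underlying
    cointegration tests are correlated, critical values must come
    from simulation rather than the chi-square distribution."""
    return -2.0 * sum(math.log(p) for p in p_values)

# hypothetical p-values from the Engle-Granger and Johansen tests
stat = fisher_combine([0.03, 0.20])
print(round(stat, 3))  # -2 * (ln 0.03 + ln 0.20) = 10.232
```

Combining evidence this way is what lets the meta test track whichever underlying test happens to be more powerful for the data-generating process at hand.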
  4. By: Mynbaev, Kairat
    Abstract: Standardized slowly varying regressors are shown to be $L_p$-approximable. This fact allows one to relax the assumption on linear processes imposed in central limit results by P.C.B. Phillips, as well as to provide alternative proofs of some other statements.
    Keywords: slowly varying regressors; central limit theorem; $L_p$-approximability
    JEL: C02 C01
    Date: 2007–09–01
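For reference, the standard (Karamata) definition assumed when speaking of a slowly varying regressor is:

```latex
L \text{ is slowly varying if } \lim_{x \to \infty} \frac{L(\lambda x)}{L(x)} = 1 \quad \text{for every } \lambda > 0,
```

with $\log x$ and $\log \log x$ as typical examples of trends covered by results of this kind.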

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.