nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒06‒25
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. Conditional Correlation Models of Autoregressive Conditional Heteroskedasticity with Nonstationary GARCH Equations By Cristina Amado; Timo Teräsvirta
  2. A Functional Filtering and Neighborhood Truncation Approach to Integrated Quarticity Estimation By Torben G. Andersen; Dobrislav Dobrev; Ernst Schaumburg
  3. A Simple Test for Spurious Regressions By Antonio E. Noriega; Daniel Ventosa-Santaularia
  4. Out-of-Sample Forecast Tests Robust to Window Size Choice By Barbara Rossi; Atsushi Inoue
  5. Estimating the Leverage Parameter of Continuous-time Stochastic Volatility Models Using High Frequency S&P 500 and VIX By Isao Ishida; Michael McAleer; Kosuke Oya
  6. Convergence and Cointegration By Alfredo García-Hiernaux; David E. Guerrero
  7. Block Bootstrap and Long Memory By George Kapetanios; Fotis Papailias
  8. Persistence in Convergence By Thanasis Stengos; M. Ege Yazgan
  9. Cointegrating MiDaS Regressions and a MiDaS Test By J. Isaac Miller
  10. On the Short-Time Asymptotic Expansion of the Heat Kernel By Akihiko Takahashi; Toshihiro Yamada
  11. Goodness-of-Fit tests with Dependent Observations By Remy Chicheportiche; Jean-Philippe Bouchaud
  12. Hidden panel cointegration By Abdulnasser, Hatemi-J

  1. By: Cristina Amado (University of Minho and NIPE); Timo Teräsvirta (Aarhus University, School of Economics and Management and CREATES)
    Abstract: In this paper we investigate the effects of carefully modelling the long-run dynamics of the volatilities of stock market returns on the conditional correlation structure. To this end we allow the individual unconditional variances in Conditional Correlation GARCH models to change smoothly over time by incorporating a nonstationary component in the variance equations. The modelling technique for determining the parametric structure of this time-varying component is based on a sequence of specification Lagrange multiplier-type tests derived in Amado and Teräsvirta (2011). The variance equations combine the long-run and the short-run dynamic behaviour of the volatilities. The structure of the conditional correlation matrix is assumed to be either time independent or to vary over time. We apply our model to pairs of seven daily stock returns belonging to the S&P 500 composite index and traded at the New York Stock Exchange. The results suggest that accounting for deterministic changes in the unconditional variances considerably improves the fit of the multivariate Conditional Correlation GARCH models to the data. The effect of careful specification of the variance equations on the estimated correlations is variable: in some cases rather small, in others more discernible. As a by-product, we generalize news impact surfaces to the situation in which both the GARCH equations and the conditional correlations contain a deterministic component that is a function of time.
    Keywords: Multivariate GARCH model, Time-varying unconditional variance, Lagrange multiplier test, Modelling cycle, Nonlinear time series.
    JEL: C12 C32 C51 C52
    Date: 2011–05–30
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-24&r=ets
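    A schematic illustration of the kind of variance equation described above (notation is assumed here, in the style of Amado and Teräsvirta (2011); the paper's exact specification may differ): the conditional variance of asset i is decomposed multiplicatively as
        \sigma_{it}^2 = h_{it}\, g_{it}, \qquad g_{it} = 1 + \sum_{l=1}^{r_i} \delta_{il}\, G\!\left(t/T;\, \gamma_{il}, c_{il}\right), \qquad G(t/T;\gamma,c) = \bigl(1 + e^{-\gamma\,(t/T - c)}\bigr)^{-1},
    where h_{it} follows a standard GARCH recursion applied to the rescaled returns \varepsilon_{it}/\sqrt{g_{it}}. The deterministic component g_{it} shifts the unconditional variance smoothly over time, while h_{it} captures the short-run dynamics; the Lagrange multiplier tests mentioned in the abstract select the number of transitions r_i.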
  2. By: Torben G. Andersen (Northwestern University, NBER, and CREATES); Dobrislav Dobrev (Federal Reserve Board of Governors); Ernst Schaumburg (Federal Reserve Bank of New York)
    Abstract: We provide a first in-depth look at robust estimation of integrated quarticity (IQ) based on high frequency data. IQ is the key ingredient enabling inference about volatility and the presence of jumps in financial time series and is thus of considerable interest in applications. We document the significant empirical challenges for IQ estimation posed by commonly encountered data imperfections and set forth three complementary approaches for improving IQ-based inference. First, we show that many common deviations from the jump diffusive null can be dealt with by a novel filtering scheme that generalizes truncation of individual returns to truncation of arbitrary functionals on return blocks. Second, we propose a new family of efficient robust neighborhood truncation (RNT) estimators for integrated power variation based on order statistics of a set of unbiased local power variation estimators on a block of returns. Third, we find that ratio-based inference, originally proposed in this context by Barndorff-Nielsen and Shephard (2002), has desirable robustness properties in the face of regularly occurring data imperfections and thus is well suited for our empirical applications. We confirm that the proposed filtering scheme and the RNT estimators perform well in our extensive simulation designs and in an application to the individual Dow Jones 30 stocks.
    Keywords: Neighborhood Truncation Estimator, Functional Filtering, Integrated Quarticity, Inference on Integrated Variance, High-Frequency Data
    JEL: C14 C15 C22 C80 G10
    Date: 2011–05–29
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-23&r=ets
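    As background for the abstract above: for a log-price dX_t = \mu_t\,dt + \sigma_t\,dW_t, integrated quarticity over a unit interval and its classical realized estimator of Barndorff-Nielsen and Shephard (2002) are
        IQ = \int_0^1 \sigma_u^4\, du, \qquad RQ_n = \frac{n}{3} \sum_{i=1}^{n} r_i^4, \qquad r_i = X_{i/n} - X_{(i-1)/n},
    with RQ_n consistent for IQ under the jump-diffusive null. The RNT estimators proposed in the paper instead take order statistics of a set of unbiased local power variation estimators computed on blocks of returns, which is what delivers robustness to jumps and data imperfections.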
  3. By: Antonio E. Noriega (Dirección General de Investigación Económica, Banco de México, and Department of Economics and Finance, Universidad de Guanajuato); Daniel Ventosa-Santaularia (Department of Economics and Finance, Universidad de Guanajuato)
    Abstract: The literature on spurious regressions has found that the t-statistic for testing the null of no relationship between two independent variables diverges asymptotically under a wide variety of nonstationary data generating processes for the dependent and explanatory variables. This paper introduces a simple method which guarantees convergence of this t-statistic to a pivotal limit distribution, when there are drifts in the integrated processes generating the data, thus allowing asymptotic inference. We show that this method can be used to distinguish a genuine relationship from a spurious one among integrated (I(1) and I(2)) processes. Simulation experiments show that the test has good size and power properties in small samples. We apply the proposed procedure to several pairs of apparently independent integrated variables (including the marriages and mortality data of Yule, 1926), and find that our procedure, in contrast to standard ordinary least squares regression, does not find (spurious) significant relationships between the variables.
    Keywords: Spurious regression, integrated process, detrending, cointegration
    JEL: C12 C15 C22 C46
    Date: 2011–05–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2011-15&r=ets
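    A minimal simulation of the phenomenon the test addresses (this illustrates the spurious regression problem itself, not the authors' corrected procedure; parameter values are arbitrary): regressing one random walk with drift on an independent one produces t-statistics that diverge with the sample size.
        # Spurious regression between two independent I(1)-with-drift series.
        import numpy as np

        rng = np.random.default_rng(0)

        def t_stat_of_slope(T):
            """OLS t-statistic for the slope in y = a + b*x + e."""
            x = np.cumsum(0.1 + rng.standard_normal(T))  # random walk with drift
            y = np.cumsum(0.1 + rng.standard_normal(T))  # independent of x
            X = np.column_stack([np.ones(T), x])
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ beta
            s2 = resid @ resid / (T - 2)
            se_b = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
            return beta[1] / se_b

        for T in (100, 1000, 10000):
            print(T, t_stat_of_slope(T))  # |t| grows with T despite independence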
  4. By: Barbara Rossi; Atsushi Inoue
    Abstract: This paper proposes new methodologies for evaluating out-of-sample forecasting performance that are robust to the choice of the estimation window size. The methodologies involve evaluating the predictive ability of forecasting models over a wide range of window sizes. We show that the tests proposed in the literature may lack power to detect predictive ability, and might be subject to data snooping across different window sizes if used repeatedly. An empirical application shows the usefulness of the methodologies for evaluating exchange rate models' forecasting ability.
    Keywords: Predictive Ability Testing, Forecast Evaluation, Estimation Window
    JEL: C22 C52 C53
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:duk:dukeec:11-04&r=ets
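    A sketch of the underlying idea (not the Rossi-Inoue test statistics themselves, which aggregate loss differentials over window sizes via sup- and average-type functionals with their own critical values): compute out-of-sample losses over a grid of estimation window sizes rather than at one arbitrary choice.
        # Rolling one-step-ahead out-of-sample MSE across several window sizes.
        import numpy as np

        def rolling_mse(y, z, window):
            """Forecast y[t] from predictor z[t], OLS re-estimated on a rolling window."""
            errs = []
            for t in range(window, len(y)):
                X = np.column_stack([np.ones(window), z[t - window:t]])
                beta, *_ = np.linalg.lstsq(X, y[t - window:t], rcond=None)
                errs.append(y[t] - (beta[0] + beta[1] * z[t]))
            return np.mean(np.square(errs))

        rng = np.random.default_rng(1)
        x = rng.standard_normal(501)
        y = 0.3 * x[:-1] + rng.standard_normal(500)  # y_t driven by x_{t-1}
        z = x[:-1]                                   # predictor: lagged x

        for w in (50, 100, 200, 400):
            print(w, rolling_mse(y, z, w))  # conclusions can shift with w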
  5. By: Isao Ishida (Center for the Study of Finance and Insurance Osaka University, Japan); Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics), Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute).); Kosuke Oya (Graduate School of Economics and Center for the Study of Finance and Insurance Osaka University, Japan)
    Abstract: This paper proposes a new method for estimating continuous-time stochastic volatility (SV) models for the S&P 500 stock index process using intraday high-frequency observations of both the S&P 500 index and the Chicago Board Options Exchange (CBOE) implied (or expected) volatility index (VIX). Intraday high-frequency data have become readily available for an increasing number of financial assets and their derivatives in recent years, but it is well known that attempts to directly apply popular continuous-time models to short intraday time intervals, and estimate the parameters using such data, can lead to nonsensical estimates due to severe intraday seasonality. A primary purpose of the paper is to provide a framework for using intraday high-frequency data of both the index and the VIX in estimation, in particular for improving the estimation accuracy of the leverage parameter, ρ, that is, the correlation between the two Brownian motions driving the diffusive components of the price process and its spot variance process, respectively. As a special case, we focus on Heston’s (1993) square-root SV model, and propose the realized leverage estimator for ρ, noting that, under this model without measurement errors, the “realized leverage,” or the realized covariation of the price and VIX processes divided by the product of the realized volatilities of the two processes, is in-fill consistent for ρ. Finite sample simulation results show that the proposed estimator delivers more accurate estimates of the leverage parameter than do existing methods.
    Keywords: Continuous time, high frequency data, stochastic volatility, S&P 500, implied volatility, VIX.
    JEL: G13 G32
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1117&r=ets
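    In assumed notation, the realized leverage estimator described above is the realized correlation of the log-price p and the VIX (or spot-variance proxy) v over n intraday increments:
        \hat{\rho} = \frac{\sum_{i=1}^{n} \Delta p_i\, \Delta v_i}{\sqrt{\sum_{i=1}^{n} (\Delta p_i)^2}\; \sqrt{\sum_{i=1}^{n} (\Delta v_i)^2}},
    i.e. the realized covariation of the two processes divided by the product of their realized volatilities, which the abstract notes is in-fill consistent for \rho under Heston's model without measurement error.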
  6. By: Alfredo García-Hiernaux (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); David E. Guerrero
    Abstract: This paper provides a new, unified, and flexible framework to measure and characterize convergence in prices. We formally define this notion and propose a model to represent a wide range of transition paths that converge to a common steady-state. Our framework enables the econometric measurement of such transitional behaviors and the development of testing procedures. Specifically, we derive a statistical test to determine whether convergence exists and, if so, which type: catching-up or steady-state. The application of this methodology to historic wheat prices results in a novel explanation of the convergence processes experienced during the 19th century.
    Keywords: Price convergence, cointegration, law of one price.
    JEL: C22 C32 N70 F15
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1122&r=ets
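    One illustrative way to picture the catching-up versus steady-state distinction (assumed notation, not necessarily the authors' exact parameterization): let d_t be the log-price gap between two markets and write
        d_t = \delta + \beta\, \rho^{\,t} + \varepsilon_t, \qquad |\rho| < 1,
    so that \beta \neq 0 corresponds to catching-up (the gap is still closing along a transition path toward \delta), while \beta = 0 corresponds to steady-state convergence (the gap already fluctuates around its long-run level).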
  7. By: George Kapetanios (Queen Mary, University of London); Fotis Papailias (Queen Mary, University of London)
    Abstract: We consider the issue of Block Bootstrap methods in processes that exhibit strong dependence. The main difficulty is to transform the series in such a way that implementation of these techniques can provide an accurate approximation to the true distribution of the test statistic under consideration. The bootstrap algorithm we suggest consists of the following operations: given x_t ~ I(d_0), 1) estimate the long memory parameter and obtain d̂, 2) difference the series d̂ times, 3) apply the block bootstrap to the differenced series and, finally, 4) cumulate the bootstrap sample d̂ times. Repeating steps 3 and 4 a sufficient number of times yields an estimate of the distribution of the test statistic. Furthermore, we establish the asymptotic validity of this method. Its finite-sample properties are investigated via Monte Carlo experiments, and the results indicate that it can be used as an alternative to, and in most cases is preferable to, the Sieve AR bootstrap for fractional processes.
    Keywords: Block Bootstrap, Long memory, Resampling, Strong dependence
    JEL: C15 C22 C63
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp679&r=ets
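    A minimal sketch of the four steps above, assuming that "difference the series d̂ times" means fractional differencing via the binomial expansion of (1-L)^d (helper names here are hypothetical, and d̂ is taken as given from some long memory estimator, e.g. log-periodogram regression):
        # Block bootstrap for long-memory series: difference, resample, re-integrate.
        import numpy as np

        def frac_diff(x, d):
            """Apply (1 - L)^d via truncated binomial weights."""
            w = np.ones(len(x))
            for k in range(1, len(x)):
                w[k] = w[k - 1] * (k - 1 - d) / k
            return np.array([w[:t + 1][::-1] @ x[:t + 1] for t in range(len(x))])

        def block_bootstrap(u, block_len, rng):
            """Moving-block resample of a weakly dependent series u."""
            n = len(u)
            starts = rng.integers(0, n - block_len + 1, size=n // block_len + 1)
            return np.concatenate([u[s:s + block_len] for s in starts])[:n]

        def bootstrap_distribution(x, d_hat, stat=np.mean, block_len=25, reps=999, seed=0):
            rng = np.random.default_rng(seed)
            u = frac_diff(x, d_hat)                  # step 2: difference d_hat times
            draws = []
            for _ in range(reps):                    # steps 3 and 4, repeated
                u_star = block_bootstrap(u, block_len, rng)
                x_star = frac_diff(u_star, -d_hat)   # step 4: cumulate d_hat times
                draws.append(stat(x_star))
            return np.array(draws)                   # bootstrap distribution of stat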
  8. By: Thanasis Stengos (University of Guelph); M. Ege Yazgan (Istanbul Bilgi University)
    Abstract: In this paper, we examine the convergence hypothesis using a long memory framework that allows for structural breaks and does not rely on a benchmark country. We find that even though the long memory framework of analysis is much richer than the simple I(1)/I(0) alternative, the simple dichotomy of absolute divergence versus rapid convergence produced by the latter is sufficient to capture the behavior of the gaps in per capita GDP levels and growth rates, respectively. This is in contrast to the findings of Dufrénot, Mignon and Naccache (2009), who found strong evidence of long memory for output gaps. The speed of convergence, captured by the estimated long memory parameter d, is explained by differences in physical and human capital as well as the fiscal discipline characteristics of the economic policies pursued by different countries.
    Keywords: Growth Convergence, Long Memory
    JEL: C32 O47
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:gue:guelph:2011-05.&r=ets
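    In the long memory framework referred to above, the output gap between economies i and j is typically modeled as fractionally integrated (assumed notation):
        (1 - L)^{d}\,(y_{it} - y_{jt}) = \varepsilon_t, \qquad \varepsilon_t \sim I(0),
    so that d = 0 corresponds to rapid convergence, 0 < d < 1 to slow, long-memory convergence, and d \geq 1 to divergence; the abstract's finding is that the data cluster at the endpoints of this range rather than in the long memory interior.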
  9. By: J. Isaac Miller (Department of Economics, University of Missouri-Columbia)
    Abstract: This paper introduces cointegrating mixed data sampling (CoMiDaS) regressions, generalizing nonlinear MiDaS regressions in the extant literature. Under a linear mixed-frequency data-generating process, MiDaS regressions provide a parsimoniously parameterized nonlinear alternative when the linear forecasting model is over-parameterized and may be infeasible. In spite of potential correlation of the error term both serially and with the regressors, I find that nonlinear least squares consistently estimates the minimum mean-squared forecast error parameter vector. The exact asymptotic distribution of the difference may be non-standard. I propose a novel testing strategy for nonlinear MiDaS and CoMiDaS regressions against a general but possibly infeasible linear alternative. An empirical application to nowcasting global real economic activity using monthly covariates illustrates the utility of the approach.
    Keywords: cointegration, mixed-frequency series, mixed data sampling
    JEL: C12 C13 C22
    Date: 2011–06–14
    URL: http://d.repec.org/n?u=RePEc:umc:wpaper:1104&r=ets
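    For context, a generic MiDaS regression of a low-frequency variable y on a covariate x observed m times per period takes the form (standard notation from the MiDaS literature; the CoMiDaS extension of the paper additionally allows the series to be cointegrated):
        y_t = \beta_0 + \beta_1 \sum_{j=0}^{J} w_j(\theta)\, x^{(m)}_{t - j/m} + u_t, \qquad w_j(\theta) = \frac{\exp(\theta_1 j + \theta_2 j^2)}{\sum_{k=0}^{J} \exp(\theta_1 k + \theta_2 k^2)},
    where the exponential Almon weights w_j(\theta) replace J + 1 free lag coefficients with the two parameters \theta_1, \theta_2, which is the parsimony the abstract refers to.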
  10. By: Akihiko Takahashi (Faculty of Economics, University of Tokyo); Toshihiro Yamada (Mitsubishi UFJ Trust Investment Technology Institute Co., Ltd. (MTEC))
    Abstract: This paper proposes a new short-time asymptotic expansion of the heat kernel by an extension of Léandre's approach and the Bismut identity in Malliavin calculus. The method is applied to general time-homogeneous local volatility and local-stochastic volatility models in finance, which include the Heston (Heston (1993)) and (λ-)SABR models (Hagan et al. (2002), Labordère (2008)) as special cases. Some numerical examples are shown.
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:cfi:fseres:cf249&r=ets
  11. By: Remy Chicheportiche; Jean-Philippe Bouchaud
    Abstract: We revisit the Kolmogorov-Smirnov and Cramér-von Mises goodness-of-fit (GoF) tests and propose a generalisation to identically distributed, but dependent univariate random variables. We show that the dependence leads to a reduction of the "effective" number of independent observations. The generalised GoF tests are not distribution-free but rather depend on all the lagged bivariate copulas. These objects, that we call "self-copulas", encode all the non-linear temporal dependences. We introduce a specific, log-normal model for these self-copulas, for which a number of analytical results are derived. An application to financial time series is provided. As is well known, the dependence is long-ranged in this case, a finding that we confirm using self-copulas. As a consequence, the acceptance rates for GoF tests are substantially higher than if the returns were iid random variables.
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1106.3016&r=ets
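    The "effective number of observations" effect can be made concrete as follows (a sketch in assumed notation; the paper works with the full lagged copula structure). Writing U_t = F(X_t) and defining the lag-\tau self-copula C_\tau(u,v) = P(U_t \leq u,\, U_{t+\tau} \leq v), the variance of the empirical CDF at u is
        \mathrm{Var}\bigl[\hat{F}_N(u)\bigr] = \frac{u(1-u)}{N} + \frac{2}{N} \sum_{\tau=1}^{N-1} \Bigl(1 - \frac{\tau}{N}\Bigr) \bigl(C_\tau(u,u) - u^2\bigr),
    which collapses to the iid formula when C_\tau(u,v) = uv; positive dependence inflates the variance, equivalently replacing N by a smaller effective number of observations in the KS and CvM statistics.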
  12. By: Abdulnasser, Hatemi-J
    Abstract: This article extends the seminal work of Granger and Yoon (2002) on hidden cointegration to panel data analysis. It shows how cumulative negative and positive changes can be constructed for each panel variable. It also shows how tests similar to the augmented Dickey-Fuller tests can be implemented to determine whether hidden cointegration is present in the panel. An application is provided to investigate the impact of permanent positive and negative shocks to government expenditure on national output in a panel of three countries.
    Keywords: Asymmetry; Panel Data; Cointegration; Testing; Government Spending; Output
    JEL: H21 C33
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31604&r=ets
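    The cumulative positive and negative changes mentioned above follow the Granger-Yoon decomposition; a minimal sketch for a single panel variable (function name hypothetical):
        # Granger-Yoon decomposition: x_t = x_0 + x_t(+) + x_t(-).
        import numpy as np

        def pos_neg_components(x):
            dx = np.diff(x, prepend=x[0])           # dx[0] = 0 by construction
            x_pos = np.cumsum(np.maximum(dx, 0.0))  # cumulative positive shocks
            x_neg = np.cumsum(np.minimum(dx, 0.0))  # cumulative negative shocks
            return x_pos, x_neg
    Hidden (panel) cointegration then asks whether, for example, the positive components of two variables cointegrate even when the variables themselves do not; in the panel setting the decomposition is applied to each cross-sectional unit before testing.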

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.