nep-ets New Economics Papers
on Econometric Time Series
Issue of 2006‒05‒20
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. Consistent estimation of the memory parameter for nonlinear time series By Violetta Dalla; Liudas Giraitis; Javier Hidalgo
  2. ROOT-N-CONSISTENT ESTIMATION OF WEAK FRACTIONAL COINTEGRATION By Javier Hualde; Peter M Robinson
  3. Instrumental Variables Estimation of Stationary and Nonstationary Cointegrating Regressions By M. Gerolimetto; Peter M Robinson
  4. Finite Sample Performance in Cointegration Analysis of Nonlinear Time Series with Long Memory By Afonso Gonçalves da Silva; Peter M Robinson
  5. A Mixture Multiplicative Error Model for Realized Volatility By Markku Lanne
  6. Cointegration in Panel Data with Breaks and Cross-Section Dependence By Anindya Banerjee; Josep Lluís Carrion-i-Silvestre
  7. Backtesting VaR Accuracy: A New Simple Test By Christophe Hurlin; Sessi Tokpavi
  8. Generalized Dynamic Factor Model + GARCH: Exploiting Multivariate Information for Univariate Prediction By Lucia Alessi; Matteo Barigozzi; Marco Capasso
  9. The Econometric Analysis of Constructed Binary Time Series By Don Harding; Adrian Pagan

  1. By: Violetta Dalla; Liudas Giraitis; Javier Hidalgo
    Abstract: For linear processes, semiparametric estimation of the memory parameter, based on the log-periodogram and local Whittle estimators, has been exhaustively examined and their properties are well established. However, except for some specific cases, little is known about the estimation of the memory parameter for nonlinear processes. The purpose of this paper is to provide general conditions under which the local Whittle estimator of the memory parameter of a stationary process is consistent and to examine its rate of convergence. We show that these conditions are satisfied for linear processes and a wide class of nonlinear models, among others, signal plus noise processes, nonlinear transforms of a Gaussian process, and EGARCH models. Special cases where the estimator satisfies the central limit theorem are discussed. The finite sample performance of the estimator is investigated in a small Monte Carlo study.
    Keywords: Long memory, semiparametric estimation, local Whittle estimator.
    JEL: C14 C22
    Date: 2006–01
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/497&r=ets
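    A minimal Python (numpy/scipy) sketch of the local Whittle estimator discussed in this abstract, in its standard semiparametric form rather than as code from the paper; the bandwidth m = n**0.65 and the search bounds are illustrative choices only:

      import numpy as np
      from scipy.optimize import minimize_scalar

      def local_whittle(x, m=None):
          """Local Whittle estimate of the memory parameter d of the series x,
          using the periodogram at the first m Fourier frequencies."""
          x = np.asarray(x, dtype=float)
          n = len(x)
          if m is None:
              m = int(n ** 0.65)                       # illustrative bandwidth
          lam = 2.0 * np.pi * np.arange(1, m + 1) / n  # Fourier frequencies
          dft = np.fft.fft(x - x.mean())[1:m + 1]
          I = np.abs(dft) ** 2 / (2.0 * np.pi * n)     # periodogram ordinates

          def objective(d):
              # R(d) = log( mean_j lam_j^{2d} I_j ) - 2d * mean_j log(lam_j)
              return np.log(np.mean(lam ** (2.0 * d) * I)) - 2.0 * d * np.mean(np.log(lam))

          return minimize_scalar(objective, bounds=(-0.49, 0.99), method="bounded").x

      # an i.i.d. series has memory parameter 0, so the estimate should be near 0
      rng = np.random.default_rng(0)
      print(local_whittle(rng.standard_normal(2000)))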
  2. By: Javier Hualde; Peter M Robinson
    Abstract: Empirical evidence has emerged of the possibility of fractional cointegration such that the gap, β, between the integration order d of observable time series and the integration order of cointegrating errors is less than 0.5. This includes circumstances when observables are stationary or asymptotically stationary with long memory (so d < 1/2), and when they are nonstationary (so d ≥ 1/2). This "weak cointegration" contrasts strongly with the traditional econometric prescription of unit root observables and short memory cointegrating errors, where β = 1. Asymptotic inferential theory also differs from this case, and from other members of the class β > 1/2; in particular, root-n-consistent and asymptotically normal estimation of the cointegrating vector is possible when β < 1/2, as we explore in a simple bivariate model. The estimate depends on the two integration orders or, more realistically, on estimates of them. These latter estimates need to be root-n-consistent, and the asymptotic distribution of the estimate of the cointegrating vector is sensitive to their precise form. We propose estimates of the integration orders that are computationally relatively convenient, relying only on univariate nonlinear optimization. Finite sample performance of the methods is examined by means of Monte Carlo simulations, and several applications to empirical data are included.
    Keywords: Fractional cointegration, Parametric estimation, Asymptotic normality.
    JEL: C32
    Date: 2006–03
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/499&r=ets
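    A hedged simulation sketch of the "weak cointegration" setting described above (a toy design, not the authors' estimator): the observable has integration order d, the cointegrating error has a smaller order, and the gap is below 1/2. The naive OLS slope printed at the end is only a benchmark; the paper's contribution is a root-n-consistent estimator built on estimated integration orders. All parameter values are illustrative.

      import numpy as np

      def frac_integrate(eps, d):
          """Apply (1 - L)^(-d) to white noise via its truncated MA expansion."""
          n = len(eps)
          psi = np.empty(n)
          psi[0] = 1.0
          for k in range(1, n):
              psi[k] = psi[k - 1] * (k - 1 + d) / k
          return np.convolve(psi, eps)[:n]

      rng = np.random.default_rng(1)
      n, d, gamma_err, nu = 5000, 0.4, 0.1, 2.0             # gap = 0.3 < 0.5
      x = frac_integrate(rng.standard_normal(n), d)         # observable, order d
      u = frac_integrate(rng.standard_normal(n), gamma_err) # cointegrating error
      y = nu * x + u
      print(np.sum(x * y) / np.sum(x * x))                  # naive OLS estimate of nu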
  3. By: M. Gerolimetto; Peter M Robinson
    Abstract: Instrumental variables estimation is classically employed to avoid simultaneous equations bias in a stable environment. Here we use it to improve upon ordinary least squares estimation of cointegrating regressions between nonstationary and/or long memory stationary variables where the integration orders of regressor and disturbance sum to less than 1, as happens always for stationary regressors, and sometimes for mean-reverting nonstationary ones. Unlike in the classical situation, instruments can be correlated with disturbances and/or uncorrelated with regressors. The approach can also be used in traditional non-fractional cointegrating relations. Various choices of instrument are proposed. Finite sample performance is examined.
    Keywords: Cointegration, Instrumental variables estimation, I(d) processes.
    JEL: C32
    Date: 2006–04
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/500&r=ets
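    The mechanics can be illustrated with a generic instrumental-variables sketch (a stationary AR(1) regressor whose innovations are correlated with the disturbance, and the lagged regressor as one possible instrument); this is not one of the instrument choices analysed in the paper, just the textbook IV formula the paper builds on:

      import numpy as np

      rng = np.random.default_rng(2)
      n, beta = 3000, 1.0
      e = rng.standard_normal(n)                       # regression disturbance
      v = 0.8 * e + 0.6 * rng.standard_normal(n)       # regressor innovations, correlated with e
      x = np.zeros(n)
      for t in range(1, n):
          x[t] = 0.7 * x[t - 1] + v[t]                 # stationary regressor
      y = beta * x + e

      z = np.roll(x, 1)                                # instrument: lagged regressor
      z[0] = 0.0
      beta_ols = np.sum(x * y) / np.sum(x * x)         # biased (x correlated with e)
      beta_iv = np.sum(z * y) / np.sum(z * x)          # consistent
      print(beta_ols, beta_iv)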
  4. By: Afonso Gonçalves da Silva; Peter M Robinson
    Abstract: Nonlinear functions of multivariate financial time series can exhibit long memory and fractional cointegration. However, tools for analysing these phenomena have principally been justified under assumptions that are invalid in this setting. Determination of asymptotic theory under more plausible assumptions can be complicated and lengthy. We discuss these issues and present a Monte Carlo study, showing that asymptotic theory should not necessarily be expected to provide a good approximation to finite-sample behaviour.
    Keywords: Fractional cointegration, memory estimation, stochastic volatility.
    JEL: C32
    Date: 2006–04
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/501&r=ets
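    A toy Monte Carlo in the spirit of this discussion (not the paper's design): returns follow a long-memory stochastic volatility model, and a local Whittle estimator is applied to log squared returns, a signal-plus-noise transform whose finite-sample estimates can deviate noticeably from the true memory parameter (here d = 0.35); the bandwidth, sample size, and number of replications are arbitrary.

      import numpy as np
      from scipy.optimize import minimize_scalar

      def frac_integrate(eps, d):
          """(1 - L)^(-d) applied to white noise via the truncated MA expansion."""
          psi = np.empty(len(eps))
          psi[0] = 1.0
          for k in range(1, len(eps)):
              psi[k] = psi[k - 1] * (k - 1 + d) / k
          return np.convolve(psi, eps)[:len(eps)]

      def local_whittle(x, m):
          """Local Whittle memory estimate using the first m Fourier frequencies."""
          n = len(x)
          lam = 2 * np.pi * np.arange(1, m + 1) / n
          I = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
          obj = lambda d: np.log(np.mean(lam ** (2 * d) * I)) - 2 * d * np.mean(np.log(lam))
          return minimize_scalar(obj, bounds=(-0.49, 0.99), method="bounded").x

      rng = np.random.default_rng(8)
      n, d, ests = 2048, 0.35, []
      for _ in range(100):
          h = frac_integrate(rng.standard_normal(n), d)     # long-memory log-volatility
          r = np.exp(0.5 * h) * rng.standard_normal(n)      # stochastic volatility returns
          ests.append(local_whittle(np.log(r ** 2 + 1e-12), m=int(n ** 0.65)))
      ests = np.array(ests)
      print("mean estimate:", ests.mean(), "sd:", ests.std())   # compare with d = 0.35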
  5. By: Markku Lanne
    Abstract: A multiplicative error model with time-varying parameters and an error term following a mixture of gamma distributions is introduced. The model is fitted to the daily realized volatility series of Deutschemark/Dollar and Yen/Dollar returns and is shown to capture the conditional distribution of these variables better than the commonly used ARFIMA model. The forecasting performance of the new model is found to be, in general, superior to that of the set of volatility models recently considered by Andersen et al. (2003) for the same data.
    Keywords: Mixture model, Realized volatility, Gamma distribution
    JEL: C22 C52 C53 G15
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2006/3&r=ets
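    A sketch of the likelihood for a simplified version of such a model (two gamma components, each restricted here to unit mean, and a GARCH-type recursion for the conditional mean; the paper's specification is richer), fitted to a placeholder positive series rather than the realized-volatility data used in the paper:

      import numpy as np
      from scipy.stats import gamma
      from scipy.optimize import minimize

      def neg_loglik(params, v):
          """Negative log-likelihood of a simplified two-component gamma-mixture MEM:
          v_t = mu_t * eps_t,  mu_t = omega + alpha * v_{t-1} + beta * mu_{t-1},
          eps_t ~ p * Gamma(k1, 1/k1) + (1 - p) * Gamma(k2, 1/k2)  (unit-mean components)."""
          omega, alpha, beta, k1, k2, p = params
          mu = np.empty(len(v))
          mu[0] = v.mean()
          for t in range(1, len(v)):
              mu[t] = omega + alpha * v[t - 1] + beta * mu[t - 1]
          eps = v / mu
          dens = (p * gamma.pdf(eps, a=k1, scale=1.0 / k1)
                  + (1.0 - p) * gamma.pdf(eps, a=k2, scale=1.0 / k2)) / mu
          return -np.sum(np.log(dens + 1e-300))

      rng = np.random.default_rng(3)
      v_sim = rng.gamma(shape=2.0, scale=0.5, size=1000)   # placeholder positive series
      start = np.array([0.05, 0.2, 0.7, 2.0, 8.0, 0.5])    # omega, alpha, beta, k1, k2, p
      bounds = [(1e-4, None), (0.0, 1.0), (0.0, 1.0), (0.1, None), (0.1, None), (0.01, 0.99)]
      fit = minimize(neg_loglik, start, args=(v_sim,), bounds=bounds, method="L-BFGS-B")
      print(fit.x)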
  6. By: Anindya Banerjee; Josep Lluís Carrion-i-Silvestre
    Abstract: The power of standard panel cointegration statistics may be affected by misspecification errors if proper account is not taken of the presence of structural breaks in the data. We propose modifications to allow for one structural break when testing the null hypothesis of no cointegration that retain good properties in terms of empirical size and power. Response surfaces to approximate the finite sample moments that are required to implement the statistics are provided. Since panel cointegration statistics rely on the assumption of cross-section independence, a generalisation of the tests to the common factor framework is carried out in order to allow for dependence among the units of the panel.
    Keywords: Panel cointegration, structural break, common factors, cross-section dependence
    JEL: C12 C22
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2006/5&r=ets
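    The common-factor part of the argument can be sketched schematically (principal-components defactoring of a panel before residual-based cointegration testing; the break-adjusted statistics and response surfaces of the paper are not reproduced here, and the one-factor choice is illustrative):

      import numpy as np

      def defactor(panel, n_factors=1):
          """Remove estimated common factors from a T x N panel by principal components."""
          X = panel - panel.mean(axis=0)
          T = X.shape[0]
          _, eigvec = np.linalg.eigh(X @ X.T / (T * X.shape[1]))
          F = eigvec[:, -n_factors:] * np.sqrt(T)      # estimated factors (F'F/T = I)
          L = X.T @ F / T                              # estimated loadings
          return X - F @ L.T                           # idiosyncratic components

      rng = np.random.default_rng(4)
      T, N = 200, 20
      f = np.cumsum(rng.standard_normal(T))            # one nonstationary common factor
      y = np.outer(f, rng.standard_normal(N)) + np.cumsum(rng.standard_normal((T, N)), axis=0)
      y_idio = defactor(y)
      # cointegration tests would then be run unit by unit on the defactored data
      print(np.corrcoef(f, y[:, 0])[0, 1], np.corrcoef(f, y_idio[:, 0])[0, 1])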
  7. By: Christophe Hurlin (LEO - Laboratoire d'économie d'Orleans - [CNRS : UMR6221] - [Université d'Orléans]); Sessi Tokpavi (LEO - Laboratoire d'économie d'Orleans - [CNRS : UMR6221] - [Université d'Orléans])
    Abstract: This paper proposes a new test of Value at Risk (VaR) validation. Our test exploits the idea that the sequence of VaR violations (the Hit function) - taking the value 1-α if there is a violation and -α otherwise - for a nominal coverage rate α satisfies the properties of a martingale difference if the model used to quantify risk is adequate (Berkowitz et al., 2005). More precisely, we use the multivariate portmanteau statistic of Li and McLeod (1981), a multivariate extension of the Box and Pierce (1970) test, to jointly test for the absence of autocorrelation in the vector of Hit sequences for several coverage rates considered relevant for the management of extreme risks. We show that this shift to a multivariate dimension appreciably improves the power properties of the VaR validation test for reasonable sample sizes.
    Keywords: Value-at-Risk; Risk Management; Model Selection
    Date: 2006–05–11
    URL: http://d.repec.org/n?u=RePEc:hal:papers:halshs-00068384_v1&r=ets
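    A sketch of the ingredients: centered hit sequences at several coverage rates, and a Li-McLeod-type multivariate portmanteau statistic. The small-sample correction and degrees of freedom below follow the standard textbook form and should be checked against the paper before serious use; the toy data are i.i.d. Gaussian returns with a correctly specified Gaussian VaR, so the null should not be rejected.

      import numpy as np
      from scipy.stats import chi2, norm

      def hit_matrix(returns, var_forecasts, alphas):
          """Column for level a holds the centered hit 1{r_t < -VaR_t(a)} - a."""
          return np.column_stack(
              [(returns < -var_forecasts[a]).astype(float) - a for a in alphas])

      def multivariate_portmanteau(H, m=5):
          """Li-McLeod-type statistic on the K hit sequences:
          Q = n * sum_{l<=m} tr(C_l' C_0^{-1} C_l C_0^{-1}) + K^2 m (m+1) / (2n),
          compared with chi-square(K^2 m) under the null of no autocorrelation."""
          n, K = H.shape
          Hc = H - H.mean(axis=0)
          C0_inv = np.linalg.inv(Hc.T @ Hc / n)
          Q = 0.0
          for lag in range(1, m + 1):
              Cl = Hc[lag:].T @ Hc[:-lag] / n
              Q += np.trace(Cl.T @ C0_inv @ Cl @ C0_inv)
          Q = n * Q + K ** 2 * m * (m + 1) / (2.0 * n)
          return Q, chi2.sf(Q, df=K ** 2 * m)

      rng = np.random.default_rng(5)
      r = rng.standard_normal(1000)                                # i.i.d. N(0,1) returns
      alphas = (0.01, 0.05)
      var_fc = {a: np.full_like(r, -norm.ppf(a)) for a in alphas}  # correct Gaussian VaR
      print(multivariate_portmanteau(hit_matrix(r, var_fc, alphas)))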
  8. By: Lucia Alessi; Matteo Barigozzi; Marco Capasso
    Abstract: We propose a new model for multivariate forecasting which combines the Generalized Dynamic Factor Model (GDFM) and the GARCH model. The GDFM, applied to a huge number of series, captures the multivariate information and disentangles the common and the idiosyncratic part of each series of returns. In this financial application, both of these components are modeled as GARCH processes. We compare GDFM+GARCH and standard GARCH performance on samples of up to 475 series, predicting both levels and volatility of returns. While results on levels are not significantly different, on volatility the GDFM+GARCH model outperforms the standard GARCH in most cases. These results are robust with respect to different volatility proxies.
    Keywords: Dynamic Factors, GARCH, Volatility Forecasting
    Date: 2006–05–13
    URL: http://d.repec.org/n?u=RePEc:ssa:lemwps:2006/13&r=ets
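    A much-simplified sketch of the two-step idea: a static principal-component factor stands in for the frequency-domain GDFM, and a hand-rolled Gaussian quasi-ML GARCH(1,1) is fitted to the common and idiosyncratic components of one series, whose one-step-ahead variance forecasts are then summed. Everything here, including the one-factor choice and the toy panel, is illustrative rather than the authors' procedure.

      import numpy as np
      from scipy.optimize import minimize

      def garch11_path(r, omega, alpha, beta):
          """Conditional variance path of a GARCH(1,1), started at the sample variance."""
          h = np.empty(len(r))
          h[0] = r.var()
          for t in range(1, len(r)):
              h[t] = omega + alpha * r[t - 1] ** 2 + beta * h[t - 1]
          return h

      def fit_garch11(r):
          """Gaussian quasi-ML estimates (omega, alpha, beta) of a GARCH(1,1)."""
          def nll(p):
              h = garch11_path(r, *p)
              return 0.5 * np.sum(np.log(h) + r ** 2 / h)
          return minimize(nll, x0=[0.1 * r.var(), 0.05, 0.90], method="L-BFGS-B",
                          bounds=[(1e-8, None), (0.0, 0.999), (0.0, 0.999)]).x

      def forecast_var(r, params):
          """One-step-ahead conditional variance forecast."""
          omega, alpha, beta = params
          return omega + alpha * r[-1] ** 2 + beta * garch11_path(r, *params)[-1]

      rng = np.random.default_rng(6)
      T, N = 1000, 50
      R = rng.standard_normal((T, N)) + 0.5 * rng.standard_normal((T, 1))  # toy return panel
      Rc = R - R.mean(axis=0)
      w = np.linalg.eigh(np.cov(Rc, rowvar=False))[1][:, -1]  # loadings: leading eigenvector
      common = np.outer(Rc @ w, w)                            # common components
      idio = Rc - common                                      # idiosyncratic components

      i = 0
      h_next = (forecast_var(common[:, i], fit_garch11(common[:, i]))
                + forecast_var(idio[:, i], fit_garch11(idio[:, i])))
      print("one-step-ahead variance forecast for series 0:", h_next)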
  9. By: Don Harding; Adrian Pagan
    Abstract: Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in microeconometric studies. One important difference between primary and secondary binary variables is that while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing binary states determine the degree and nature of the dependence in those states. When constructed binary variables are used as regressands, a common mistake is to ignore this dependence by using a probit model. We present an alternative non-parametric method that allows for dependence and apply that method to the issue of using the yield spread to predict recessions.
    Keywords: Business cycle;binary variable;Markov chain;probit model;yield curve
    JEL: C22 C53 E32 E37
    Date: 2006
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:963&r=ets
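    A small sketch of the construction-induced dependence the abstract refers to, using the familiar "two consecutive declines" recession rule as one example of such a rule: even when growth rates are i.i.d., the constructed state S_t is serially correlated, so a probit that treats the observations as independent is misspecified.

      import numpy as np

      rng = np.random.default_rng(7)
      g = rng.standard_normal(200_000)                 # i.i.d. growth rates
      decline = g < 0
      S = (decline[1:] & decline[:-1]).astype(float)   # S_t = 1{g_t < 0 and g_{t-1} < 0}

      def autocorr(x, lag=1):
          x = x - x.mean()
          return np.sum(x[lag:] * x[:-lag]) / np.sum(x * x)

      print("P(S_t = 1)       :", S.mean())            # about 0.25
      print("corr(S_t, S_t-1) :", autocorr(S))         # about 1/3, despite i.i.d. inputs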

This nep-ets issue is ©2006 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.