
on Econometric Time Series 
By:  Violetta Dalla; Liudas Giraitis; Javier Hidalgo 
Abstract:  For linear processes, semiparametric estimation of the memory parameter, based on the log-periodogram and local Whittle estimators, has been exhaustively examined and their properties are well established. However, except for some specific cases, little is known about the estimation of the memory parameter for nonlinear processes. The purpose of this paper is to provide general conditions under which the local Whittle estimator of the memory parameter of a stationary process is consistent and to examine its rate of convergence. We show that these conditions are satisfied for linear processes and a wide class of nonlinear models, among others, signal plus noise processes, nonlinear transforms of a Gaussian process and EGARCH models. Special cases where the estimator satisfies the central limit theorem are discussed. The finite sample performance of the estimator is investigated in a small Monte Carlo study. 
Keywords:  Long memory, semiparametric estimation, local Whittle estimator. 
JEL:  C14 C22 
Date:  2006–01 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/497&r=ets 
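The local Whittle estimator described above minimizes a concentrated Whittle objective over the first m periodogram ordinates near frequency zero. The following is a minimal sketch of that standard estimator; the grid search, the bandwidth value and the white-noise example are our own illustrative choices, not the authors' implementation:

```python
import numpy as np

def local_whittle(x, m):
    """Local Whittle estimate of the memory parameter d, using the
    first m periodogram ordinates (simple grid search over d)."""
    n = len(x)
    lam = 2.0 * np.pi * np.arange(1, m + 1) / n                   # Fourier frequencies
    I = np.abs(np.fft.fft(x)[1:m + 1]) ** 2 / (2.0 * np.pi * n)   # periodogram
    grid = np.linspace(-0.49, 0.99, 1481)                         # candidate d values
    # Concentrated objective R(d) = log(mean(lam^{2d} * I)) - 2d * mean(log lam)
    G = np.log(np.mean(lam ** (2.0 * grid[:, None]) * I, axis=1))
    R = G - 2.0 * grid * np.mean(np.log(lam))
    return grid[np.argmin(R)]

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)       # white noise: true memory parameter d = 0
d_hat = local_whittle(x, m=250)
print(round(d_hat, 2))
```

For white noise the estimate should sit near zero, with sampling variability on the order of 1/(2*sqrt(m)).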
By:  Javier Hualde; Peter M Robinson 
Abstract:  Empirical evidence has emerged of the possibility of fractional cointegration such that the gap, β, between the integration order d of observable time series and the integration order γ of cointegrating errors is less than 0.5. This includes circumstances when observables are stationary or asymptotically stationary with long memory (so d < 1/2), and when they are nonstationary (so d ≥ 1/2). This "weak cointegration" contrasts strongly with the traditional econometric prescription of unit root observables and short memory cointegrating errors, where β = 1. Asymptotic inferential theory also differs from this case, and from other members of the class β > 1/2; in particular, √n-consistent and asymptotically normal estimation of the cointegrating vector ν is possible when β < 1/2, as we explore in a simple bivariate model. The estimate depends on γ and d or, more realistically, on estimates of unknown γ and d. These latter estimates need to be √n-consistent, and the asymptotic distribution of the estimate of ν is sensitive to their precise form. We propose estimates of γ and d that are computationally relatively convenient, relying on only univariate nonlinear optimization. Finite sample performance of the methods is examined by means of Monte Carlo simulations, and several applications to empirical data are included. 
Keywords:  Fractional cointegration, Parametric estimation, Asymptotic normality. 
JEL:  C32 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/499&r=ets 
By:  M. Gerolimetto; Peter M Robinson 
Abstract:  Instrumental variables estimation is classically employed to avoid simultaneous equations bias in a stable environment. Here we use it to improve upon ordinary least squares estimation of cointegrating regressions between nonstationary and/or long memory stationary variables where the integration orders of regressor and disturbance sum to less than 1, as happens always for stationary regressors, and sometimes for mean-reverting nonstationary ones. Unlike in the classical situation, instruments can be correlated with disturbances and/or uncorrelated with regressors. The approach can also be used in traditional non-fractional cointegrating relations. Various choices of instrument are proposed. Finite sample performance is examined. 
Keywords:  Cointegration, Instrumental variables estimation, I(d) processes. 
JEL:  C32 
Date:  2006–04 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/500&r=ets 
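The instrumental variables idea above can be illustrated in its simplest just-identified form, where a lagged regressor serves as the instrument. The data-generating process below is a toy example of our own, not one of the instrument choices proposed in the paper:

```python
import numpy as np

def iv_estimate(y, X, Z):
    """Just-identified instrumental variables estimator b = (Z'X)^{-1} Z'y."""
    return np.linalg.solve(Z.T @ X, Z.T @ y)

rng = np.random.default_rng(0)
n = 1000
x = np.cumsum(rng.standard_normal(n))    # I(1) regressor (random walk)
u = rng.standard_normal(n)               # short-memory disturbance
y = 2.0 * x + u                          # cointegrating relation with true slope 2
# Lagged regressor as instrument (one of many conceivable choices)
b = iv_estimate(y[1:], x[1:, None], x[:-1, None])
print(round(b[0], 2))
```

Because the regression is cointegrating, the slope estimate converges quickly and should be very close to 2 even in this modest sample.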
By:  Afonso Gonçalves da Silva; Peter M Robinson 
Abstract:  Nonlinear functions of multivariate financial time series can exhibit long memory and fractional cointegration. However, tools for analysing these phenomena have principally been justified under assumptions that are invalid in this setting. Determination of asymptotic theory under more plausible assumptions can be complicated and lengthy. We discuss these issues and present a Monte Carlo study, showing that asymptotic theory should not necessarily be expected to provide a good approximation to finite-sample behaviour. 
Keywords:  Fractional cointegration, memory estimation, stochastic volatility. 
JEL:  C32 
Date:  2006–04 
URL:  http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/501&r=ets 
By:  Markku Lanne 
Abstract:  A multiplicative error model with time-varying parameters and an error term following a mixture of gamma distributions is introduced. The model is fitted to the daily realized volatility series of Deutschemark/Dollar and Yen/Dollar returns and is shown to capture the conditional distribution of these variables better than the commonly used ARFIMA model. The forecasting performance of the new model is found to be, in general, superior to that of the set of volatility models recently considered by Andersen et al. (2003) for the same data. 
Keywords:  Mixture model, Realized volatility, Gamma distribution 
JEL:  C22 C52 C53 G15 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2006/3&r=ets 
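A multiplicative error model of the kind described can be sketched as follows. The recursion for the conditional mean and all parameter values are illustrative assumptions for a generic MEM with a unit-mean gamma-mixture error, not the paper's fitted specification:

```python
import numpy as np

def simulate_mem(n, omega=0.05, a=0.3, b=0.6, p=0.9, k1=4.0, k2=0.5, seed=0):
    """Simulate a multiplicative error model x_t = mu_t * eps_t with
    mu_t = omega + a*x_{t-1} + b*mu_{t-1}, and eps_t a unit-mean
    two-component gamma mixture (all parameter values illustrative)."""
    rng = np.random.default_rng(seed)
    # Gamma(shape k, scale 1/k) has mean 1, so the mixture also has mean 1
    comp = rng.random(n) < p
    eps = np.where(comp,
                   rng.gamma(k1, 1.0 / k1, n),
                   rng.gamma(k2, 1.0 / k2, n))
    x = np.empty(n)
    mu = omega / (1.0 - a - b)   # start at the unconditional mean
    for t in range(n):
        x[t] = mu * eps[t]
        mu = omega + a * x[t] + b * mu
    return x

x = simulate_mem(50_000)
print(round(x.mean(), 2))   # close to omega / (1 - a - b) = 0.5
```

Since E[eps_t] = 1, the unconditional mean of x_t is omega/(1-a-b), which the simulated series should approximate.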
By:  Anindya Banerjee; Josep Lluís CarrioniSilvestre 
Abstract:  The power of standard panel cointegration statistics may be affected by misspecification errors if proper account is not taken of the presence of structural breaks in the data. We propose modifications to allow for one structural break when testing the null hypothesis of no cointegration that retain good properties in terms of empirical size and power. Response surfaces to approximate the finite sample moments that are required to implement the statistics are provided. Since panel cointegration statistics rely on the assumption of cross-section independence, a generalisation of the tests to the common factor framework is carried out in order to allow for dependence among the units of the panel. 
Keywords:  Panel cointegration, structural break, common factors, cross-section dependence 
JEL:  C12 C22 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2006/5&r=ets 
By:  Christophe Hurlin (LEO, Laboratoire d'économie d'Orléans, CNRS UMR 6221, Université d'Orléans); Sessi Tokpavi (LEO, Laboratoire d'économie d'Orléans, CNRS UMR 6221, Université d'Orléans) 
Abstract:  This paper proposes a new test of Value at Risk (VaR) validation. Our test exploits the idea that the sequence of VaR violations (Hit function), taking value 1−α if there is a violation and −α otherwise for a nominal coverage rate α, verifies the properties of a martingale difference if the model used to quantify risk is adequate (Berkowitz et al., 2005). More precisely, we use the multivariate Portmanteau statistic of Li and McLeod (1981), an extension to the multivariate framework of the test of Box and Pierce (1970), to jointly test the absence of autocorrelation in the vector of Hit sequences for various coverage rates considered relevant for the management of extreme risks. We show that this shift to a multivariate dimension appreciably improves the power properties of the VaR validation test for reasonable sample sizes. 
Keywords:  Value-at-Risk; Risk Management; Model Selection 
Date:  2006–05–11 
URL:  http://d.repec.org/n?u=RePEc:hal:papers:halshs00068384_v1&r=ets 
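The hit sequences and the multivariate Portmanteau statistic can be sketched as follows. The normal-quantile VaR forecasts and the lag choice are our own illustrative assumptions; under the null of correct VaR forecasts the statistic is approximately chi-squared with m²K degrees of freedom, where m is the number of coverage rates and K the number of lags:

```python
import numpy as np

def hit_sequence(losses, var_forecasts, alpha):
    """Demeaned VaR violation indicator: 1 - alpha on a violation, -alpha otherwise."""
    return (losses > var_forecasts).astype(float) - alpha

def li_mcleod(H, K):
    """Multivariate Portmanteau statistic of Li and McLeod (1981) on the
    (n x m) matrix of hit sequences H, testing zero autocorrelation at
    lags 1..K. Approximately chi-squared(m*m*K) under the null."""
    n, m = H.shape
    Hc = H - H.mean(axis=0)
    C0 = Hc.T @ Hc / n                       # lag-0 covariance
    C0inv = np.linalg.inv(C0)
    Q = 0.0
    for k in range(1, K + 1):
        Ck = Hc[k:].T @ Hc[:-k] / n          # lag-k cross-covariance
        Q += np.trace(Ck.T @ C0inv @ Ck @ C0inv)
    return n * Q + m * m * K * (K + 1) / (2.0 * n)

rng = np.random.default_rng(1)
losses = rng.standard_normal(1000)
# Correct VaR forecasts for N(0,1) losses at 1% and 5% coverage
H = np.column_stack([hit_sequence(losses, np.full(1000, 2.326), 0.01),
                     hit_sequence(losses, np.full(1000, 1.645), 0.05)])
Q = li_mcleod(H, K=5)
print(round(Q, 1))
```

With m = 2 coverage rates and K = 5 lags, a well-specified model yields a statistic comparable to a chi-squared draw with 20 degrees of freedom; large values signal autocorrelated violations and hence an inadequate risk model.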
By:  Lucia Alessi; Matteo Barigozzi; Marco Capasso 
Abstract:  We propose a new model for multivariate forecasting which combines the Generalized Dynamic Factor Model (GDFM) and the GARCH model. The GDFM, applied to a huge number of series, captures the multivariate information and disentangles the common and the idiosyncratic part of each series of returns. In this financial analysis, both of these components are modeled as GARCH processes. We compare GDFM+GARCH and standard GARCH performance on samples up to 475 series, predicting both levels and volatility of returns. While results on levels are not significantly different, on volatility the GDFM+GARCH model outperforms the standard GARCH in most cases. These results are robust with respect to different volatility proxies. 
Keywords:  Dynamic Factors, GARCH, Volatility Forecasting 
Date:  2006–05–13 
URL:  http://d.repec.org/n?u=RePEc:ssa:lemwps:2006/13&r=ets 
By:  Don Harding; Adrian Pagan 
Abstract:  Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ, in terms of their statistical properties, from the primary random variables used in microeconometric studies. One important difference between primary and secondary binary variables is that while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing binary states determine the degree and nature of the dependence in those states. When using constructed binary variables as regressands, a common mistake is to ignore the dependence by using a probit model. We present an alternative nonparametric method that allows for dependence and apply that method to the issue of using the yield spread to predict recessions. 
Keywords:  Business cycle; binary variable; Markov chain; probit model; yield curve 
JEL:  C22 C53 E32 E37 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:mlb:wpaper:963&r=ets 
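The dependence induced by construction rules can be seen with even the crudest dating rule. The two-consecutive-declines rule below is a generic illustration of the phenomenon, not one of the specific rules analyzed in the paper:

```python
import numpy as np

def binary_states(y):
    """Illustrative dating rule: mark S_t = 1 (contraction) when the level
    series y has fallen for two consecutive periods. Rules of this kind
    build serial dependence into S_t even when growth itself is i.i.d."""
    dy = np.diff(y)
    down = dy < 0
    S = np.zeros(len(dy), dtype=int)
    S[1:] = (down[1:] & down[:-1]).astype(int)
    return S

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(100_000))   # i.i.d. (symmetric) growth
S = binary_states(y)
# First-order serial correlation of the constructed binary states
rho = np.corrcoef(S[1:], S[:-1])[0, 1]
print(round(rho, 2))
```

Even though the growth increments are independent, consecutive states share one common decline indicator, so the constructed series has first-order autocorrelation of 1/3 in this case, which is exactly the kind of dependence a probit regression on such states would ignore.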