
New Economics Papers on Econometric Time Series 
By:  MOON, Hyungsik Roger; PERRON, Benoit 
Abstract:  Most panel unit root tests are designed to test the joint null hypothesis of a unit root for each individual series in a panel. After a rejection, it will often be of interest to identify which series can be deemed to be stationary and which series can be deemed nonstationary. Researchers will sometimes carry out this classification on the basis of n individual (univariate) unit root tests based on some ad hoc significance level. In this paper, we demonstrate how to use the false discovery rate (FDR) in evaluating I(1)/I(0) classifications based on individual unit root tests when the size of the cross section (n) and time series (T) dimensions are large. We report results from a simulation experiment and illustrate the methods on two data sets. 
Keywords:  False discovery rate, Multiple testing, unit root tests, panel data 
JEL:  C32 C33 C44 
Date:  2010 
URL:  http://d.repec.org/n?u=RePEc:mtl:montde:201004&r=ets 
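The I(1)/I(0) classification idea above can be illustrated with the standard Benjamini-Hochberg step-up procedure, one common way to control the FDR across n individual tests. This is a sketch only: the p-values below are hypothetical, and Moon and Perron's own procedure may differ in detail.

```python
import numpy as np

def bh_fdr_classify(pvals, q=0.05):
    """Benjamini-Hochberg step-up procedure: reject the unit-root null
    for every series whose sorted p-value falls under the BH line q*i/n."""
    pvals = np.asarray(pvals, dtype=float)
    n = pvals.size
    order = np.argsort(pvals)
    thresh = q * np.arange(1, n + 1) / n
    below = pvals[order] <= thresh
    # step-up: reject all hypotheses up to the largest index under the line
    k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
    reject = np.zeros(n, dtype=bool)
    reject[order[:k]] = True
    return reject  # True -> series classified as I(0) (stationary)

# hypothetical p-values from n = 8 individual unit root tests
p = [0.001, 0.004, 0.019, 0.03, 0.30, 0.47, 0.62, 0.81]
labels = bh_fdr_classify(p, q=0.05)
print(labels)
```

Note that the step-up rule controls the expected proportion of false I(0) classifications at level q, rather than applying one ad hoc per-test significance level.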
By:  Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES) 
Abstract:  We discuss the moment condition for the fractional functional central limit theorem (FCLT) for partial sums of x_{t}=Δ^{-d}u_{t}, where d ∈ (-1/2,1/2) is the fractional integration parameter and u_{t} is weakly dependent. The classical condition is existence of q>max(2,(d+1/2)^{-1}) moments of the innovation sequence. When d is close to -1/2 this moment condition is very strong. Our main result is to show that under some relatively weak conditions on u_{t}, the existence of q≥max(2,(d+1/2)^{-1}) moments is in fact necessary for the FCLT for fractionally integrated processes and that q>max(2,(d+1/2)^{-1}) moments are necessary and sufficient for more general fractional processes. Davidson and de Jong (2000) presented a fractional FCLT where only q>2 finite moments are assumed, which is remarkable because it is the only FCLT where the moment condition has been weakened relative to the earlier condition. As a corollary to our main theorem we show that their moment condition is not sufficient. 
Keywords:  Fractional integration, functional central limit theorem, long memory, moment condition, necessary condition 
JEL:  C22 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1244&r=ets 
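A minimal sketch of a (type II) fractionally integrated process x_{t}=Δ^{-d}u_{t} and the classical moment bound. The i.i.d. Gaussian innovations and the truncated-convolution construction are illustrative choices, not the paper's setting, which allows weakly dependent u_{t}.

```python
import numpy as np

def frac_coeffs(d, n):
    """MA coefficients of (1-L)^{-d}: psi_0 = 1, psi_j = psi_{j-1}*(j-1+d)/j."""
    psi = np.empty(n)
    psi[0] = 1.0
    for j in range(1, n):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    return psi

def frac_integrate(u, d):
    """x_t = Delta^{-d} u_t as a truncated convolution (type II process)."""
    n = u.size
    psi = frac_coeffs(d, n)
    return np.array([psi[:t + 1][::-1] @ u[:t + 1] for t in range(n)])

rng = np.random.default_rng(0)
u = rng.standard_normal(512)      # illustrative i.i.d. innovations
x = frac_integrate(u, d=0.4)

# classical moment condition: q > max(2, (d + 1/2)^(-1)) moments of u_t;
# the bound blows up as d approaches -1/2
q_min = max(2.0, 1.0 / (0.4 + 0.5))
```

For d = 0.4 the classical bound is just max(2, 1/0.9) = 2, whereas for d = -0.4 it is 1/0.1 = 10 moments, which is the sense in which the condition becomes very strong near d = -1/2.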
By:  Søren Johansen (Department of Economics, University of Copenhagen) 
Abstract:  There are simple well-known conditions for the validity of regression and correlation as statistical tools. We analyse by examples the effect of nonstationarity on inference using these methods and compare them to model-based inference. Finally, we analyse some data on annual mean temperature and sea level by applying the cointegrated vector autoregressive model, which explicitly takes into account the nonstationarity of the variables. 
Keywords:  Regression; correlation; cointegration; model-based inference; likelihood inference; annual mean temperature; sea level 
JEL:  C32 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:kud:kuiedp:1027&r=ets 
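The pitfall the abstract alludes to, applying regression and correlation to nonstationary series, can be sketched with two independent random walks: OLS and the sample correlation can suggest a relationship where none exists. This is a toy illustration, not the paper's temperature/sea-level analysis.

```python
import numpy as np

rng = np.random.default_rng(3)
T = 500
x = np.cumsum(rng.standard_normal(T))   # random walk, no relation to y
y = np.cumsum(rng.standard_normal(T))   # independent random walk

# naive OLS slope and R^2 of y on x, computed as if both were stationary
b = np.cov(x, y)[0, 1] / np.var(x)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(b, r2)
```

Because both series wander, the sample correlation does not converge to zero as it would for independent stationary series; this is the classic spurious-regression problem that motivates modeling the nonstationarity explicitly, e.g. with a cointegrated VAR.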
By:  Søren Johansen (Department of Economics, University of Copenhagen) 
Abstract:  This paper contains an overview of some recent results on the statistical analysis of cofractional processes, see Johansen and Nielsen (2010b). We first give a brief summary of the analysis of cointegration in the vector autoregressive model and then show how this can be extended to fractional processes. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b≥0; that is, there exist vectors β for which β′X_{t} is fractional of order d-b. We analyse the Gaussian likelihood function to derive estimators and test statistics. The asymptotic properties are derived without the Gaussian assumption, under suitable moment conditions. We assume that the initial values are bounded and show that they do not influence the asymptotic analysis. The estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. The asymptotic distribution of the likelihood ratio test for cointegration rank is a functional of fractional Brownian motion. 
Keywords:  cofractional processes; cointegration rank; fractional cointegration; likelihood inference; vector autoregressive model 
JEL:  C32 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:kud:kuiedp:1028&r=ets 
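The cofractional idea, that some linear combination β′X_{t} has lower fractional order than X_{t} itself, can be sketched in the familiar special case d = b = 1 (standard cointegration). The simulated data and the particular β below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 1000
w = np.cumsum(rng.standard_normal(T))        # common I(1) trend (order d = 1)
x1 = w + rng.standard_normal(T)              # both components are I(1) ...
x2 = 0.5 * w + rng.standard_normal(T)
X = np.column_stack([x1, x2])

beta = np.array([1.0, -2.0])                 # beta' X_t eliminates the trend
z = X @ beta                                  # order d - b = 0, i.e. I(0)

# the combination has bounded variance while each component wanders
print(np.var(z), np.var(x1))
```

In the fractional model the same structure holds with non-integer orders: X_{t} is fractional of order d, while β′X_{t} is fractional of the lower order d-b.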
By:  André A. Monteiro 
Abstract:  This paper considers the problem of estimating a linear univariate time series state space model for which the shape of the distribution of the observation noise is not specified a priori. Although somewhat challenging computationally, the simultaneous estimation of the parameters of the model and the unknown observation noise density is made feasible through a combination of Gaussian-sum filtering and smoothing algorithms and kernel density estimation methods. The bottleneck in these calculations is avoiding the geometric increase, over time, in the number of simultaneous Kalman filter components. The aim of this paper is to show that this can be achieved by the use of standard techniques from cluster analysis and unsupervised classification. An empirical illustration of this new methodology is included; it consists in the application of a semiparametric version of the local level model to the analysis of the well-known river Nile data series. 
Keywords:  Clustering, Gaussian-sum, Kernel methods, Signal extraction, State space models 
JEL:  C13 C14 C22 
Date:  2010–09 
URL:  http://d.repec.org/n?u=RePEc:cte:wsrepe:ws103418&r=ets 
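The component-reduction step described above, collapsing a geometrically growing Gaussian mixture back to a fixed number of components, can be sketched as follows. The crude 1-D grouping by sorted means stands in for the paper's cluster-analysis step; the moment matching preserves each group's overall mean and variance.

```python
import numpy as np

def reduce_mixture(weights, means, variances, k):
    """Collapse a 1-D Gaussian mixture to k components: group components
    with nearby means, then moment-match each group to a single Gaussian."""
    order = np.argsort(means)
    groups = np.array_split(order, k)    # naive clustering by sorted mean
    w_new, m_new, v_new = [], [], []
    for g in groups:
        w = weights[g].sum()
        m = (weights[g] * means[g]).sum() / w
        # matched variance = within-component variance + spread of the means
        v = (weights[g] * (variances[g] + (means[g] - m) ** 2)).sum() / w
        w_new.append(w); m_new.append(m); v_new.append(v)
    return np.array(w_new), np.array(m_new), np.array(v_new)

# four components that visibly form two clusters (illustrative numbers)
w, m, v = reduce_mixture(np.full(4, 0.25),
                         np.array([0.0, 0.1, 5.0, 5.1]),
                         np.ones(4), k=2)
print(w, m, v)
```

In a Gaussian-sum filter this reduction would be applied after each time step, keeping the number of parallel Kalman filter components bounded instead of letting it grow geometrically.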
By:  Christophe Chorro (Centre d'Economie de la Sorbonne - Paris School of Economics); Dominique Guegan (Centre d'Economie de la Sorbonne - Paris School of Economics); Florian Ielpo (Pictet Asset Management and Centre d'Economie de la Sorbonne) 
Abstract:  This article discusses the finite distance properties of three likelihood-based estimation strategies for GARCH processes with non-Gaussian conditional distributions: (1) the maximum likelihood approach; (2) the quasi-maximum likelihood approach; (3) a multi-step recursive estimation approach (REC). We first run a Monte Carlo test which shows that the recursive method may be the most relevant approach for estimation purposes. We then turn to a sample of S&P 500 returns. We confirm that the REC estimates statistically dominate the parameters estimated by the two other competing methods. Regardless of the selected model, the REC estimates deliver the most stable results. 
Keywords:  Maximum likelihood method, related GARCH process, recursive estimation method, mixture of Gaussian distributions, generalized hyperbolic distributions, S&P 500. 
JEL:  G13 C22 
Date:  2010–07 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:10067&r=ets 
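A sketch of the Gaussian quasi-likelihood objective for a plain GARCH(1,1), the building block behind approaches (1) and (2). The parameter values, the optimizer choice, and the Gaussian simulation are all illustrative; this is not the paper's recursive REC scheme or its non-Gaussian conditional distributions.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_qml_loss(params, r):
    """Negative Gaussian quasi-log-likelihood for a GARCH(1,1):
    sigma2_t = omega + alpha * r_{t-1}^2 + beta * sigma2_{t-1}."""
    omega, alpha, beta = params
    if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
        return np.inf                      # stationarity/positivity region
    T = r.size
    s2 = np.empty(T)
    s2[0] = r.var()                        # common initialization choice
    for t in range(1, T):
        s2[t] = omega + alpha * r[t - 1] ** 2 + beta * s2[t - 1]
    return 0.5 * np.sum(np.log(s2) + r ** 2 / s2)

# simulate a GARCH(1,1) path and refit it (hypothetical parameter values)
rng = np.random.default_rng(2)
T, omega, alpha, beta = 2000, 0.1, 0.08, 0.9
r = np.empty(T)
s2 = omega / (1 - alpha - beta)            # start at unconditional variance
for t in range(T):
    r[t] = np.sqrt(s2) * rng.standard_normal()
    s2 = omega + alpha * r[t] ** 2 + beta * s2

res = minimize(garch11_qml_loss, x0=[0.05, 0.1, 0.8], args=(r,),
               method="Nelder-Mead")
print(res.x)
```

QML maximizes this Gaussian criterion even when the true conditional distribution is non-Gaussian; full ML would replace the Gaussian density with the assumed innovation density, which is where the two strategies compared in the paper diverge.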
By:  Eric Hillebrand (Department of Economics, Louisiana State University); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio, mcm@econ.pucrio.br) 
Abstract:  We study the simultaneous occurrence of long memory and nonlinear effects, such as structural breaks and thresholds, in autoregressive moving average (ARMA) time series models and apply our modeling framework to series of daily realized volatility. Asymptotic theory for the quasi-maximum likelihood estimator is developed and a sequence of model specification tests is described. Our framework allows for general nonlinear functions, including smoothly changing intercepts. The theoretical results in the paper can be applied to any series with long memory and nonlinearity. We apply the methodology to realized volatility of individual stocks of the Dow Jones Industrial Average during the period 1995 to 2005. We find strong evidence of nonlinear effects and explore different specifications of the model framework. A forecasting exercise demonstrates that allowing for nonlinearities in long memory models yields significant performance gains. 
Keywords:  Realized volatility, structural breaks, smooth transitions, nonlinear models, long memory, persistence. 
Date:  2010–10 
URL:  http://d.repec.org/n?u=RePEc:rio:texdis:578&r=ets 
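A smoothly changing intercept of the kind mentioned in the abstract is commonly modeled with a logistic transition function in time. The function name and parameter values below are illustrative, not the paper's exact specification.

```python
import numpy as np

def smooth_transition_intercept(t, T, c, gamma, a0, a1):
    """Logistic smooth-transition intercept: shifts from a0 toward a0 + a1
    around sample fraction c, with speed controlled by gamma."""
    G = 1.0 / (1.0 + np.exp(-gamma * (t / T - c)))   # transition in [0, 1]
    return a0 + a1 * G

# illustrative values: break near mid-sample (c = 0.5), fairly sharp (gamma = 50)
T = 100
levels = [smooth_transition_intercept(t, T, c=0.5, gamma=50.0, a0=1.0, a1=2.0)
          for t in range(T + 1)]
```

Large gamma makes the transition approach a discrete structural break, while small gamma gives a slow drift; this nesting is one reason smooth transitions are a convenient way to capture break-like nonlinearity alongside long memory.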