
on Econometric Time Series 
By:  Mario Forni (Università di Modena e Reggio Emilia, CEPR and RECent); Marc Hallin (ECARES, Université Libre de Bruxelles and ORFE, Princeton University); Marco Lippi (Università di Roma "La Sapienza" and EIEF); Paolo Zaffaroni (Imperial College London and Università di Roma "La Sapienza") 
Abstract:  Factor model methods have recently become extremely popular in the theory and practice of large panels of time series data. Those methods rely on various factor models, all of which are particular cases of the Generalized Dynamic Factor Model (GDFM) introduced in Forni, Hallin, Lippi and Reichlin (2000). In that paper, however, estimation relies on Brillinger's concept of dynamic principal components, which produces filters that are in general two-sided and therefore yield poor performance at the end of the observation period and can hardly be used for forecasting purposes. In the present paper, we remedy this problem and show how, based on recent results on singular stationary processes with rational spectra, one-sided estimators are possible for the parameters and the common shocks in the GDFM. Consistency is obtained, along with rates. An empirical section, based on US macroeconomic time series, compares estimates based on our model with those based on the usual static-representation restriction, and provides convincing evidence that the assumptions underlying the latter are not supported by the data. 
Keywords:  Generalized dynamic factor models; vector processes with singular spectral density; one-sided representations for dynamic factor models; consistency and rates for estimators of dynamic factor models 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:sas:wpaper:20115&r=ets 
By:  Klein, Ingo; Tinkl, Fabian 
Abstract:  Zhang (2008) defines the quotient correlation coefficient to test for dependence and tail dependence in bivariate random samples. He shows that the test statistics are asymptotically gamma distributed, and therefore calls the corresponding test the gamma test. We investigate the speed of convergence in a simulation study. Zhang also discusses a rank-based version of this gamma test that depends on random numbers drawn from a standard Fréchet distribution. We propose an alternative that does not depend on random numbers. We compare the size and power of this alternative with those of the well-known t-test, the van der Waerden test and the Spearman rank test. Zhang proposes his gamma test also for situations where the dependence is neither strictly increasing nor strictly decreasing. In contrast, we show that the quotient correlation coefficient can only measure monotone patterns of dependence.  
Keywords:  test on dependence, rank correlation test, Spearman's ρ, copula, Lehmann ordering 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:zbw:faucse:872010&r=ets 
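For readers who want to experiment, the quotient correlation coefficient can be sketched in a few lines of Python. The formula below is the commonly quoted max-ratio form for samples with unit Fréchet margins; treat it as an assumption and consult Zhang (2008) for the authoritative definition and for the gamma limit of the associated test statistic.

```python
import numpy as np

def quotient_correlation(x, y):
    # Commonly quoted form for positive samples with unit Frechet
    # margins; verify against Zhang (2008) before serious use.
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a = np.max(y / x)  # max_i Y_i / X_i
    b = np.max(x / y)  # max_i X_i / Y_i
    return (a + b - 2.0) / (a * b - 1.0)

rng = np.random.default_rng(0)
# unit Frechet margins via inverse transform: F^{-1}(u) = -1/log(u)
x = -1.0 / np.log(rng.uniform(size=1000))
y = -1.0 / np.log(rng.uniform(size=1000))
q_indep = quotient_correlation(x, y)  # large max ratios -> near zero
q_dep = quotient_correlation(x, x * np.exp(0.01 * rng.normal(size=1000)))
```

Under independence both max ratios are large, driving the coefficient toward zero; for nearly identical samples both max ratios are close to one and the coefficient approaches one.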
By:  Xiangjin Shen (Rutgers University); Hiroki Tsurumi (Rutgers University) 
Abstract:  We compare Bayesian and sample theory model specification criteria. For the Bayesian criteria we use the deviance information criterion (DIC) and the cumulative density of the mean squared errors of forecast. For the sample theory criterion we use the conditional Kolmogorov test (CKT). We use Markov chain Monte Carlo methods to obtain the Bayesian criteria and bootstrap sampling to obtain the conditional Kolmogorov test. The two non-nested models we consider are the CIR and Vasicek models for spot asset prices. Monte Carlo experiments show that the DIC performs better than the cumulative density of the mean squared errors of forecast and the CKT. According to the DIC and the mean squared errors of forecast, the CIR model explains the daily data on the uncollateralized Japanese call rate from January 1, 1990 to April 18, 1996; but according to the CKT, neither the CIR nor the Vasicek model explains the daily data. 
Keywords:  Deviance information criterion, Markov chain Monte Carlo algorithms, Block bootstrap, Conditional Kolmogorov test, CIR and Vasicek models 
JEL:  C1 C5 G0 
Date:  2011–06–07 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201126&r=ets 
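As background for readers unfamiliar with the two competing models, a minimal Euler-Maruyama simulation of Vasicek and CIR short-rate paths can be sketched as follows (Python; the parameter values are purely illustrative and unrelated to the paper's estimates):

```python
import numpy as np

def simulate_short_rate(r0, kappa, theta, sigma, dt, n, model="vasicek", seed=0):
    # Vasicek: dr = kappa*(theta - r) dt + sigma dW
    # CIR:     dr = kappa*(theta - r) dt + sigma*sqrt(r) dW
    rng = np.random.default_rng(seed)
    r = np.empty(n + 1)
    r[0] = r0
    for t in range(n):
        dw = rng.normal(scale=np.sqrt(dt))
        diff = sigma * (np.sqrt(max(r[t], 0.0)) if model == "cir" else 1.0)
        r[t + 1] = r[t] + kappa * (theta - r[t]) * dt + diff * dw
    return r

# roughly six years of daily observations under each model
path_vasicek = simulate_short_rate(0.05, 0.5, 0.04, 0.02, 1 / 250, 1500)
path_cir = simulate_short_rate(0.05, 0.5, 0.04, 0.10, 1 / 250, 1500, model="cir")
```

The key difference is the square-root diffusion term, which shrinks volatility as the rate approaches zero and keeps CIR paths nonnegative (up to discretization error, handled here by truncation).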
By:  Cristina FuentesAlbero (Rutgers, The State University of New Jersey); Leonardo Melosi (London Business School) 
Abstract:  We introduce two new methods for estimating the Marginal Data Density (MDD) from the Gibbs output, both based on exploiting an analytical tractability condition. Such a condition requires that some parameter blocks can be analytically integrated out of the conditional posterior densities. Our estimators are applicable to densely parameterized time series models such as VARs or DFMs. An empirical application to six-variate VAR models shows that the bias of a fully computational estimator is sufficiently large to distort the implied model rankings. One estimator is fast enough to make multiple computations of MDDs in densely parameterized models feasible. 
Keywords:  Marginal likelihood, Gibbs sampler, time series econometrics, Bayesian econometrics, reciprocal importance sampling 
JEL:  C11 C15 C16 
Date:  2011–10–17 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201131&r=ets 
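The reciprocal importance sampling idea mentioned in the keywords can be illustrated on a toy conjugate model where the MDD is known in closed form. The sketch below (Python, with the weighting density chosen as a normal fit to the posterior draws, in the spirit of Gelfand-Dey) is an assumption-laden illustration of the generic principle, not the authors' estimators:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 50
y = rng.normal(0.3, 1.0, size=n)

# Conjugate toy model: y_i ~ N(mu, 1), mu ~ N(0, 1)
v = 1.0 / (n + 1)  # posterior variance
m = y.sum() * v    # posterior mean

# Exact log MDD: marginally y ~ N(0, I + 11')
logmdd_exact = stats.multivariate_normal.logpdf(
    y, mean=np.zeros(n), cov=np.eye(n) + np.ones((n, n)))

# Posterior draws (stand-in for Gibbs output)
draws = rng.normal(m, np.sqrt(v), size=10000)

# Reciprocal importance sampling: E_post[f(mu)/(lik*prior)] = 1/MDD
f_logpdf = stats.norm.logpdf(draws, draws.mean(), draws.std())
log_lik = (-0.5 * ((y[None, :] - draws[:, None]) ** 2).sum(axis=1)
           - 0.5 * n * np.log(2 * np.pi))
log_prior = stats.norm.logpdf(draws, 0.0, 1.0)
log_w = f_logpdf - log_lik - log_prior
logmdd_est = -(np.logaddexp.reduce(log_w) - np.log(len(draws)))
```

Because the weighting density nearly matches the posterior, the importance weights are close to constant and the estimate tracks the exact log MDD closely.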
By:  Norman R. Swanson (Rutgers University); Nii Ayi Armah (Bank of Canada) 
Abstract:  In economics, common factors are often assumed to underlie the comovements of a set of macroeconomic variables. For this reason, many authors have used estimated factors in the construction of prediction models. In this paper, we begin by surveying the extant literature on diffusion indexes. We then outline a number of approaches to the selection of factor proxies (observed variables that proxy unobserved estimated factors) using the statistics developed in Bai and Ng (2006a,b). Our approach to factor proxy selection is examined via a small Monte Carlo experiment, where evidence supporting our proposed methodology is presented, and via a large set of prediction experiments using the panel dataset of Stock and Watson (2005). One of our main empirical findings is that our “smoothed” approaches to factor proxy selection appear to yield predictions that are often superior not only to a benchmark factor model, but also to simple linear time series models, which are generally difficult to beat in forecasting competitions. In some sense, by using our approach to predictive factor proxy selection, one is able to open up the “black box” often associated with factor analysis and to identify actual variables that can serve as primitive building blocks for (prediction) models of a host of macroeconomic variables, and that can also serve as policy instruments, for example. Our findings suggest that important observable variables include: various S&P500 variables, including stock price indices and dividend series; a 1-year Treasury bond rate; various housing activity variables; industrial production; and exchange rates. 
Keywords:  diffusion index, factor, forecast, macroeconometrics, parameter estimation error, proxy 
JEL:  C22 
Date:  2011–05–14 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201105&r=ets 
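The core idea of a factor proxy, an observed series standing in for an estimated latent factor, can be sketched in a few lines. The example below uses a plain first principal component and a simple correlation criterion; it illustrates the concept only, not the Bai and Ng (2006a,b) statistics used in the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
T, N = 200, 30
f = rng.normal(size=T)    # latent common factor
lam = rng.normal(size=N)  # loadings
X = np.outer(f, lam) + 0.3 * rng.normal(size=(T, N))  # observed panel

# Principal-components factor estimate: first PC of the standardized panel
Z = (X - X.mean(axis=0)) / X.std(axis=0)
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
f_hat = Z @ Vt[0]

# Proxy selection: observed series most correlated with the estimated factor
corr = np.array([abs(np.corrcoef(f_hat, X[:, i])[0, 1]) for i in range(N)])
proxy = int(np.argmax(corr))
```

With a strong common component, the first principal component tracks the latent factor closely, and the selected series is the observable variable that best mimics it.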
By:  Norman R. Swanson (Rutgers University); Valentina Corradi (University of Warwick); Walter Distaso (Queen Mary) 
Abstract:  In recent years, numerous volatility-based derivative products have been engineered. This has led to interest in constructing conditional predictive densities and confidence intervals for integrated volatility. In this paper, we propose nonparametric kernel estimators of the aforementioned quantities. The kernel functions used in our analysis are based on different realized volatility measures, which are constructed using the ex post variation of asset prices. A set of sufficient conditions under which the estimators are asymptotically equivalent to their unfeasible counterparts, based on the unobservable volatility process, is provided. Asymptotic normality is also established. The efficacy of the estimators is examined via Monte Carlo experimentation, and an empirical illustration based upon data from the New York Stock Exchange is provided. 
Keywords:  Diffusions, integrated volatility, realized volatility measures, kernels, microstructure noise 
JEL:  C22 C53 C14 
Date:  2011–05–14 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201108&r=ets 
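To fix ideas, here is a minimal Python sketch of (i) a realized variance measure built from intraday returns and (ii) a Nadaraya-Watson kernel estimate of the conditional density of tomorrow's realized variance given today's. This illustrates the generic kernel idea only; the paper's estimators, conditions, and microstructure-noise corrections are considerably more refined:

```python
import numpy as np

rng = np.random.default_rng(3)
days, m = 500, 78  # trading days, intraday returns per day

# Persistent daily (integrated) variance via a log-AR(1)
logv = np.full(days, np.log(1e-4))
for t in range(1, days):
    logv[t] = 0.95 * logv[t - 1] + 0.05 * np.log(1e-4) + 0.2 * rng.normal()
iv = np.exp(logv)

# Intraday returns and daily realized variance
r = rng.normal(size=(days, m)) * np.sqrt(iv[:, None] / m)
rv = (r ** 2).sum(axis=1)

def cond_density(y, x, rv, h):
    # Nadaraya-Watson estimate of the density of RV_{t+1} given RV_t = x,
    # with Gaussian kernels and a common bandwidth h
    kx = np.exp(-0.5 * ((x - rv[:-1]) / h) ** 2)
    ky = np.exp(-0.5 * ((y - rv[1:]) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return float((kx * ky).sum() / kx.sum())
```

By construction the estimate is a weighted mixture of normal kernels centered at observed next-day realized variances, so it integrates to one in y.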
By:  Norman R. Swanson (Rutgers University); Lili Cai (Shanghai Jiao Tong University) 
Abstract:  We review and construct consistent in-sample specification and out-of-sample model selection tests on conditional distributions and predictive densities associated with continuous multifactor (possibly with jumps) and (non)linear discrete models of the short term interest rate. The results of our empirical analysis are used to carry out a “horse race” comparing discrete and continuous models across multiple sample periods, forecast horizons, and evaluation intervals. Our evaluation involves comparing models during two distinct historical periods, as well as across our entire weekly sample of Eurodollar deposit rates from 1982 to 2008. Interestingly, when our entire sample of data is used to estimate competing models, the “best” performer in terms of distributional “fit” as well as predictive density accuracy, both in-sample and out-of-sample, is the three-factor Chen (CHEN: 1996) model examined by Andersen, Benzoni and Lund (2004). Just as interestingly, a logistic-type discrete smooth transition autoregression (STAR) model is preferred to the “best” continuous model (i.e., the one-factor Cox, Ingersoll, and Ross (CIR: 1985) model) when comparing predictive accuracy for the “Stable 1990s” period that we examine. An analogous result holds for the “Post 1990s” period that we examine, where the STAR model is preferred to a two-factor stochastic mean model. Thus, when the STAR model is parameterized using only data corresponding to a particular subsample, it outperforms the “best” continuous alternative during that period. However, when models are estimated using the entire dataset, the continuous CHEN model is preferred, regardless of the variety of model specification (selection) test that is carried out. Given that it is very difficult to ascertain the particular future regime that will ensue when constructing ex ante predictions, the CHEN model is our overall “winning” model, regardless of sample period. 
Keywords:  interest rate, multifactor diffusion process, specification test, out-of-sample forecasts, block bootstrap 
JEL:  C1 C5 G0 
Date:  2011–05–13 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201102&r=ets 
By:  Norman R. Swanson (Rutgers University); Valentina Corradi (University of Warwick) 
Abstract:  This paper develops tests for comparing the accuracy of predictive densities derived from (possibly misspecified) diffusion models. In particular, we first outline a simple simulation-based framework for constructing predictive densities for one-factor and stochastic volatility models. Then, we construct accuracy assessment tests that are in the spirit of Diebold and Mariano (1995) and White (2000). In order to establish the asymptotic properties of our tests, we also develop a recursive variant of the nonparametric simulated maximum likelihood estimator of Fermanian and Salanié (2004). In an empirical illustration, the predictive densities from several models of the one-month federal funds rate are compared. 
Keywords:  block bootstrap, diffusion processes, jumps, nonparametric simulated quasi maximum likelihood, parameter estimation error 
JEL:  C22 C51 
Date:  2011–05–15 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:201112&r=ets 
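The block bootstrap listed in the keywords is easy to sketch. A minimal moving-block implementation (Python, illustrative only, not the paper's exact scheme):

```python
import numpy as np

def moving_block_bootstrap(x, block_len, n_boot, seed=0):
    # Resample overlapping blocks with replacement and concatenate,
    # preserving short-range dependence within each block.
    rng = np.random.default_rng(seed)
    x = np.asarray(x)
    n = len(x)
    blocks = np.lib.stride_tricks.sliding_window_view(x, block_len)
    n_blocks = -(-n // block_len)  # ceiling division
    out = np.empty((n_boot, n_blocks * block_len))
    for b in range(n_boot):
        idx = rng.integers(0, len(blocks), size=n_blocks)
        out[b] = blocks[idx].ravel()
    return out[:, :n]

# bootstrap distribution of the mean of an AR(1) series
rng = np.random.default_rng(4)
e = rng.normal(size=200)
x = np.empty(200)
x[0] = e[0]
for t in range(1, 200):
    x[t] = 0.5 * x[t - 1] + e[t]
samples = moving_block_bootstrap(x, block_len=10, n_boot=500)
boot_means = samples.mean(axis=1)
```

Resampling whole blocks rather than single observations keeps the serial dependence of the original series inside each block, which is what makes the bootstrap valid for time series statistics.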
By:  Søren Johansen (Department of Economics, University of Copenhagen and CREATES, School of Economics and Management, Aarhus University); Bent Nielsen (Department of Economics, University of Oxford) 
Abstract:  Iterated one-step Huber-skip M-estimators are considered for regression problems. Each one-step estimator is a reweighted least squares estimator with zero/one weights determined by the initial estimator and the data. The asymptotic theory for the iteration of such estimators is given using a tightness argument. The results apply to stationary as well as nonstationary regression problems. 
Keywords:  Huber-skip; iteration; one-step M-estimators; unit roots 
JEL:  C32 
Date:  2011–11–16 
URL:  http://d.repec.org/n?u=RePEc:kud:kuiedp:1129&r=ets 
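The zero/one reweighting scheme described in the abstract can be sketched as follows (Python). The cutoff and the MAD-based scale below are illustrative choices of mine, not the authors' specification:

```python
import numpy as np

def iterated_huber_skip(X, y, c=2.5, n_iter=20):
    # Each step: standardize residuals from the previous fit, drop
    # (weight zero) observations beyond the cutoff c, refit by OLS
    # on the retained (weight one) observations.
    beta = np.linalg.lstsq(X, y, rcond=None)[0]  # initial full-sample OLS
    for _ in range(n_iter):
        res = y - X @ beta
        scale = np.median(np.abs(res)) / 0.6745  # MAD scale (illustrative)
        keep = np.abs(res) <= c * scale
        beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]
    return beta

rng = np.random.default_rng(5)
n = 300
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
y[:20] += 15.0  # gross outliers
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
beta_skip = iterated_huber_skip(X, y)
```

The outliers pull the plain OLS intercept up by roughly one unit, while the iterated Huber-skip fit discards them after the first pass and recovers the true coefficients.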
By:  Markus Bibinger; Markus Reiß 
Abstract:  We propose localized spectral estimators for the quadratic covariation and the spot covolatility of diffusion processes which are observed discretely with additive observation noise. The suitability of this approach for estimating time-varying volatilities stems from an asymptotic equivalence of the underlying statistical model to a white noise model in which the correlation and volatility processes are constant over small intervals. The asymptotic equivalence of the continuous-time and discrete-time experiments is proved by a construction with linear interpolation in one direction and local means in the other. The new estimator outperforms earlier nonparametric approaches in the considered model. We investigate its finite sample characteristics in simulations and draw a comparison between the various proposed methods. 
Keywords:  asymptotic equivalence, covariation, integrated covolatility, microstructure noise, spectral adaptive estimation 
JEL:  C14 C32 C58 G10 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2011085&r=ets 
By:  Ola Løvsletten; Martin Rypdal 
Abstract:  We present an approximated maximum likelihood method for the multifractal random walk processes of [E. Bacry et al., Phys. Rev. E 64, 026103 (2001)]. The likelihood is computed using a Laplace approximation and a truncation in the dependency structure for the latent volatility. The procedure is implemented as a package in the R computer language. Its performance is tested on synthetic data and compared to an inference approach based on the generalized method of moments. The method is applied to estimate parameters for various financial stock indices. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1112.0105&r=ets 
By:  Vladimir Soloviev; Vladimir Saptsin; Dmitry Chabanenko 
Abstract:  In this research the technology of complex Markov chains is applied to predict financial time series. The main distinction between complex, or high-order, Markov chains and simple first-order ones is the presence of an aftereffect, or memory. The technique combines a hierarchy of time-discretization intervals with a splicing procedure that merges the prediction results obtained at the different frequency levels into a single output time series. The hierarchy of time discretizations makes it possible to exploit fractal properties of the given time series and to make predictions at the different frequencies of the series. Prediction results for the world's stock market indices are presented. 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1111.5254&r=ets 
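The basic ingredient, a high-order (memory-carrying) Markov chain fitted by frequency counts on a discretized series, can be sketched in a few lines of Python. The multi-scale hierarchy and the splicing of predictions across frequency levels are beyond this illustration:

```python
from collections import Counter, defaultdict

def fit_markov(symbols, order):
    # Count next-symbol frequencies for each length-`order` context;
    # order > 1 is what gives the chain its memory (aftereffect).
    counts = defaultdict(Counter)
    for t in range(order, len(symbols)):
        counts[tuple(symbols[t - order:t])][symbols[t]] += 1
    return counts

def predict_next(counts, context):
    # Most frequent continuation of the context; None if context unseen.
    c = counts.get(tuple(context))
    return c.most_common(1)[0][0] if c else None

# illustrative: returns discretized to 'u' (up) / 'd' (down)
series = list("ududududududududud")
model = fit_markov(series, order=2)
nxt = predict_next(model, series[-2:])
```

On this alternating toy series, every occurrence of the context ('u', 'd') is followed by 'u', so the fitted chain predicts 'u' next.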
By:  ChangShuai Li 
Abstract:  This paper demonstrates flaws in the co-persistence theory proposed by Bollerslev and Engle (1993) which make the theory hard to apply. By introducing the half-life of the decay coefficient as the measure of persistence, together with weak definitions of both persistence and co-persistence in variance, this study attempts to solve these problems by using an exhaustive search algorithm to obtain the co-persistent vector. The method is illustrated by studying the co-persistence of stock return volatility in 10 European countries. 
Date:  2011–12 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1112.1363&r=ets 
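The half-life measure and the exhaustive search can be sketched as follows (Python). The simulated data, the AR(1)-of-squares persistence proxy, and the weight grid are all illustrative assumptions of mine, not the paper's specification:

```python
import numpy as np

def ar1_coef(x):
    x = x - x.mean()
    return (x[:-1] @ x[1:]) / (x[:-1] @ x[:-1])

def half_life(x):
    # Half-life of volatility shocks implied by the AR(1)
    # coefficient of the squared series (infinite if none).
    rho = ar1_coef(x ** 2)
    return np.inf if rho <= 0 or rho >= 1 else np.log(0.5) / np.log(rho)

def copersistence_weight(r1, r2, grid=np.linspace(-2, 2, 201)):
    # Exhaustive search: the combination r1 - w*r2 whose squared
    # series is least persistent.
    hl = [half_life(r1 - w * r2) for w in grid]
    return grid[int(np.argmin(hl))]

# Two series sharing one persistent volatility factor (loadings 1 and 2),
# so r1 - 0.5*r2 removes the common persistent volatility.
rng = np.random.default_rng(6)
T = 4000
h = np.zeros(T)
for t in range(1, T):
    h[t] = 0.97 * h[t - 1] + 0.1 * rng.normal()
f = np.exp(h) * rng.normal(size=T)
r1 = f + 0.05 * rng.normal(size=T)
r2 = 2.0 * f + 0.05 * rng.normal(size=T)
w_hat = copersistence_weight(r1, r2)
```

Because the persistent volatility enters both series through a single factor, the search drives the weight toward the loading ratio 0.5, where the combined series is essentially noise with no volatility persistence.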