nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008‒07‒30
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. Continuous-Time Models, Realized Volatilities, and Testable Distributional Implications for Daily Stock Returns By Torben G. Andersen; Tim Bollerslev; Per Frederiksen; Morten Ørregaard Nielsen
  2. Likelihood inference for a nonstationary fractional autoregressive model By Søren Johansen; Morten Ørregaard Nielsen
  3. A Powerful Test of the Autoregressive Unit Root Hypothesis Based on a Tuning Parameter Free Statistic By Morten Ørregaard Nielsen
  4. Nonparametric Cointegration Analysis of Fractional Systems With Unknown Integration Orders By Morten Ørregaard Nielsen
  5. Fully Modified Narrow-Band Least Squares Estimation of Stationary Fractional Cointegration By Morten Ørregaard Nielsen; Per Frederiksen
  6. The ‘Pre-Eminence of Theory’ versus the ‘General-to-Specific’ Cointegrated VAR Perspectives in Macro-Econometric Modeling By Spanos, Aris
  7. The Empirical Properties of Some Popular Estimators of Long Memory Processes By Jennifer Brown; Les Oxley; William Rea; Marco Reale
  8. Nelson-Plosser revisited: the ACF approach By Karim M. Abadir; Gabriel Talmain; Giovanni Caggiano
  9. The Finite-Sample Effects of VAR Dimensions on OLS Bias, OLS Variance, and Minimum MSE Estimators By Steve Lawford; Michalis P. Stamatogiannis
  10. Optimal Linear Filtering, Smoothing and Trend Extraction for Processes with Unit Roots and Cointegration By Dimitrios D. Thomakos
  11. Recurrent Support Vector Regression for a Nonlinear ARMA Model with Applications to Forecasting Financial Returns By Shiyi Chen; Kiho Jeong; Wolfgang K. Härdle
  12. Do we need time series econometrics? (Revised) By Rao, B. Bhaskara; Singh, Rup; Kumar, Saten

  1. By: Torben G. Andersen (Northwestern University, NBER, and CREATES); Tim Bollerslev (Duke University, NBER, and CREATES); Per Frederiksen (Nordea Markets); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: We provide an empirical framework for assessing the distributional properties of daily speculative returns within the context of the continuous-time jump diffusion models traditionally used in asset pricing finance. Our approach builds directly on recently developed realized variation measures and non-parametric jump detection statistics constructed from high-frequency intraday data. A sequence of simple-to-implement moment-based tests involving various transformations of the daily returns speak directly to the importance of different distributional features, and may serve as useful diagnostic tools in the specification of empirically more realistic continuous-time asset pricing models. On applying the tests to the thirty individual stocks in the Dow Jones Industrial Average index, we find that it is important to allow for time-varying diffusive volatility, jumps, and leverage effects to satisfactorily describe the daily stock price dynamics.
    Keywords: return distributions, continuous-time models, mixture-of-distributions hypothesis, financial-time sampling, high-frequency data, volatility signature plots, realized volatilities, jumps, leverage and volatility feedback effects
    JEL: C1 G1
    Date: 2008–07
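A minimal sketch of the realized-variance and bipower-variation measures underlying such jump-detection statistics (not the authors' exact tests; the simulation design below is illustrative): bipower variation is robust to jumps, so the gap between the two measures isolates the jump component.

```python
import math, random

random.seed(0)

# Simulate one day of M intraday returns: diffusive noise plus one price jump.
M = 390
sigma = 0.01
returns = [random.gauss(0.0, sigma / math.sqrt(M)) for _ in range(M)]
returns[200] += 0.02  # add a jump

# Realized variance: sum of squared intraday returns (total variation).
rv = sum(r * r for r in returns)

# Bipower variation: scaled sum of products of adjacent absolute returns,
# which is (asymptotically) unaffected by jumps.
mu1 = math.sqrt(2.0 / math.pi)
bv = mu1 ** -2 * sum(abs(returns[i]) * abs(returns[i - 1]) for i in range(1, M))

# RV includes the squared jump, BV does not; the difference signals a jump.
jump_part = rv - bv
print(rv, bv, jump_part)
```

A day with no jump would leave `jump_part` near zero; the formal tests in the paper studentize this difference.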
  2. By: Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: This paper discusses model-based inference in an autoregressive model for fractional processes based on the Gaussian likelihood. The model has a verifiable criterion in terms of the roots of a polynomial for the process to be fractional of order d or d-b. Fractional differencing involves infinitely many past values and because we are interested in nonstationary processes we model the data X_{1},...,X_{T} given the initial values X_{-n}, n=0,1,..., as is usually done. The initial values are not modeled but assumed to be bounded. This represents a considerable generalization relative to all previous work where it is assumed that they are all zero. We consider the Gaussian likelihood and its derivatives as stochastic processes in the parameters which include d and b, and prove that they converge in distribution when the errors are i.i.d. with suitable moment conditions. We use this to prove existence and consistency of the maximum likelihood estimator, and to find the asymptotic distribution of the estimators and the likelihood ratio test of the associated fractional unit root hypothesis, which contains the fractional Brownian motion of type II.
    Keywords: Dickey-Fuller test, fractional unit root, likelihood inference
    JEL: C22
    Date: 2008–07
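A minimal sketch of the fractional differencing operator (1-L)^d that the model is built on, using the binomial-expansion recursion for its coefficients. Truncating at the start of the sample corresponds to treating pre-sample values as zero (the special case the paper generalizes to bounded initial values); the function names are mine.

```python
# Coefficients of (1-L)^d: pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j.
def frac_diff_weights(d, n):
    w = [1.0]
    for j in range(1, n):
        w.append(w[-1] * (j - 1 - d) / j)
    return w

def frac_diff(x, d):
    """Fractional difference treating pre-sample values as zero (type II)."""
    w = frac_diff_weights(d, len(x))
    return [sum(w[j] * x[t - j] for j in range(t + 1)) for t in range(len(x))]

x = [1.0, 2.0, 3.0, 4.0]
print(frac_diff(x, 1.0))  # d=1 reduces to the ordinary first difference
print(frac_diff(x, 0.0))  # d=0 leaves the series unchanged
```

Note that each filtered value involves all available past observations, which is why the treatment of initial values matters for inference.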
  3. By: Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: This paper presents a family of simple nonparametric unit root tests indexed by one parameter, d, and containing Breitung's (2002) test as the special case d=1. It is shown that (i) each member of the family with d>0 is consistent, (ii) the asymptotic distribution depends on d, and thus reflects the parameter chosen to implement the test, and (iii) since the asymptotic distribution depends on d and the test remains consistent for all d>0, it is possible to analyze the power of the test for different values of d. The usual Phillips-Perron and Dickey-Fuller type tests are indexed by bandwidth, lag length, etc., but have none of these three properties. It is shown that members of the family with d<1 have higher asymptotic local power than the Breitung (2002) test, and when d is small the asymptotic local power of the proposed nonparametric test is relatively close to the parametric power envelope, particularly in the case with a linear time-trend. Furthermore, GLS detrending is shown to improve power when d is small, which is not the case for Breitung's (2002) test. Simulations demonstrate that when applying a sieve bootstrap procedure, the proposed variance ratio test has very good size properties, with finite sample power that is higher than that of Breitung's (2002) test and even rivals the (nearly) optimal parametric GLS detrended augmented Dickey-Fuller test with lag length chosen by an information criterion.
    Keywords: augmented Dickey-Fuller test, fractional integration, GLS detrending, nonparametric, nuisance parameter, tuning parameter, power envelope, unit root test, variance ratio
    JEL: C22
    Date: 2008–07
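A minimal sketch of the d=1 (Breitung) member of this variance-ratio family, which needs no bandwidth or lag-length choice: the statistic compares the scaled sum of squared partial sums to the sum of squares of the series itself. The simulation comparison below is mine, not the paper's.

```python
import random

random.seed(1)

def breitung_vr(y):
    """Breitung's (2002) tuning-parameter-free variance ratio (the d=1 member
    of the family): small values are evidence against a unit root."""
    T = len(y)
    s, partial = 0.0, []
    for v in y:
        s += v
        partial.append(s)
    return (sum(p * p for p in partial) / T ** 2) / sum(v * v for v in y)

# Compare the statistic for a random walk (unit root) and white noise.
T = 500
noise = [random.gauss(0, 1) for _ in range(T)]
walk, s = [], 0.0
for e in noise:
    s += e
    walk.append(s)

vr_walk = breitung_vr(walk)
vr_noise = breitung_vr(noise)
print(vr_walk, vr_noise)  # far larger under the unit root
```

Nielsen's family replaces the simple partial sum with a fractional integral of order d, which is what lets the power be tuned through d.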
  4. By: Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: In this paper a nonparametric variance ratio testing approach is proposed for determining the number of cointegrating relations in fractionally integrated systems. The test statistic is easily calculated without prior knowledge of the integration order of the data or of the strength of the cointegrating relations. Since the test is nonparametric, it does not require the specification of a particular model and is invariant to short-run dynamics. Nor does it require the choice of any lag length or bandwidth parameters, which change the test statistic without being reflected in the asymptotic distribution. Furthermore, a consistent estimate of the cointegration space can be obtained as part of the procedure. The asymptotic distribution theory for the proposed test is non-standard but easily tabulated. Monte Carlo simulations demonstrate excellent finite sample properties, even rivaling those of well-specified parametric tests. The proposed methodology is applied to the term structure of interest rates, and contrary to (fractional and integer-based) parametric approaches, evidence in favor of the expectations hypothesis is found using the nonparametric approach.
    Keywords: cointegration rank, cointegration space, fractional integration and cointegration, interest rates, long memory, nonparametric, term structure, variance ratio
    JEL: C32
    Date: 2008–07
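A hedged illustration of the idea behind variance-ratio based cointegration testing (the paper's multivariate rank statistic is more involved; the term-structure-style simulation and names below are mine): a cointegrating combination of the levels behaves like a stationary series, so its variance ratio collapses relative to that of the levels.

```python
import random

random.seed(2)

def variance_ratio(y):
    """Scaled sum of squared partial sums over sum of squares:
    O(1) for a unit-root series, near zero for a stationary one."""
    T = len(y)
    s, partial = 0.0, []
    for v in y:
        s += v
        partial.append(s)
    return (sum(p * p for p in partial) / T ** 2) / sum(v * v for v in y)

# Two cointegrated "interest rates": a common stochastic trend plus
# stationary idiosyncratic noise, so the spread is stationary.
T = 500
trend, short, long_ = 0.0, [], []
for _ in range(T):
    trend += random.gauss(0, 1)
    short.append(trend + random.gauss(0, 0.5))
    long_.append(trend + random.gauss(0, 0.5))

spread = [l - s for l, s in zip(long_, short)]
vr_level = variance_ratio(short)
vr_spread = variance_ratio(spread)
print(vr_level, vr_spread)  # the spread's ratio is far smaller
```

In the paper this comparison is done without knowing the integration orders, and the directions that shrink the ratio estimate the cointegration space.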
  5. By: Morten Ørregaard Nielsen (Queen's University and CREATES); Per Frederiksen (Nordea Markets)
    Abstract: We consider estimation of the cointegrating relation in the stationary fractional cointegration model. This model has found important application recently, especially in financial economics. Previous research has considered a semiparametric narrow-band least squares (NBLS) estimator in the frequency domain, often under a condition of non-coherence between regressors and errors at the zero frequency. We show that in the absence of this condition, the NBLS estimator is asymptotically biased, and also that the bias can be consistently estimated. Consequently, we introduce a fully modified NBLS estimator which eliminates the bias while still having the same asymptotic variance as the NBLS estimator. We also show that local Whittle estimation of the integration order of the errors can be conducted consistently on the residuals from the NBLS regression, although the estimator has the same asymptotic distribution as if the errors were observed only under the condition of non-coherence. Furthermore, compared to much previous research, the development of the asymptotic distribution theory is based on a different spectral density representation, which is relevant for multivariate fractionally integrated processes, and the use of this representation is shown to reduce both the asymptotic bias and variance of the narrow-band estimators. We also present simulation evidence and a series of empirical illustrations to demonstrate the feasibility and empirical relevance of our proposed methodology.
    Keywords: Fractional cointegration, frequency domain, fully modified estimation, long memory, semiparametric
    JEL: C22
    Date: 2008–07
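A minimal sketch of plain narrow-band least squares (the unmodified estimator that the paper corrects; the simulation design and function names are mine): the regression coefficient is computed from periodogram ordinates at only the first few Fourier frequencies, where the long-run relation dominates.

```python
import cmath, math, random

random.seed(3)

def dft_coef(x, j):
    """Discrete Fourier transform of x at frequency 2*pi*j/n."""
    n = len(x)
    return sum(x[t] * cmath.exp(-2j * math.pi * j * t / n) for t in range(n))

def nbls(y, x, m):
    """Narrow-band least squares: regress y on x using only the first m
    Fourier frequencies (j >= 1, so the sample mean drops out)."""
    num = sum((dft_coef(x, j).conjugate() * dft_coef(y, j)).real
              for j in range(1, m + 1))
    den = sum(abs(dft_coef(x, j)) ** 2 for j in range(1, m + 1))
    return num / den

# Simulated example: y = 2*x + stationary error, with x a random walk.
n, beta = 256, 2.0
x, s = [], 0.0
for _ in range(n):
    s += random.gauss(0, 1)
    x.append(s)
y = [beta * xi + random.gauss(0, 1) for xi in x]

beta_hat = nbls(y, x, m=8)
print(beta_hat)  # close to the true value 2
```

When regressors and errors are coherent at frequency zero this estimator is biased; the paper's fully modified version subtracts a consistent estimate of that bias.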
  6. By: Spanos, Aris
    Abstract: The primary aim of the paper is to place current methodological discussions on empirical modeling contrasting the ‘theory first’ versus the ‘data first’ perspectives in the context of a broader methodological framework with a view to constructively appraise them. In particular, the paper focuses on Colander’s argument in his paper “Economists, Incentives, Judgement and Empirical Work” relating to the two different perspectives in Europe and the US that are currently dominating empirical macro-econometric modeling, and delves deeper into their methodological/philosophical foundations. It is argued that the key to establishing a constructive dialogue between them is provided by a better understanding of the role of data in modern statistical inference, and how that relates to the centuries-old issue of the realisticness of economic theories.
    Keywords: Econometric methodology, ‘general-to-specific’, pre-eminence of theory, VAR, statistical adequacy, realisticness of theory, statistical model
    JEL: B4 C1 C3
    Date: 2008
  7. By: Jennifer Brown; Les Oxley (University of Canterbury); William Rea; Marco Reale
    Abstract: We present the results of a simulation study into the properties of 12 different estimators of the Hurst parameter, H, or the fractional integration parameter, d, in long memory time series. We compare and contrast their performance on simulated Fractional Gaussian Noises and fractionally integrated series with lengths between 100 and 10,000 data points and H values between 0.55 and 0.90 or d values between 0.05 and 0.40. We apply all 12 estimators to the Campito Mountain data and assess the accuracy of their estimates using the Beran goodness of fit test for long memory time series.
    Keywords: Strong dependence; global dependence; long range dependence; Hurst parameter estimators
    JEL: C13 C22
    Date: 2008–06–26
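One simple member of this class of estimators is the aggregated-variance method, sketched below (the implementation and block sizes are mine): for a self-similar series the variance of block means scales as m**(2H-2), so H is recovered from a log-log regression slope.

```python
import math, random

random.seed(4)

def aggregated_variance_hurst(x, block_sizes):
    """Aggregated-variance estimator of the Hurst parameter H:
    var(block mean) ~ m**(2H - 2), so H = 1 + slope/2 in log-log coordinates."""
    pts = []
    for m in block_sizes:
        k = len(x) // m
        means = [sum(x[i * m:(i + 1) * m]) / m for i in range(k)]
        grand = sum(means) / k
        var = sum((mu - grand) ** 2 for mu in means) / (k - 1)
        pts.append((math.log(m), math.log(var)))
    # OLS slope of log-variance on log-block-size.
    n = len(pts)
    mx = sum(p[0] for p in pts) / n
    my = sum(p[1] for p in pts) / n
    slope = (sum((p[0] - mx) * (p[1] - my) for p in pts)
             / sum((p[0] - mx) ** 2 for p in pts))
    return 1.0 + slope / 2.0

# White noise has H = 0.5 (no long memory); the estimate should be near that.
x = [random.gauss(0, 1) for _ in range(10000)]
h = aggregated_variance_hurst(x, [10, 20, 50, 100])
print(h)
```

The study compares twelve such estimators, which differ mainly in which scaling property of long memory they exploit (variances, rescaled ranges, periodogram slopes, etc.).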
  8. By: Karim M. Abadir (Imperial College London, London, UK and The Rimini Centre for Economic Analysis, Italy); Gabriel Talmain (University of Glasgow, Glasgow, UK); Giovanni Caggiano (University of Padua, Italy)
    Abstract: We detect a new stylized fact about the common dynamics of macroeconomic and financial aggregates. The rate of decay of the memory of these series is depicted by their Auto-Correlation Functions (ACFs). They all share a common four-parameter functional form that we derive from the dynamics of an RBC model with heterogeneous firms. We find that, not only does our formula fit the data better than the ACFs that arise from autoregressive models, but it also yields the correct shape of the ACF. This can help policymakers understand better the lags with which an economy evolves, and the onset of its turning points.
    JEL: E32 E52 E63
    Date: 2008–01
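The object of study here is the sample ACF; a minimal sketch of its computation follows (the authors' four-parameter functional form is not reproduced, and the example data are mine). An AR(1) fit implies a geometrically decaying ACF rho**k, which empirical macro ACFs typically sit well above at long lags.

```python
def sample_acf(x, max_lag):
    """Sample autocorrelation function at lags 1..max_lag."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    acf = []
    for k in range(1, max_lag + 1):
        ck = sum((x[t] - mean) * (x[t + k] - mean) for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

# A short upward-trending illustrative series: positive low-order autocorrelation.
x = [1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5, 5.0]
acf = sample_acf(x, 3)
print(acf)
```

Comparing such empirical ACFs against the shapes implied by fitted models is the basis of the paper's goodness-of-fit argument.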
  9. By: Steve Lawford (ENAC, France, University of Nottingham, UK, Philips College, Cyprus and The Rimini Centre for Economic Analysis, Italy); Michalis P. Stamatogiannis
    Abstract: Vector autoregressions (VARs) are important tools in time series analysis. However, relatively little is known about the finite-sample behaviour of parameter estimators. We address this issue by investigating ordinary least squares (OLS) estimators given a data generating process that is a purely nonstationary first-order VAR. Specifically, we use Monte Carlo simulation and numerical optimization to derive response surfaces for OLS bias and variance, in terms of VAR dimensions, given correct specification and several types of over-parameterization of the model: we include a constant, and a constant and trend, and introduce excess lags. We then examine the correction factors that are required for the least squares estimator to attain minimum mean squared error (MSE). Our results improve and extend one of the main finite-sample multivariate analytical bias results of Abadir, Hadri and Tzavalis (Econometrica 67 (1999) 163), generalize the univariate variance and MSE findings of Abadir (Economics Letters 47 (1995) 263) to the multivariate setting, and complement various asymptotic studies.
    Keywords: Finite-sample bias, Monte Carlo simulation, nonstationary time series, response surfaces, vector autoregression.
    JEL: C15 C22 C32
    Date: 2008–01
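The univariate corner of this experiment is easy to reproduce: a Monte Carlo sketch (my own design, not the paper's response-surface setup) of the well-known downward finite-sample bias of OLS when the data generating process is a pure random walk with true coefficient 1.

```python
import random

random.seed(5)

def ols_ar1(y):
    """OLS slope in the regression of y_t on y_{t-1}, no constant."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den

# Average the OLS estimate over many simulated random walks of length T.
T, reps, total = 50, 2000, 0.0
for _ in range(reps):
    y, s = [], 0.0
    for _ in range(T):
        s += random.gauss(0, 1)
        y.append(s)
    total += ols_ar1(y)

avg = total / reps
print(avg)  # below 1: the downward finite-sample bias
```

The paper maps how this kind of bias (and the estimator's variance) changes with the VAR dimension and with over-parameterization such as redundant constants, trends, and lags.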
  10. By: Dimitrios D. Thomakos (University of Peloponnese, Greece and The Rimini Centre for Economic Analysis)
    Abstract: In this paper I propose a novel optimal linear filter for smoothing, trend and signal extraction for time series with a unit root. The filter is based on the Singular Spectrum Analysis (SSA) methodology, takes the form of a particular moving average and is different from other linear filters that have been used in the existing literature. To the best of my knowledge this is the first time that moving average smoothing is given an optimality justification for use with unit root processes. The frequency response function of the filter is examined and a new method for selecting the degree of smoothing is suggested. I also show that the filter can be used for successfully extracting a unit root signal from stationary noise. The proposed methodology can be extended to deal with two cointegrated series, and I show how to estimate the cointegrating coefficient using SSA and how to extract the common stochastic trend component. A simulation study explores some of the characteristics of the filter for signal extraction, trend prediction and cointegration estimation for univariate and bivariate series. The practical usefulness of the method is illustrated using data for the US real GDP and two financial time series.
    Keywords: cointegration, forecasting, linear filtering, singular spectrum analysis, smoothing, trend extraction and prediction, unit root.
    Date: 2008–01
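A hedged illustration of the setting, not the paper's SSA-derived filter: the paper's point is that its optimal filter takes the form of a particular moving average, so a plain centered moving average (below, with my own simulation design) already shows how linear filtering extracts a unit-root signal from stationary noise.

```python
import random

random.seed(6)

def centered_ma(x, window):
    """Centered moving-average smoother, a simple linear filter.
    (The paper derives an optimally weighted moving average via SSA.)"""
    h = window // 2
    return [sum(x[t - h:t + h + 1]) / (2 * h + 1)
            for t in range(h, len(x) - h)]

# Random-walk signal observed in stationary noise.
T = 200
signal, obs, s = [], [], 0.0
for _ in range(T):
    s += random.gauss(0, 0.2)
    signal.append(s)
    obs.append(s + random.gauss(0, 1.0))

trend = centered_ma(obs, 21)

# Squared error against the true signal, on the common interior window.
err_raw = sum((o - si) ** 2 for o, si in zip(obs[10:-10], signal[10:-10]))
err_smooth = sum((tr - si) ** 2 for tr, si in zip(trend, signal[10:-10]))
print(err_smooth < err_raw)  # smoothing recovers the signal better
```

Choosing the window (the degree of smoothing) is exactly the tuning problem for which the paper proposes a new selection method.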
  11. By: Shiyi Chen; Kiho Jeong; Wolfgang K. Härdle
    Abstract: Motivated by recurrent neural networks, this paper proposes a recurrent Support Vector Regression (SVR) procedure to forecast simulated data from a nonlinear ARMA model as well as real data on financial returns. The forecasting ability of the recurrent SVR is compared with three competing methods: MLE, recurrent MLP and feedforward SVR. Theoretically, MLE and MLP focus only on in-sample fit, whereas SVR considers both in-sample fit and out-of-sample forecasting, which endows SVR with excellent forecasting ability. This is confirmed by the evidence from the simulated and real data based on two forecasting accuracy evaluation metrics (NMSE and sign). That is, for one-step-ahead forecasting, the recurrent SVR is consistently better than the MLE and the recurrent MLP in forecasting both the magnitude and turning points, and clearly improves forecasting performance relative to the usual feedforward SVR.
    Keywords: Recurrent Support Vector Regression; MLE; recurrent MLP; nonlinear ARMA; financial forecasting
    JEL: C45 F37 F47
    Date: 2008–07
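The two evaluation metrics used in the comparison are straightforward to sketch (my implementation and example data, not the authors' code): normalized mean squared error measures magnitude accuracy, and the sign metric measures turning-point accuracy.

```python
def nmse(actual, forecast):
    """Normalized mean squared error: MSE scaled by the variance of actuals,
    so values below 1 beat a constant-mean forecast."""
    n = len(actual)
    mean = sum(actual) / n
    mse = sum((a - f) ** 2 for a, f in zip(actual, forecast)) / n
    var = sum((a - mean) ** 2 for a in actual) / n
    return mse / var

def sign_rate(actual, forecast):
    """Share of forecasts with the correct sign (direction of returns)."""
    hits = sum(1 for a, f in zip(actual, forecast) if a * f > 0)
    return hits / len(actual)

# Illustrative one-step-ahead returns and forecasts.
actual = [0.5, -0.2, 0.1, -0.4, 0.3]
forecast = [0.4, -0.1, -0.2, -0.3, 0.2]
print(nmse(actual, forecast), sign_rate(actual, forecast))
```

The "recurrent" element of the procedure feeds lagged forecast errors back into the SVR's input vector, mimicking the MA terms of the nonlinear ARMA model.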
  12. By: Rao, B. Bhaskara; Singh, Rup; Kumar, Saten
    Abstract: It is argued that whether or not there is a need for econometric methods based on unit roots and cointegration is a methodological issue. An alternative is the London School of Economics (LSE) and Hendry approach, based on simpler classical methods of estimation and known as the general-to-specific method (GETS). Like all other methodological issues this one is difficult to resolve, but we think that GETS is very useful.
    Keywords: GETS; Cointegration; Box-Jenkins Equations; Hendry; Granger
    JEL: C51 C13 A19 B41
    Date: 2007–11–19

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on the NEP homepage. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.