nep-ets New Economics Papers
on Econometric Time Series
Issue of 2006‒10‒28
eighteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Semiparametric Estimation of Fractional Cointegration By Javier Hualde; Peter M Robinson
  2. Nonparametric Transformation to White Noise By Oliver Linton; Enno Mammen
  3. TESTING FOR STOCHASTIC MONOTONICITY By Sokbae Lee; Oliver Linton; Yoon-Jae Whang
  4. Conditional-Sum-of-Squares Estimation of Models for Stationary Time Series with Long Memory By Peter M Robinson
  5. Semiparametric Estimation of a Characteristic-based Factor Model of Common Stock Returns By Gregory Connor; Oliver Linton
  6. Estimating Quadratic Variation Consistently in the Presence of Correlated Measurement Error By Ilze Kalnina; Oliver Linton
  7. Optimal Instruments in Time Series: A Survey By Stanislav Anatolyev
  8. Nonparametric retrospection and monitoring of predictability of financial returns By Stanislav Anatolyev
  9. A Quasi Maximum Likelihood Approach for Large Approximate Dynamic Factor Models By Doz, Catherine; Giannone, Domenico; Reichlin, Lucrezia
  10. Does Information Help Recovering Structural Shocks from Past Observations? By Giannone, Domenico; Reichlin, Lucrezia
  11. Drift and Breaks in Labour Productivity By Benati, Luca
  12. Forecasting Using a Large Number of Predictors: Is Bayesian Regression a Valid Alternative to Principal Components? By De Mol, Christine; Giannone, Domenico; Reichlin, Lucrezia
  13. MODELLING LONG-MEMORY VOLATILITIES WITH LEVERAGE EFFECT: A-LMSV VERSUS FIEGARCH By Esther Ruiz; Helena Veiga
  14. Adaptive Estimation of Autoregressive Models with Time-Varying Variances By Ke-Li Xu; Peter C.B. Phillips
  15. A Complete Asymptotic Series for the Autocovariance Function of a Long Memory Process By Offer Lieberman; Peter C.B. Phillips
  16. Log Periodogram Regression: The Nonstationary Case By Chang Sik Kim; Peter C.B. Phillips
  17. Skewness Premium with Lévy Processes By José Fajardo; Ernesto Mordecki
  18. An Alternative Trend-Cycle Decomposition using a State Space Model with Mixtures of Normals: Specifications and Applications to International Data By Tatsuma Wada; Pierre Perron

  1. By: Javier Hualde; Peter M Robinson
    Abstract: A semiparametric bivariate fractionally cointegrated system is considered, with integration orders possibly unknown and the I(0) unobservable inputs having nonparametric spectral density. Two kinds of estimate of the cointegrating parameter are considered, one involving inverse spectral weighting and the other unweighted statistics with a spectral estimate at frequency zero. We establish under quite general conditions the asymptotic distributional properties of these estimates, both in the case of "strong cointegration" (when the difference between the integration orders of the observables and the cointegrating errors exceeds 1/2) and in the case of "weak cointegration" (when that difference is less than 1/2), which includes the case of (asymptotically) stationary observables. Across both cases, the same Wald test statistic has the same standard null chi-squared limit distribution, irrespective of whether the integration orders are known or estimated. The regularity conditions include unprimitive ones on the integration orders and spectral density estimates, but we check these under more primitive conditions on particular estimates. Finite-sample properties are examined in a Monte Carlo study. (An illustrative sketch of the model follows this entry.)
    Keywords: Fractional cointegration, semiparametric model, unknown integration orders.
    JEL: C32
    Date: 2006–05
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/502&r=ets
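    Illustrative model sketch (assumed notation, not necessarily the paper's symbols): writing \nu for the cointegrating parameter and \delta, \gamma for the integration orders, the bivariate system described above takes the form

        y_t = \nu\, x_t + e_t, \qquad x_t \sim I(\delta), \quad e_t \sim I(\gamma), \quad \gamma < \delta,

    with "strong cointegration" when \delta - \gamma > 1/2 and "weak cointegration" when \delta - \gamma < 1/2.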
  2. By: Oliver Linton; Enno Mammen
    Abstract: We consider a semiparametric distributed lag model in which the "news impact curve" m is nonparametric but the response is dynamic through some linear filters. A special case of this is a nonparametric regression with serially correlated errors. We propose an estimator of the news impact curve based on a dynamic transformation that produces white noise errors. This yields an estimating equation for m that is a type-two linear integral equation. We investigate both the stationary case and the case where the error has a unit root. In the stationary case we establish pointwise asymptotic normality. In the special case of a nonparametric regression subject to time series errors, our estimator achieves efficiency improvements over the usual estimators; see Xiao, Linton, Carroll, and Mammen (2003). In the unit root case our procedure is consistent and asymptotically normal, unlike the standard regression smoother. We also present the distribution theory for the parameter estimates, which is non-standard in the unit root case, and investigate finite sample performance through simulation experiments. (A sketch of this type of integral equation follows this entry.)
    Keywords: Efficiency, Inverse Problem, Kernel Estimation, Nonparametric regression, Time Series, Unit Roots.
    JEL: C14
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/503&r=ets
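    Illustrative sketch: the estimating equation described above is a linear integral equation of the second kind. The code below is not the authors' estimator; it only shows, with a made-up kernel K and intercept g, how an equation of the form m(x) = g(x) + integral K(x,y) m(y) dy can be solved numerically by discretization.

      # Solve a type-two linear integral equation m = g + \int K m dy on a grid
      # via collocation: (I - K W) m = g, with W the quadrature weights.
      # K and g here are hypothetical; in the paper they are built from data.
      import numpy as np

      def solve_fredholm2(K, g, grid):
          w = np.gradient(grid)              # quadrature weights on the grid
          A = np.eye(len(grid)) - K * w      # column j scaled by weight w[j]
          return np.linalg.solve(A, g)

      x = np.linspace(0.0, 1.0, 200)
      K = 0.3 * np.exp(-np.abs(x[:, None] - x[None, :]))  # hypothetical kernel
      g = np.sin(2 * np.pi * x)                            # hypothetical intercept
      m_hat = solve_fredholm2(K, g, x)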
  3. By: Sokbae Lee; Oliver Linton; Yoon-Jae Whang
    Abstract: We propose a test of the hypothesis of stochastic monotonicity. This hypothesis is of interest in many applications. Our test is based on the supremum of a rescaled U-statistic. We show that its asymptotic distribution is Gumbel. The proof is difficult because the approximating Gaussian stochastic process contains both a stationary and a nonstationary part, and so we have to extend existing results that only apply to either one or the other case.
    Keywords: Distribution function, Extreme Value Theory, Gaussian Process, Monotonicity.
    JEL: C14 C15
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/504&r=ets
  4. By: Peter M Robinson
    Abstract: Employing recent results of Robinson (2005), we consider the asymptotic properties of conditional-sum-of-squares (CSS) estimates of parametric models for stationary time series with long memory. CSS estimation has been considered as a rival to Gaussian maximum likelihood and Whittle estimation of time series models. The latter kinds of estimate have been rigorously shown to be asymptotically normally distributed in the case of long memory. However, CSS estimates, which should have the same asymptotic distributional properties under similar conditions, have not received comparable treatment: the truncation of the infinite autoregressive representation inherent in CSS estimation has been essentially ignored in proofs of asymptotic normality. Unlike in short memory models, it is not straightforward to show that the truncation has a negligible effect. (A minimal CSS sketch for the simplest long-memory model follows this entry.)
    Keywords: Long memory, conditional-sum-of-squares estimation, central limit theorem, almost sure convergence.
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/505&r=ets
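    Illustrative sketch: for the simplest long-memory case, ARFIMA(0, d, 0), a textbook CSS estimator truncates the AR(infinity) representation (1 - L)^d y_t = e_t at the start of the sample (the truncation whose effect the paper analyzes) and minimizes the sum of squared residuals over d. This is only an illustration, not the paper's general treatment.

      # CSS estimation of d in ARFIMA(0, d, 0): residuals
      # e_t = sum_{j=0}^{t} pi_j(d) y_{t-j}, pi_j the coefficients of (1 - L)^d.
      import numpy as np
      from scipy.optimize import minimize_scalar

      def frac_diff_coefs(d, n):
          pi = np.empty(n)
          pi[0] = 1.0
          for j in range(1, n):
              pi[j] = pi[j - 1] * (j - 1 - d) / j   # recursion for (1 - L)^d
          return pi

      def css(d, y):
          pi = frac_diff_coefs(d, len(y))
          e = np.array([pi[:t + 1] @ y[t::-1] for t in range(len(y))])
          return np.sum(e ** 2)

      rng = np.random.default_rng(0)
      y = rng.standard_normal(400)                  # white noise: true d = 0
      d_hat = minimize_scalar(css, bounds=(-0.49, 0.49), args=(y,),
                              method="bounded").x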
  5. By: Gregory Connor; Oliver Linton
    Abstract: We introduce an alternative version of the Fama-French three-factor model of stock returns together with a new estimation methodology. We assume that the factor betas in the model are smooth nonlinear functions of observed security characteristics. We develop an estimation procedure that combines nonparametric kernel methods for constructing mimicking portfolios with parametric nonlinear regression to estimate factor returns and factor betas simultaneously. The methodology is applied to US common stocks and the empirical findings compared to those of Fama and French.
    Keywords: characteristic-based factor model, arbitrage pricing theory, kernel estimation, nonparametric estimation.
    JEL: G12 C14
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/506&r=ets
  6. By: Ilze Kalnina; Oliver Linton
    Abstract: We propose an econometric model that captures the effects of market microstructure on a latent price process. In particular, we allow for correlation between the measurement error and the return process, and we allow the measurement error process to have diurnal heteroskedasticity. We propose a modification of the TSRV estimator of quadratic variation. We show that this estimator is consistent, with a rate of convergence that depends on the size of the measurement error but is no worse than n^{1/6}. We investigate in simulation experiments the finite sample performance of various proposed implementations. (A sketch of the unmodified TSRV estimator follows this entry.)
    Keywords: Endogenous noise, Market Microstructure, Realised Volatility, Semimartingale
    JEL: C12
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cep:stiecm:/2006/509&r=ets
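    Illustrative sketch: the estimator being modified is the two-scales realised volatility (TSRV) estimator; a plain, unmodified version is sketched below for orientation. The noisy log-price array p and the number of subsampling grids K are assumptions of the illustration.

      # Plain TSRV estimate of quadratic variation from noisy log prices
      # (the paper proposes a modification of this estimator).
      import numpy as np

      def tsrv(p, K):
          n = len(p) - 1                                    # number of returns
          rv_all = np.sum(np.diff(p) ** 2)                  # fast-scale RV
          rv_slow = np.mean([np.sum(np.diff(p[k::K]) ** 2)  # K sparse-grid RVs
                             for k in range(K)])
          n_bar = (n - K + 1) / K                           # avg sparse-grid size
          return rv_slow - (n_bar / n) * rv_all             # bias-corrected estimate

      # usage: qv_hat = tsrv(np.log(prices), K=20)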
  7. By: Stanislav Anatolyev (NES)
    Abstract: This article surveys estimation in stationary time series models using the approach of optimal instrumentation. We review tools that allow construction and implementation of optimal instrumental variables estimators in various circumstances: in single- and multiperiod models, in the absence and presence of conditional heteroskedasticity, and with linear and nonlinear instruments. We also discuss issues adjacent to the theme of optimal instruments. The article is directed primarily towards practitioners, but may also be useful to econometric theorists and teachers of graduate econometrics. (A sketch of the simplest optimal-linear-instrument case follows this entry.)
    Keywords: Instrumental variables estimation; Moment restrictions; Optimal instrument; Efficiency bounds; Stationary time series.
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0069&r=ets
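    Illustrative sketch of one special case treated in such surveys: in a linear model y = X*beta + e with conditional homoskedasticity, restricting attention to linear functions of the instrument vector Z, the optimal combination is the linear projection of X on Z, which is what two-stage least squares computes. The array names below are made up for the illustration.

      # Optimal *linear* instrument under conditional homoskedasticity:
      # X_hat = Z (Z'Z)^{-1} Z'X, then beta = (X_hat'X)^{-1} X_hat'y (i.e. 2SLS).
      import numpy as np

      def tsls(y, X, Z):
          X_hat = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)     # project X on Z
          return np.linalg.solve(X_hat.T @ X, X_hat.T @ y)  # IV step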
  8. By: Stanislav Anatolyev (NES)
    Abstract: We develop and evaluate sequential testing tools for a class of nonparametric tests for predictability of financial returns that includes, in particular, the directional accuracy and excess profitability tests. We consider both the retrospective context, where a researcher wants to track predictability over time in a historical sample, and the monitoring context, where a researcher conducts testing as new observations arrive. Throughout, we elaborate on both two-sided and one-sided testing, focusing on linear monitoring boundaries that are continuations of horizontal lines corresponding to retrospective critical values. We illustrate our methodology by testing for directional and mean predictability of returns in a dozen young stock markets in Eastern Europe. (A simplified directional check is sketched after this entry.)
    Keywords: Testing, monitoring, predictability, stock returns
    JEL: C12 C22 C52 C53
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0071&r=ets
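    Illustrative sketch, deliberately simplified and not the paper's test: compare the fraction of correctly predicted return signs with the benchmark implied by independence, using a plain normal approximation that treats the benchmark as known. The actual directional accuracy and excess profitability tests, and their sequential monitoring versions, use properly adjusted variances.

      # Naive directional check: hit rate vs. the independence benchmark
      # p* = p_y p_x + (1 - p_y)(1 - p_x), ignoring estimation of p*.
      import numpy as np
      from scipy.stats import norm

      def naive_directional_check(forecast, realized):
          up_f = (forecast > 0).astype(float)
          up_r = (realized > 0).astype(float)
          n = len(realized)
          p_hat = np.mean(up_f == up_r)                       # observed hit rate
          p_star = (up_f.mean() * up_r.mean()
                    + (1 - up_f.mean()) * (1 - up_r.mean()))  # independence benchmark
          z = (p_hat - p_star) / np.sqrt(p_star * (1 - p_star) / n)
          return z, 1 - norm.cdf(z)                           # one-sided p-value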
  9. By: Doz, Catherine; Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper considers quasi-maximum likelihood estimation of a dynamic approximate factor model when the panel of time series is large. Maximum likelihood is analyzed under different sources of misspecification: omitted serial correlation of the observations and cross-sectional correlation of the idiosyncratic components. It is shown that the effects of misspecification on the estimation of the common factors are negligible when the sample size (T) and the cross-sectional dimension (n) are large. The estimator is feasible when n is large and is easily implementable using the Kalman smoother and the EM algorithm, as in traditional factor analysis. Simulation results illustrate the empirical conditions under which we can expect improvements over the simple principal components estimators considered by Bai (2003), Bai and Ng (2002), Forni, Hallin, Lippi, and Reichlin (2000, 2005b), and Stock and Watson (2002a,b). (A sketch of the principal components benchmark follows this entry.)
    Keywords: factor model; large cross-sections; Quasi Maximum Likelihood
    JEL: C32 C33 C51
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5724&r=ets
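    Illustrative sketch of the benchmark against which the quasi-maximum likelihood approach is compared: principal components extraction of r common factors from a T x n panel X. The panel X and the number of factors r are assumptions of the illustration; the paper's Kalman smoother / EM steps are not reproduced.

      # Principal components factor extraction from a large panel.
      import numpy as np

      def pc_factors(X, r):
          Xs = (X - X.mean(0)) / X.std(0)                  # standardize each series
          eigval, eigvec = np.linalg.eigh(Xs.T @ Xs / len(Xs))
          loadings = eigvec[:, ::-1][:, :r]                # r leading eigenvectors
          factors = Xs @ loadings                          # estimated common factors
          return factors, loadings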
  10. By: Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper asks two questions. First, can we detect empirically whether the shocks recovered from the estimates of a structural VAR are truly structural? Second, can the problem of non-fundamentalness be solved by considering additional information? The answer to the first question is 'yes' and that to the second is 'under some conditions'.
    Keywords: identification; information; invertibility; structural VAR
    JEL: C32 C33 E00 E32 O3
    Date: 2006–06
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5725&r=ets
  11. By: Benati, Luca
    Abstract: We use tests for multiple breaks at unknown points in the sample, and the Stock-Watson (1996, 1998) time-varying parameters median-unbiased estimation methodology, to investigate changes in the equilibrium rate of growth of labor productivity, both per hour and per worker, in the United States, the Eurozone, Australia, and Japan over the post-WWII era. Results for the U.S. capture well the 'conventional wisdom' of a golden era of high productivity growth in the 1950s and 1960s; a marked deceleration starting at the beginning of the 1970s; and a strong growth resurgence starting in the mid-1990s. Interestingly, the evidence suggests that the 1990s productivity acceleration has reached a plateau over the last few years. Results for the Eurozone point towards a marked deceleration since the beginning of the 1980s, with the equilibrium rate of growth of output per hour falling to 0.9% in 2004:4. Results based on Cochrane's variance ratio estimator suggest that a non-negligible fraction of the quarter-on-quarter change in labor productivity growth is permanent. From a technical point of view, we propose a new method for constructing confidence intervals for variance ratio estimates based on spectral bootstrapping. Preliminary Monte Carlo evidence suggests that this method possesses good coverage properties. (A basic variance ratio sketch follows this entry.)
    Keywords: bootstrapping; frequency domain; median-unbiased estimation; Monte Carlo integration; structural break tests; time-varying parameters; variance ratio
    JEL: E30 E32
    Date: 2006–08
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5801&r=ets
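    Illustrative sketch: the variance ratio statistic referred to above compares the variance of k-period growth with k times the variance of one-period growth; values well below 1 suggest mostly transitory variation, values near or above 1 a sizeable permanent component. The growth series g and horizon k are assumptions of the illustration, and the paper's spectral bootstrap confidence intervals are not reproduced.

      # Basic variance ratio VR(k) = Var(sum of k successive growth rates)
      #                              / (k * Var(one-period growth rate)).
      import numpy as np

      def variance_ratio(g, k):
          g = np.asarray(g, dtype=float)
          k_period = np.convolve(g, np.ones(k), mode="valid")   # rolling k-sums
          return np.var(k_period, ddof=1) / (k * np.var(g, ddof=1))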
  12. By: De Mol, Christine; Giannone, Domenico; Reichlin, Lucrezia
    Abstract: This paper considers Bayesian regression with normal and double exponential priors as forecasting methods based on large panels of time series. We show that, empirically, these forecasts are highly correlated with principal component forecasts and that they perform equally well for a wide range of prior choices. Moreover, we study the asymptotic properties of Bayesian regression under a Gaussian prior, under the assumption that the data are quasi-collinear, to establish a criterion for setting the parameters in a large cross-section. (A ridge-regression sketch of the Gaussian-prior case follows this entry.)
    Keywords: Bayesian VAR; large cross-sections; Lasso regression; principal components; ridge regressions
    JEL: C11 C13 C33 C53
    Date: 2006–09
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:5829&r=ets
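    Illustrative sketch: the Gaussian-prior case corresponds to ridge regression. Below is a minimal one-step-ahead ridge forecast from a panel of predictors; X, y, the shrinkage parameter lam and the new predictor row x_new are assumptions of the illustration.

      # Ridge (Gaussian-prior) forecast: beta = (X'X + lam I)^{-1} X'y.
      import numpy as np

      def ridge_forecast(X, y, x_new, lam):
          n_pred = X.shape[1]
          beta = np.linalg.solve(X.T @ X + lam * np.eye(n_pred), X.T @ y)
          return x_new @ beta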
  13. By: Esther Ruiz; Helena Veiga
    Abstract: In this paper, we propose a new stochastic volatility model, called A-LMSV, to cope simultaneously with the leverage effect and long memory. We derive its statistical properties and compare them with the properties of the FIEGARCH model. We show that the dependence of the autocorrelations of squares on the parameters measuring asymmetry and persistence differs between the two models. The kurtosis and autocorrelations of squares do not depend on the asymmetry in the A-LMSV model, while they increase with the asymmetry in the FIEGARCH model. Furthermore, the autocorrelations of squares increase with the persistence in the A-LMSV model and decrease in the FIEGARCH model. On the other hand, the autocorrelations of absolute returns increase with the magnitude of the asymmetry in the FIEGARCH model, while they can increase or decrease depending on the sign of the asymmetry in the A-LMSV model. Finally, the cross-correlations between squares and original observations are, in general, larger in the FIEGARCH model than in the A-LMSV model. The results are illustrated by fitting both models to represent the dynamic evolution of the volatilities of daily returns of the S&P500 and DAX indexes. (A generic long-memory SV specification is sketched after this entry.)
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws066016&r=ets
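    Illustrative sketch of the model class, assumed for orientation only; the paper's exact A-LMSV parameterization may differ. A generic long-memory stochastic volatility model with leverage can be written as

        y_t = \sigma_* \exp(h_t / 2)\,\varepsilon_t, \qquad (1 - \phi L)(1 - L)^d h_t = \eta_t, \qquad \operatorname{corr}(\varepsilon_t, \eta_{t+1}) = \rho,

    where d governs the long memory of the log-volatility and a negative \rho captures the leverage effect.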
  14. By: Ke-Li Xu (Dept. of Economics, Yale University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Stable autoregressive models of known finite order are considered with martingale difference errors scaled by an unknown nonparametric time-varying function generating heterogeneity. An important special case involves structural change in the error variance, but in most practical cases the pattern of variance change over time is unknown and may involve shifts at unknown discrete points in time, continuous evolution, or combinations of the two. This paper develops kernel-based estimators of the residual variances and associated adaptive least squares (ALS) estimators of the autoregressive coefficients. These are shown to be asymptotically efficient, having the same limit distribution as the infeasible generalized least squares (GLS) estimator. Comparisons of the efficient procedure and ordinary least squares (OLS) reveal that least squares can be extremely inefficient in some cases while nearly optimal in others. Simulations show that, when least squares works well, the adaptive estimators perform comparably well, whereas when least squares works poorly, major efficiency gains are achieved by the new estimators. (A two-step sketch of the approach follows this entry.)
    Keywords: Adaptive estimation, Autoregression, Heterogeneity, Weighted regression
    JEL: C14 C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1585&r=ets
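    Illustrative sketch of the general approach, not the paper's exact estimator: fit the autoregression by OLS, kernel-smooth the squared residuals over time to estimate the variance path, then re-estimate by weighted least squares. An AR(1) and a Gaussian kernel with an assumed bandwidth are used for concreteness.

      # Kernel-based adaptive least squares for an AR(1) with time-varying variance.
      import numpy as np

      def adaptive_ar1(y, bandwidth=0.1):
          x, z = y[1:], y[:-1]
          a_ols = (z @ x) / (z @ z)                        # step 1: OLS slope
          u2 = (x - a_ols * z) ** 2                        # squared residuals
          t = np.linspace(0, 1, len(u2))
          K = np.exp(-0.5 * ((t[:, None] - t[None, :]) / bandwidth) ** 2)
          sigma2 = (K @ u2) / K.sum(axis=1)                # step 2: smoothed variance path
          w = 1.0 / sigma2
          a_als = (w * z @ x) / (w * z @ z)                # step 3: weighted LS slope
          return a_als, sigma2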
  15. By: Offer Lieberman (Technion-Israel Institute of Technology); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: An infinite-order asymptotic expansion is given for the autocovariance function of a general stationary long-memory process with memory parameter d in (-1/2, 1/2). The class of spectral densities considered includes as a special case the stationary and invertible ARFIMA(p,d,q) model. The leading term of the expansion is of the order O(1/k^{1-2d}), where k is the autocovariance order, consistent with the well-known power law decay for such processes, and is shown to be accurate to an error of O(1/k^{3-2d}). The derivation uses Erdélyi's (1956) expansion for Fourier-type integrals when there are critical points at the boundaries of the range of integration - here the frequencies {0, 2π}. Numerical evaluations show that the expansion is accurate even for small k in cases where the autocovariance sequence decays monotonically, and in other cases for moderate to large k. The approximations are easy to compute across a variety of parameter values and models. (The leading term is displayed after this entry.)
    Keywords: Autocovariance, Asymptotic expansion, Critical point, Fourier integral, Long memory
    JEL: C13 C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1586&r=ets
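    Illustrative display of the leading behaviour described above (the constant c_d, which depends on the short-memory part of the spectrum, is introduced here only for the display):

        \gamma(k) = c_d\, k^{2d - 1} + O\!\left(k^{2d - 3}\right), \qquad k \to \infty, \qquad d \in (-\tfrac{1}{2}, \tfrac{1}{2}).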
  16. By: Chang Sik Kim (Dept. of Economics, Ewha Women's University); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: Estimation of the memory parameter (d) is considered for models of nonstationary fractionally integrated time series with d > 1/2. It is shown that the log periodogram regression estimator of d is inconsistent when 1 < d < 2 and is consistent when 1/2 < d ≤ 1. For d > 1, the estimator is shown to converge in probability to unity. (A minimal log periodogram regression sketch follows this entry.)
    Keywords: Discrete Fourier transform, Fractional Brownian motion, Fractional integration, Inconsistency, Log periodogram regression, Long memory parameter, Nonstationarity, Semiparametric estimation
    JEL: C22
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1587&r=ets
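    Illustrative sketch of the estimator under study: the log periodogram (GPH-type) regression of d, here in its simple form with regressor -2 log(lambda_j). The bandwidth choice m = n^{0.65} is an assumption of the illustration.

      # Log periodogram regression estimate of d:
      # regress log I(lambda_j) on -2*log(lambda_j), j = 1, ..., m.
      import numpy as np

      def gph(y, m=None):
          n = len(y)
          m = m or int(n ** 0.65)                           # bandwidth (assumption)
          lam = 2 * np.pi * np.arange(1, m + 1) / n         # Fourier frequencies
          I = np.abs(np.fft.fft(y)[1:m + 1]) ** 2 / (2 * np.pi * n)  # periodogram
          x = -2 * np.log(lam)
          x_c, y_c = x - x.mean(), np.log(I) - np.log(I).mean()
          return (x_c @ y_c) / (x_c @ x_c)                  # OLS slope = d_hat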
  17. By: José Fajardo (IBMEC Business School - Rio de Janeiro); Ernesto Mordecki (Centro de Matemática, Facultad de Ciências, Universidad de la República, Uruguay)
    Abstract: We study the skewness premium (SK) introduced by Bates (1991) in a general context using Lévy processes. We obtain necessary and sufficient conditions for Bates' x% rule to hold. Then, we derive sufficient conditions for SK to be positive, in terms of the characteristic triplet of the Lévy process under the risk-neutral measure.
    Keywords: Skewness Premium, Lévy processes
    JEL: C52 G10
    Date: 2006–10–24
    URL: http://d.repec.org/n?u=RePEc:ibr:dpaper:2006-04&r=ets
  18. By: Tatsuma Wada (Department of Economics, Boston University); Pierre Perron (Department of Economics, Boston University)
    Abstract: This paper first generalizes the trend-cycle decomposition framework of Perron and Wada (2005), based on an unobserved components model with innovations having a mixture of normals distribution, which is able to handle sudden level and slope changes to the trend function as well as outliers. Second, we investigate how important the differences in the implied trend and cycle are compared to the popular decomposition based on the Hodrick and Prescott (HP) (1997) filter. Our results show important qualitative and quantitative differences in the implied cycles for both real GDP and consumption series for the G7 countries. Most of the differences can be ascribed to the fact that the HP filter does not handle well slope changes, level shifts and outliers, while our method does so. Third, we assess how such different cycles affect some so-called "stylized facts" about the relative variability of consumption and output across countries. Our results again show important differences. In particular, the cross-country consumption correlations are generally higher than the output correlations, except for the period from 1975 to 1985, provided Canada is excluded. Our results therefore provide a partial solution to this puzzle. The evidence is particularly strong for the most recent period. (A minimal HP filter sketch follows this entry.)
    Keywords: Trend-Cycle Decomposition, Unobserved Components Model, International Business Cycle, Non Gaussian Filter.
    JEL: C22 E32
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:bos:wpaper:wp2005-44&r=ets
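    Illustrative sketch of the comparison benchmark above: the HP filter trend minimizes the sum of squared deviations from the series plus lambda times the sum of squared second differences of the trend, and has a simple closed form. The conventional quarterly value lambda = 1600 is assumed; the paper's mixture-of-normals state-space decomposition is not reproduced here.

      # Hodrick-Prescott filter: tau = (I + lam * D'D)^{-1} y,
      # with D the (n-2) x n second-difference matrix.
      import numpy as np

      def hp_filter(y, lam=1600.0):
          n = len(y)
          D = np.diff(np.eye(n), n=2, axis=0)               # second differences
          tau = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
          return tau, y - tau                               # trend, cycle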

This nep-ets issue is ©2006 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.