nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008‒03‒25
eight papers chosen by
Yong Yin
SUNY at Buffalo

  1. An Investigation of the Cycle Extraction Properties of Several Bandpass Filters Used to Identify Business Cycles By Melvin J. Hinich; John Foster; Philip Wild
  2. The Elusive Persistence: Revisiting Useful Approaches to Data-Rich Macroeconomic Forecasting By Jan J.J. Groen; George Kapetanios
  3. THRET: Threshold Regression with Endogenous Threshold Variables By Andros Kourtellos; Thanasis Stengos; Chih Ming Tan
  4. Panel Unit Root Tests in the Presence of a Multifactor Error Structure By M. Hashem Pesaran, L. Vanessa Smith, Takashi Yamagata
  5. Stochastic Volatility: Origins and Overview By Neil Shephard; Torben G. Andersen
  6. Modeling Smooth Structural Changes in the Trend of US Real GDP By Ting Qin; Walter Enders
  7. Advances in the Theta model By Konstantinos Nikolopoulos; Vassilios Assimakopoulos; Nikolaos Bougioukos; Fotios Petropoulos
  8. GLS-detrending and Regime-wise Stationarity Testing in Small Samples By Claude Lopez

  1. By: Melvin J. Hinich; John Foster; Philip Wild (School of Economics, The University of Queensland)
    Abstract: The purpose of this article is to investigate the ability of bandpass filters commonly used in economics to extract a known periodicity. The specific bandpass filters investigated include a Discrete Fourier Transform (DFT) filter, together with those proposed by Hodrick and Prescott (1997) and Baxter and King (1999). Our focus on the cycle extraction properties of these filters reflects the lack of attention that has been given to this issue in the literature, when compared, for example, to studies of the trend removal properties of some of these filters. The artificial data series we use are designed so that one periodicity deliberately falls within the passband while another falls outside. The objective of a filter is to admit the ‘bandpass’ periodicity while excluding the periodicity that falls outside the passband range. We find that the DFT filter has the best extraction properties. The filtered data series produced by both the Hodrick-Prescott and Baxter-King filters are found to admit low frequency components that should have been excluded.
    Date: 2008
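As a rough illustration of the ideal frequency-domain extraction the paper takes as its benchmark, the sketch below (ours, not the authors' code; all names are invented) implements a DFT bandpass filter with NumPy: transform the series, zero every coefficient whose period falls outside the passband, and invert.

```python
import numpy as np

def dft_bandpass(x, low_period, high_period):
    """Keep only cycles whose period lies in [low_period, high_period].

    A minimal frequency-domain bandpass: FFT the series, zero the
    coefficients outside the passband, inverse-FFT back.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    fx = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n, d=1.0)              # cycles per observation
    # Periods are 1/frequency; the zero-frequency (mean) bin gets period inf.
    periods = np.where(freqs > 0, 1.0 / np.maximum(freqs, 1e-12), np.inf)
    keep = (periods >= low_period) & (periods <= high_period)
    fx[~keep] = 0.0
    return np.fft.irfft(fx, n=n)

# Two sinusoids: a 32-period cycle inside a business-cycle band (6-40)
# and a 4-period cycle that should be excluded.
t = np.arange(256)
x = np.sin(2 * np.pi * t / 32) + np.sin(2 * np.pi * t / 4)
filtered = dft_bandpass(x, low_period=6, high_period=40)
```

On this artificial two-sinusoid series the 4-period cycle is removed exactly while the 32-period cycle passes through, which is the extraction property the paper evaluates for the Hodrick-Prescott and Baxter-King approximations.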
  2. By: Jan J.J. Groen (Federal Reserve Bank of New York); George Kapetanios (Queen Mary, University of London)
    Abstract: This paper revisits a number of data-rich prediction methods, like factor models, Bayesian ridge regression and forecast combinations, which are widely used in macroeconomic forecasting, and compares these with a lesser known alternative method: partial least squares regression. Under the latter, linear, orthogonal combinations of a large number of predictor variables are constructed such that these linear combinations maximize the covariance between the target variable and each of the common components constructed from the predictor variables. We provide a theorem that shows that when the data comply with a factor structure, principal components and partial least squares regressions provide asymptotically similar results. We also argue that forecast combinations can be interpreted as a restricted form of partial least squares regression. Monte Carlo experiments confirm our theoretical result that principal components and partial least squares regressions are asymptotically similar when the data has a factor structure. These experiments also indicate that when there is no factor structure in the data, partial least squares regression outperforms both principal components and Bayesian ridge regressions. Finally, we apply partial least squares, principal components and Bayesian ridge regressions on a large panel of monthly U.S. macroeconomic and financial data to forecast, for the United States, CPI inflation, core CPI inflation, industrial production, unemployment and the federal funds rate across different sub-periods. The results indicate that partial least squares regression usually has the best out-of-sample performance relative to the two other data-rich prediction methods.
    Keywords: Macroeconomic forecasting, Factor models, Forecast combination, Principal components, Partial least squares, (Bayesian) ridge regression
    JEL: C22 C53 E37 E47
    Date: 2008–03
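A minimal sketch of the one-target partial least squares construction the abstract describes (ours, not the authors' implementation; function names are invented): each weight vector is the covariance of the deflated predictors with the target, so the resulting scores maximize covariance with y, unlike principal components, which ignore the target.

```python
import numpy as np
rng = np.random.default_rng(0)

def pls1_beta(X, y, n_comp=2):
    """Coefficients from one-target PLS (NIPALS-style sketch)."""
    Xd = X - X.mean(axis=0)
    yd = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xd.T @ yd                        # weight = covariance with target
        w /= np.linalg.norm(w)
        t = Xd @ w                           # component score
        p = Xd.T @ t / (t @ t)               # predictor loadings
        c = (yd @ t) / (t @ t)               # target loading
        Xd -= np.outer(t, p)                 # deflate predictors
        yd -= c * t                          # deflate target
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q)   # coefficients on centered X

# Factor-structure data: 20 predictors driven by one common factor.
f = rng.standard_normal(200)
X = np.outer(f, rng.standard_normal(20)) + 0.1 * rng.standard_normal((200, 20))
y = f + 0.1 * rng.standard_normal(200)
beta = pls1_beta(X, y, n_comp=1)
fit = (X - X.mean(axis=0)) @ beta + y.mean()
```

When the predictors obey a factor structure, as in this simulated example, one PLS component tracks the target about as well as the principal-components fit, consistent with the paper's asymptotic equivalence result.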
  3. By: Andros Kourtellos; Thanasis Stengos; Chih Ming Tan
    Abstract: This paper extends the simple threshold regression framework of Hansen (2000) and Caner and Hansen (2004) to allow for endogeneity of the threshold variable. We develop a concentrated two-stage least squares (C2SLS) estimator of the threshold parameter that is based on an inverse Mills ratio bias correction. Our method also allows for endogeneity of the slope variables. We show that our estimator is consistent and investigate its performance with a Monte Carlo simulation, which indicates that the method is applicable in finite samples. We also illustrate its usefulness with an empirical example from economic growth.
    Date: 2008–03
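For orientation, here is a sketch (ours, with invented names) of the exogenous-threshold baseline the paper extends: Hansen (2000)-style threshold least squares, which grid-searches the threshold over sample values of the threshold variable and keeps the value minimizing the sum of squared residuals. The paper's C2SLS inverse-Mills-ratio correction for an endogenous threshold variable is deliberately omitted.

```python
import numpy as np
rng = np.random.default_rng(1)

def threshold_ls(y, x, q, trim=0.15):
    """Grid-search threshold least squares (exogenous-threshold case).

    Fit separate intercepts/slopes in the two regimes q <= gamma and
    q > gamma for each candidate gamma, keeping the SSR-minimizing one.
    """
    grid = np.sort(q)[int(trim * len(q)): int((1 - trim) * len(q))]
    best = (np.inf, None)
    for g in grid:
        lo = q <= g
        ssr = 0.0
        for mask in (lo, ~lo):
            Z = np.column_stack([np.ones(mask.sum()), x[mask]])
            b, *_ = np.linalg.lstsq(Z, y[mask], rcond=None)
            ssr += np.sum((y[mask] - Z @ b) ** 2)
        best = min(best, (ssr, g))
    return best[1]

# Data with a true threshold at q = 0: slope 1 below, slope 3 above.
n = 400
q = rng.standard_normal(n)
x = rng.standard_normal(n)
y = np.where(q <= 0.0, 1.0 * x, 3.0 * x) + 0.1 * rng.standard_normal(n)
gamma_hat = threshold_ls(y, x, q)
```

Because q here is generated independently of the errors, plain grid search recovers the threshold; the paper's contribution is precisely the case where q is endogenous and this estimator is biased.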
  4. By: M. Hashem Pesaran; L. Vanessa Smith; Takashi Yamagata
    Abstract: This paper extends the cross sectionally augmented panel unit root test proposed by Pesaran (2007) to the case of a multifactor structure. The basic idea is to exploit information regarding the unobserved factors that are shared by other time series in addition to the variable under consideration. Importantly, our test procedure only requires specification of the maximum number of factors, in contrast to other panel unit root tests based on principal components that additionally require estimation of the number of factors as well as the factors themselves. Small sample properties of the proposed test are investigated by Monte Carlo experiments, which suggest that it controls size well in almost all cases, especially in the presence of serial correlation in the error term, in contrast to alternative test statistics. Empirical applications to Fisher's inflation parity and real equity prices across different markets illustrate how the proposed test works in practice.
    Keywords: Panel unit root tests, Cross Section Dependence, Multi-factor Residual Structure, Fisher Inflation Parity, Real Equity Prices.
    JEL: C12 C15 C22 C23
    Date: 2008–03
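The single-factor starting point the paper generalizes can be sketched as follows (our illustration, not the authors' code): each unit's Dickey-Fuller regression is augmented with the lagged cross-section mean and its first difference, which proxy the unobserved common factor; averaging the t-statistics over units gives the CIPS statistic of Pesaran (2007).

```python
import numpy as np
rng = np.random.default_rng(2)

def cadf_tstat(y_i, y_bar):
    """t-statistic on y_{i,t-1} from a cross-sectionally augmented DF
    regression: Delta y_it on a constant, y_{i,t-1}, the lagged
    cross-section mean, and the differenced cross-section mean."""
    dy, dybar = np.diff(y_i), np.diff(y_bar)
    Z = np.column_stack([np.ones(len(dy)), y_i[:-1], y_bar[:-1], dybar])
    b, *_ = np.linalg.lstsq(Z, dy, rcond=None)
    resid = dy - Z @ b
    s2 = resid @ resid / (len(dy) - Z.shape[1])
    se = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
    return b[1] / se

# Panel of 10 stationary AR(1) units sharing one common factor.
T, N = 120, 10
f = np.cumsum(rng.standard_normal(T)) * 0.2           # common factor
Y = np.empty((N, T))
for i in range(N):
    e = rng.standard_normal(T)
    y = np.zeros(T)
    for s in range(1, T):
        y[s] = 0.5 * y[s - 1] + f[s] + e[s]
    Y[i] = y
cips = np.mean([cadf_tstat(Y[i], Y.mean(axis=0)) for i in range(N)])
```

With stationary idiosyncratic dynamics the averaged t-statistic is strongly negative; the paper's extension adds further cross-section averages so that several unobserved factors can be absorbed the same way.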
  5. By: Neil Shephard; Torben G. Andersen
    Abstract: In this paper we review the history and recent developments of stochastic volatility, which is the main way financial economists and mathematical finance specialists model time-varying volatility.
    JEL: C01 C14 C32
    Date: 2008
  6. By: Ting Qin; Walter Enders (Department of Economics, St. Cloud State University)
    Abstract: A key feature of Gallant’s Flexible Fourier Form is that the essential characteristics of one or more structural breaks can be captured using a small number of low frequency components from a Fourier approximation. We introduce a variant of the Flexible Fourier Form into the trend function of U.S. real GDP in order to allow for gradual effects of unknown numbers of structural breaks occurring at unknown dates. We find that the Fourier components are significant and that there are multiple breaks in the trend. In addition to the productivity slowdown in the 1970s, our trend also captures a productivity resumption in the late 1990s and a slowdown in the late 1950s. Our cycle corresponds very closely to the NBER chronology. We compare the decomposition from our model with those from a standard unobserved components model, the HP filter, and the Perron and Wada (2005) model. We find that our decomposition has several favorable characteristics over the other models and has very different implications for the recovery from the recent recession.
    Keywords: Flexible Fourier Form, Smooth Trend Breaks, Fourier Approximation
    Date: 2007–01
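The mechanics of a Flexible-Fourier-Form trend can be sketched in a few lines (our illustration under invented data, not the authors' specification): regress the series on a constant, a linear term, and sin/cos pairs at the lowest frequencies, so that a handful of smooth components mimic gradual breaks without specifying their number or dates.

```python
import numpy as np
rng = np.random.default_rng(3)

def fourier_trend(y, n_freq=2):
    """OLS trend with constant, linear term, and n_freq sin/cos pairs."""
    T = len(y)
    t = np.arange(1, T + 1)
    cols = [np.ones(T), t]
    for k in range(1, n_freq + 1):
        cols += [np.sin(2 * np.pi * k * t / T), np.cos(2 * np.pi * k * t / T)]
    Z = np.column_stack(cols)
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Z @ b

# A linear trend with a smooth mid-sample slowdown, plus noise.
T = 200
t = np.arange(1, T + 1)
true_trend = 0.05 * t - 3.0 / (1 + np.exp(-(t - 100) / 10))  # logistic break
y = true_trend + 0.2 * rng.standard_normal(T)
trend_hat = fourier_trend(y, n_freq=2)
```

Even with only two frequencies the fitted trend tracks the smooth logistic break closely, which is the sense in which low-frequency Fourier terms stand in for structural breaks at unknown dates.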
  7. By: Konstantinos Nikolopoulos; Vassilios Assimakopoulos; Nikolaos Bougioukos; Fotios Petropoulos
    Abstract: The Theta model attracted considerable interest in academic circles owing to its surprising performance in the M3-competition. However, this interest was not followed by a large number of studies, with the exception of Hyndman and Billah in 2003. The present study discusses the advances made in the model over the last five years and attempts to provide further insight into the research question: “Is the Theta model just a special case of Simple Exponential Smoothing with drift (SES-d)?” If we do not use equally weighted extrapolations of the two specific Theta lines L(θ=0) and L(θ=2), then we end up with a far more generic model than Simple Exponential Smoothing. The paper also examines an optimized version of SES-d in order to test the results of Hyndman and Billah. In contrast to their findings, the Theta model outperforms SES-d on the Quarterly-M3 and Other-M3 subsets by 0.30% and 0.36% respectively, when the Symmetric Mean Absolute Percentage Error is used to measure accuracy.
    Keywords: Decomposition, Extrapolation, Theta model, Exponential Smoothing, M3-Competition.
    Date: 2008
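The classical equally weighted case the abstract refers to can be sketched as follows (our simplified illustration; function names, the smoothing constant, and the toy series are ours): decompose the data into Theta lines L(θ) = trend + θ·(data − trend), extrapolate L(θ=0) (the linear trend) and smooth L(θ=2) with SES, and average the two forecasts.

```python
import numpy as np

def theta_line(y, theta):
    """Theta line L(theta): linear trend plus theta times the
    deviations from it; theta=0 is the trend, theta=2 doubles the
    local curvature."""
    t = np.arange(len(y))
    b1, b0 = np.polyfit(t, y, 1)
    trend = b0 + b1 * t
    return trend + theta * (y - trend)

def ses(y, alpha=0.3):
    """Simple exponential smoothing; returns the final (flat) forecast level."""
    level = y[0]
    for obs in y[1:]:
        level = alpha * obs + (1 - alpha) * level
    return level

def theta_forecast(y, h=1):
    """Classical Theta forecast: average the extrapolated L(0) trend
    and the SES forecast of L(2) -- the equally weighted special case
    the abstract contrasts with the more general model."""
    t = np.arange(len(y))
    b1, b0 = np.polyfit(t, y, 1)
    line0 = b0 + b1 * (len(y) - 1 + h)   # extrapolated linear trend
    line2 = ses(theta_line(y, 2.0))      # SES flat forecast of L(2)
    return 0.5 * (line0 + line2)

y = np.array([10.0, 12.0, 11.5, 13.0, 14.2, 13.8, 15.1, 16.0])
fcast = theta_forecast(y, h=1)
```

Replacing the fixed 0.5/0.5 weights, the θ values, or the extrapolation methods yields the more generic family the abstract argues goes beyond SES with drift.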
  8. By: Claude Lopez
    Abstract: This paper proposes a version of the DF-GLS test that incorporates up to two breaks in the intercept, namely the DF-GLSTB test. While the asymptotic properties of the DF-GLS test remain valid, the presence of changes in the intercept has an impact on the small sample properties of the test. Hence, finite sample critical values for the DF-GLSTB test are tabulated, while a Monte Carlo study highlights its enhanced power.
    Date: 2008
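The GLS-detrending step underlying the test can be sketched as follows (our illustration of the standard Elliott-Rothenberg-Stock device, not the paper's code; the constant-only case uses the conventional c̄ = −7): quasi-difference the series and the deterministics with α = 1 + c̄/T, estimate the mean by OLS on the quasi-differenced data, and subtract it before running the Dickey-Fuller regression.

```python
import numpy as np

def gls_demean(y, c_bar=-7.0):
    """GLS-demean a series: quasi-difference y and the constant with
    alpha = 1 + c_bar/T, estimate the mean by OLS on the quasi-
    differenced data, and subtract it. The paper's DF-GLSTB test
    applies the same idea with up to two intercept breaks added to
    the deterministic component."""
    T = len(y)
    a = 1.0 + c_bar / T
    z = np.ones(T)
    yq = np.concatenate([[y[0]], y[1:] - a * y[:-1]])
    zq = np.concatenate([[z[0]], z[1:] - a * z[:-1]])
    mu = (zq @ yq) / (zq @ zq)
    return y - mu * z

rng = np.random.default_rng(4)
y = 5.0 + np.cumsum(rng.standard_normal(300)) * 0.1  # random walk around 5
y_detrended = gls_demean(y)
```

Applied to a constant series, the procedure removes the level exactly; with breaks in the intercept, the paper shows the small-sample critical values of the subsequent unit root test must be re-tabulated.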

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.