
on Econometric Time Series 
By:  Melvin J. Hinich; John Foster; Philip Wild (School of Economics, The University of Queensland) 
Abstract:  The purpose of this article is to investigate the ability of bandpass filters commonly used in economics to extract a known periodicity. The specific bandpass filters investigated include a Discrete Fourier Transform (DFT) filter, together with those proposed by Hodrick and Prescott (1997) and Baxter and King (1999). Our focus on the cycle extraction properties of these filters reflects the lack of attention that has been given to this issue in the literature, when compared, for example, to studies of the trend removal properties of some of these filters. The artificial data series we use are designed so that one periodicity deliberately falls within the passband while another falls outside. The objective of a filter is to admit the ‘bandpass’ periodicity while excluding the periodicity that falls outside the passband range. We find that the DFT filter has the best extraction properties. The filtered data series produced by both the Hodrick-Prescott and Baxter-King filters are found to admit low-frequency components that should have been excluded. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:qld:uq2004:358&r=ets 
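The experimental design described above can be illustrated with a small numpy sketch (not the authors' code): an ideal DFT bandpass filter applied to an artificial series containing one periodicity inside a 6–32 period passband and one outside it. The specific periods (20 and 120) and sample length are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def dft_bandpass(y, low_period, high_period):
    """Ideal DFT bandpass filter: zero every Fourier coefficient whose
    implied period lies outside [low_period, high_period]."""
    n = len(y)
    fy = np.fft.rfft(y)
    freqs = np.fft.rfftfreq(n)  # cycles per observation
    with np.errstate(divide="ignore"):
        periods = np.where(freqs > 0, 1.0 / freqs, np.inf)
    fy[(periods < low_period) | (periods > high_period)] = 0.0
    return np.fft.irfft(fy, n)

# Artificial series in the spirit of the paper's design: a 20-period
# cycle inside the passband plus a 120-period cycle that should be
# excluded (illustrative choices).
t = np.arange(480)
in_band = np.sin(2 * np.pi * t / 20)
out_band = np.sin(2 * np.pi * t / 120)
filtered = dft_bandpass(in_band + out_band, 6, 32)
```

Because both periods fall on exact Fourier frequencies here, the DFT filter recovers the in-band component essentially perfectly; the paper's point is that time-domain approximations such as Hodrick-Prescott and Baxter-King leak low-frequency components instead.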
By:  Jan J.J. Groen (Federal Reserve Bank of New York); George Kapetanios (Queen Mary, University of London) 
Abstract:  This paper revisits a number of data-rich prediction methods, like factor models, Bayesian ridge regression and forecast combinations, which are widely used in macroeconomic forecasting, and compares these with a lesser-known alternative method: partial least squares regression. Under the latter, linear, orthogonal combinations of a large number of predictor variables are constructed such that these linear combinations maximize the covariance between the target variable and each of the common components constructed from the predictor variables. We provide a theorem that shows that when the data comply with a factor structure, principal components and partial least squares regressions provide asymptotically similar results. We also argue that forecast combinations can be interpreted as a restricted form of partial least squares regression. Monte Carlo experiments confirm our theoretical result that principal components and partial least squares regressions are asymptotically similar when the data have a factor structure. These experiments also indicate that when there is no factor structure in the data, partial least squares regression outperforms both principal components and Bayesian ridge regressions. Finally, we apply partial least squares, principal components and Bayesian ridge regressions on a large panel of monthly U.S. macroeconomic and financial data to forecast, for the United States, CPI inflation, core CPI inflation, industrial production, unemployment and the federal funds rate across different subperiods. The results indicate that partial least squares regression usually has the best out-of-sample performance relative to the two other data-rich prediction methods. 
Keywords:  Macroeconomic forecasting, Factor models, Forecast combination, Principal components, Partial least squares, (Bayesian) ridge regression 
JEL:  C22 C53 E37 E47 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp624&r=ets 
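A minimal numpy sketch of partial least squares components via the textbook NIPALS-style deflation, applied to a simulated one-factor design loosely modeled on the paper's Monte Carlo setup; the dimensions, loadings, and noise scale are illustrative assumptions.

```python
import numpy as np

def pls_components(X, y, k):
    """First k partial least squares components of X for target y:
    each weight vector maximizes the covariance between Xw and y,
    followed by deflation of X and y (a textbook sketch)."""
    Xd = X - X.mean(axis=0)
    yd = y - y.mean()
    scores = []
    for _ in range(k):
        w = Xd.T @ yd                      # covariance-maximizing direction
        w /= np.linalg.norm(w)
        t = Xd @ w                         # PLS score (linear combination)
        scores.append(t)
        p = Xd.T @ t / (t @ t)             # X loadings
        Xd = Xd - np.outer(t, p)           # deflate X
        yd = yd - t * (t @ yd) / (t @ t)   # deflate y
    return np.column_stack(scores)

# Simulated factor structure: one common factor drives both the
# predictors and the target, as in the paper's theoretical setting.
rng = np.random.default_rng(0)
f = rng.standard_normal(200)
X = np.outer(f, rng.standard_normal(30)) + 0.1 * rng.standard_normal((200, 30))
y = f + 0.1 * rng.standard_normal(200)
t1 = pls_components(X, y, 1)[:, 0]
```

Under this factor structure the first PLS score is almost perfectly correlated with the latent factor, consistent with the paper's result that PLS and principal components are asymptotically similar in that case.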
By:  Andros Kourtellos; Thanasis Stengos; Chih Ming Tan 
Abstract:  This paper extends the simple threshold regression framework of Hansen (2000) and Caner and Hansen (2004) to allow for endogeneity of the threshold variable. We develop a concentrated two-stage least squares (C2SLS) estimator of the threshold parameter that is based on an inverse Mills ratio bias correction. Our method also allows for the endogeneity of the slope variables. We show that our estimator is consistent and investigate its performance using a Monte Carlo simulation that indicates the applicability of the method in finite samples. We also illustrate its usefulness with an empirical example from economic growth. 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:ucy:cypeua:32008&r=ets 
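The paper's C2SLS estimator adds an inverse Mills ratio correction for an endogenous threshold; the sketch below shows only the Hansen-style least-squares grid search it builds on, with an exogenous threshold variable and illustrative simulated data.

```python
import numpy as np

def threshold_ls(y, x, q, trim=0.15):
    """Least-squares threshold estimate in the two-regime model
    y = b1*x*1{q<=g} + b2*x*1{q>g} + e, by grid search over candidate
    thresholds g (exogenous-threshold case of Hansen 2000)."""
    grid = np.sort(q)
    lo, hi = int(trim * len(q)), int((1 - trim) * len(q))  # trimmed grid
    best_g, best_ssr = None, np.inf
    for g in grid[lo:hi]:
        d = (q <= g).astype(float)
        X = np.column_stack([x * d, x * (1 - d)])      # regime-split regressors
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        ssr = np.sum((y - X @ b) ** 2)
        if ssr < best_ssr:
            best_ssr, best_g = ssr, g
    return best_g

# Illustrative data: the slope jumps from 1 to 3 at threshold 0.5.
rng = np.random.default_rng(1)
q = rng.uniform(0, 1, 400)
x = rng.standard_normal(400)
y = np.where(q <= 0.5, 1.0, 3.0) * x + 0.1 * rng.standard_normal(400)
g_hat = threshold_ls(y, x, q)
```

The grid-search estimator locates the break very precisely here; the paper's contribution is handling the case where q correlates with the error, which this sketch does not address.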
By:  M. Hashem Pesaran, L. Vanessa Smith, Takashi Yamagata 
Abstract:  This paper extends the cross-sectionally augmented panel unit root test proposed by Pesaran (2007) to the case of a multifactor structure. The basic idea is to exploit information regarding the unobserved factors that are shared by other time series in addition to the variable under consideration. Importantly, our test procedure only requires specification of the maximum number of factors, in contrast to other panel unit root tests based on principal components that require in addition the estimation of the number of factors as well as the factors themselves. Small sample properties of the proposed test are investigated by Monte Carlo experiments, which suggest that it controls well for size in almost all cases, especially in the presence of serial correlation in the error term, contrary to alternative test statistics. Empirical applications to Fisher's inflation parity and real equity prices across different markets illustrate how the proposed test works in practice. 
Keywords:  Panel unit root tests, Cross Section Dependence, Multifactor Residual Structure, Fisher Inflation Parity, Real Equity Prices. 
JEL:  C12 C15 C22 C23 
Date:  2008–03 
URL:  http://d.repec.org/n?u=RePEc:yor:yorken:08/03&r=ets 
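A stripped-down sketch of the single-factor idea the paper extends: cross-section averages proxy the common factor, each unit's Dickey-Fuller regression is augmented with those averages, and the unit-level t-ratios are averaged. Lag augmentation and the multifactor extension are omitted, and all simulation settings are illustrative.

```python
import numpy as np

def cips_stat(Y):
    """Cross-sectionally augmented panel unit root statistic (Pesaran-style,
    single-factor version): average over units of the t-ratio on the lagged
    level in a DF regression augmented with cross-section means."""
    T, N = Y.shape
    ybar = Y.mean(axis=1)
    tstats = []
    for i in range(N):
        dy = np.diff(Y[:, i])
        X = np.column_stack([
            np.ones(T - 1),
            Y[:-1, i],        # lagged level of unit i
            ybar[:-1],        # lagged cross-section average (factor proxy)
            np.diff(ybar),    # differenced cross-section average
        ])
        b, *_ = np.linalg.lstsq(X, dy, rcond=None)
        resid = dy - X @ b
        s2 = resid @ resid / (len(dy) - X.shape[1])
        se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
        tstats.append(b[1] / se)
    return np.mean(tstats)

# Illustrative panels: a stationary AR(1) panel and a random-walk panel,
# each driven by one common factor.
rng = np.random.default_rng(2)
T, N = 200, 20
lam = rng.uniform(0.5, 1.5, N)
f = rng.standard_normal(T)
e = rng.standard_normal((T, N))
Y_stat = np.zeros((T, N))
for t in range(1, T):
    Y_stat[t] = 0.5 * Y_stat[t - 1] + lam * f[t] + e[t]
Y_rw = np.cumsum(lam * rng.standard_normal((T, 1)) + rng.standard_normal((T, N)), axis=0)
```

The statistic is strongly negative for the stationary panel and much closer to zero for the unit-root panel, which is the discriminating behavior the test relies on.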
By:  Neil Shephard; Torben G. Andersen 
Abstract:  In this paper we review the history and recent developments of stochastic volatility, which is the main way financial economists and mathematical finance specialists model time-varying volatility. 
JEL:  C01 C14 C32 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:oxf:wpaper:389&r=ets 
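The canonical discrete-time stochastic volatility model covered by such surveys can be simulated in a few lines; the parameter values below are illustrative choices, not taken from the paper.

```python
import numpy as np

def simulate_sv(n, mu=-1.0, phi=0.95, sigma_eta=0.2, seed=0):
    """Simulate the basic discrete-time stochastic volatility model:
        r_t = exp(h_t / 2) * eps_t
        h_t = mu + phi * (h_{t-1} - mu) + eta_t
    with eps_t ~ N(0, 1) and eta_t ~ N(0, sigma_eta^2) independent."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)
    # start log-volatility from its stationary distribution
    h[0] = mu + sigma_eta / np.sqrt(1 - phi ** 2) * rng.standard_normal()
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    r = np.exp(h / 2) * rng.standard_normal(n)
    return r, h

r, h = simulate_sv(5000)
```

The simulated returns display the two stylized facts that motivate the model: fat tails (kurtosis above 3) and volatility clustering (positively autocorrelated squared returns).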
By:  Ting Qin; Walter Enders (Department of Economics, St. Cloud State University) 
Abstract:  A key feature of Gallant’s Flexible Fourier Form is that the essential characteristics of one or more structural breaks can be captured using a small number of low frequency components from a Fourier approximation. We introduce a variant of the Flexible Fourier Form into the trend function of U.S. real GDP in order to allow for gradual effects of unknown numbers of structural breaks occurring at unknown dates. We find that the Fourier components are significant and that there are multiple breaks in the trend. In addition to the productivity slowdown in the 1970s, our trend also captures a productivity resumption in the late 1990s and a slowdown in the late 1950s. Our cycle corresponds very closely to the NBER chronology. We compare the decomposition from our model with those from a standard unobserved components model, the HP filter, and the Perron and Wada (2005) model. We find that our decomposition has several favorable characteristics over the other models and has very different implications about the recovery from the recent recession. 
Keywords:  Flexible Fourier Form, Smooth Trend Breaks, Fourier Approximation 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:scs:wpaper:0809&r=ets 
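A minimal sketch of the Flexible-Fourier-Form idea: a trend regression augmented with a few low-frequency sin/cos terms tracks a structural break far better than a straight line. The break size, its location, and the number of frequencies are illustrative assumptions.

```python
import numpy as np

def fourier_trend(y, n_freq=2):
    """Fit a linear trend plus the first n_freq low-frequency Fourier
    terms, which proxy for smooth breaks at unknown dates."""
    T = len(y)
    t = np.arange(1, T + 1)
    cols = [np.ones(T), t]
    for k in range(1, n_freq + 1):
        cols += [np.sin(2 * np.pi * k * t / T), np.cos(2 * np.pi * k * t / T)]
    X = np.column_stack(cols)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return X @ b

# Illustrative series: a linear trend with one level shift.
T = 200
t = np.arange(1, T + 1)
y = 0.05 * t + np.where(t > 120, 3.0, 0.0)
trend = fourier_trend(y, n_freq=2)
linear = np.polyval(np.polyfit(t, y, 1), t)
```

Two low-frequency components already absorb most of the break, which is the sense in which the Fourier terms substitute for explicit break dummies at unknown dates.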
By:  Konstantinos Nikolopoulos; Vassilios Assimakopoulos; Nikolaos Bougioukos; Fotios Petropoulos 
Abstract:  The Theta model created a lot of interest in academic circles due to its surprising performance in the M3-Competition. However, this interest was not followed by a large number of studies, with the exception of Hyndman and Billah in 2003. The present study discusses the advances in the model that have been made in the last five years and attempts to provide further insights into the research question: “Is the Theta model just a special case of Simple Exponential Smoothing with drift (SESd)?” If we do not use equally weighted extrapolations of two specific Theta Lines, L(T=0) and L(T=2), in the Theta model, then we end up with a far more generic model than Simple Exponential Smoothing. The paper also examines the potential of an optimization version of SESd so as to test the results of Hyndman and Billah. In contrast to their research results, the Theta model outperforms SESd in the Quarterly-M3 and Other-M3 subsets by 0.30% and 0.36% respectively, when the Symmetric Mean Absolute Percentage Error is used to measure accuracy. 
Keywords:  Decomposition, Extrapolation, Theta model, Exponential Smoothing, M3-Competition. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:uop:wpaper:0023&r=ets 
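The classical Theta forecast can be sketched as follows: equal weights on the linear-trend line L(T=0) and on L(T=2) = 2y - L(T=0), with the latter extrapolated by simple exponential smoothing. The smoothing constant is an illustrative choice. On a noiseless linear series the forecast steps up by half the trend slope each period, which is the SES-with-drift connection the paper revisits.

```python
import numpy as np

def ses(y, alpha):
    """Simple exponential smoothing; returns the final smoothed level,
    which is also the flat one-step-ahead SES forecast."""
    level = y[0]
    for v in y[1:]:
        level = alpha * v + (1 - alpha) * level
    return level

def theta_forecast(y, h, alpha=0.5):
    """Classical Theta forecast: average of the extrapolated linear
    trend L(T=0) and the SES forecast of L(T=2) = 2*y - L(T=0)."""
    n = len(y)
    t = np.arange(n)
    b, a = np.polyfit(t, y, 1)            # L(T=0): fitted linear trend
    line0 = a + b * (n + np.arange(h))    # trend extrapolation
    line2 = 2 * y - (a + b * t)           # L(T=2): doubled curvature
    return 0.5 * (line0 + ses(line2, alpha))

# On a noiseless linear series the forecast increments equal b/2.
y = 2.0 + 0.5 * np.arange(50)
fc = theta_forecast(y, 4)
```

Replacing the equal weights or the two specific theta lines yields the more generic family the abstract alludes to.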
By:  Claude Lopez 
Abstract:  This paper proposes a version of the DF-GLS test that incorporates up to two breaks in the intercept, namely the DF-GLS-TB test. While the asymptotic properties of the DF-GLS test remain valid, the presence of changes in the intercept has an impact on the small sample properties of the test. Hence, finite sample critical values for the DF-GLS-TB test are tabulated, while a Monte Carlo study highlights its enhanced power. 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:cin:ucecwp:200801&r=ets 
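A sketch of the underlying DF-GLS statistic without breaks or lag augmentation (the paper's variant adds intercept-break terms to the deterministic component, which this sketch omits); cbar = -13.5 is the standard Elliott-Rothenberg-Stock choice for the linear-trend case.

```python
import numpy as np

def dfgls_stat(y, cbar=-13.5):
    """DF-GLS t-statistic: GLS-detrend y under the local alternative
    alpha = 1 + cbar/T, then run a plain Dickey-Fuller regression
    (no lags, for clarity) on the detrended series."""
    T = len(y)
    a = 1 + cbar / T
    z = np.column_stack([np.ones(T), np.arange(1.0, T + 1)])  # const, trend
    ya = np.concatenate([[y[0]], y[1:] - a * y[:-1]])  # quasi-differenced y
    za = np.vstack([z[0], z[1:] - a * z[:-1]])         # quasi-differenced z
    b, *_ = np.linalg.lstsq(za, ya, rcond=None)
    yd = y - z @ b                                     # GLS-detrended series
    dy, x = np.diff(yd), yd[:-1]
    rho = (x @ dy) / (x @ x)
    s2 = np.sum((dy - rho * x) ** 2) / (len(dy) - 1)
    return rho / np.sqrt(s2 / (x @ x))

# Illustrative check on a stationary AR(1): the statistic is far below
# conventional critical values, so a unit root is rejected.
rng = np.random.default_rng(3)
e = rng.standard_normal(300)
y_ar = np.empty(300)
y_ar[0] = e[0]
for s in range(1, 300):
    y_ar[s] = 0.5 * y_ar[s - 1] + e[s]
stat = dfgls_stat(y_ar)
```

For a random walk built from the same innovations the statistic stays near zero, illustrating the size/power contrast the paper's finite-sample critical values calibrate.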