
on Econometrics 
By:  Norman Swanson (Rutgers University); Valentina Corradi (Queen Mary, University of London) 
Abstract:  Our objectives in this paper are twofold. First, we introduce block bootstrap techniques that are (first order) valid in recursive estimation frameworks. Thereafter, we present two examples where predictive accuracy tests are made operational using our new bootstrap procedures. In one application, we outline a consistent test for out-of-sample nonlinear Granger causality, and in the other we outline a test for selecting amongst multiple alternative forecasting models, all of which are possibly misspecified. More specifically, our examples extend the White (2000) reality check to the case of non-vanishing parameter estimation error, and extend the integrated conditional moment tests of Bierens (1982, 1990) and Bierens and Ploberger (1997) to the case of out-of-sample prediction. In both examples, appropriate recentering of the bootstrap score is required in order to ensure that the tests have asymptotically correct size, and the need for such recentering is shown to arise quite naturally when testing hypotheses of predictive accuracy. In a Monte Carlo investigation, we compare the finite sample properties of our block bootstrap procedures with the parametric bootstrap due to Kilian (1999), all within the context of various encompassing and predictive accuracy tests. An empirical illustration is also discussed, in which it is found that unemployment appears to have nonlinear marginal predictive content for inflation. 
Keywords:  block bootstrap, nonlinear causality, parameter estimation error, reality check, recursive estimation scheme 
JEL:  C22 C51 
Date:  2006–09–22 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200618&r=ecm 
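The moving (overlapping) block bootstrap that underlies procedures of this kind can be sketched as follows. This is a generic illustration of block resampling for dependent data, not the authors' recursive-scheme variant, which additionally requires recentering of the bootstrap score:

```python
import random

def block_bootstrap(series, block_len, rng=None):
    """Resample a time series with the overlapping (moving) block bootstrap.

    Blocks of consecutive observations are drawn with replacement and
    concatenated until the resample matches the original length, which
    preserves short-range dependence within each block.
    """
    rng = rng or random.Random()
    n = len(series)
    starts = list(range(n - block_len + 1))
    out = []
    while len(out) < n:
        s = rng.choice(starts)
        out.extend(series[s:s + block_len])
    return out[:n]

rng = random.Random(42)
data = [rng.gauss(0, 1) for _ in range(100)]
resample = block_bootstrap(data, block_len=5, rng=rng)
```

The block length trades off bias (short blocks break dependence) against variance (long blocks reduce the number of distinct blocks).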
By:  Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Geetesh Bhardwaj (Rutgers University) 
Abstract:  This paper makes two contributions. First, we outline a simple simulation-based framework for constructing conditional distributions for multifactor and multidimensional diffusion processes, for the case where the functional form of the conditional density is unknown. The distributions can be used, for example, to form conditional confidence intervals for time period t + τ, say, given information up to period t. Second, we use the simulation-based approach to construct a test for the correct specification of a diffusion process. The suggested test is in the spirit of the conditional Kolmogorov test of Andrews (1997). However, in the present context the null conditional distribution is unknown and is replaced by its simulated counterpart. The limiting distribution of the test statistic is not nuisance parameter free. In light of this, asymptotically valid critical values are obtained via appropriate use of the block bootstrap. The suggested test has power against a larger class of alternatives than tests that are constructed using marginal distributions/densities, such as those in Aït-Sahalia (1996) and Corradi and Swanson (2005). The findings of a small Monte Carlo experiment underscore the good finite sample properties of the proposed test, and an empirical illustration underscores the ease with which the proposed simulation and testing methodology can be applied. 
Keywords:  block bootstrap, diffusion processes, parameter estimation error, simulated GMM, stochastic volatility 
JEL:  C22 C51 
Date:  2006–09–22 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200614&r=ecm 
By:  George Kapetanios (Queen Mary, University of London); Zacharias Psaradakis (Birkbeck, University of London) 
Abstract:  This paper considers the problem of statistical inference in linear regression models whose stochastic regressors and errors may exhibit long-range dependence. A time-domain sieve-type generalized least squares (GLS) procedure is proposed based on an autoregressive approximation to the generating mechanism of the errors. The asymptotic properties of the sieve-type GLS estimator are established. A Monte Carlo study examines the finite-sample properties of the method for testing regression hypotheses. 
Keywords:  Autoregressive approximation, Generalized least squares, Linear regression, Long-range dependence, Spectral density 
JEL:  C12 C13 C22 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp587&r=ecm 
By:  Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Walter Distaso (Imperial College) 
Abstract:  In recent years, numerous volatility-based derivative products have been engineered. This has led to interest in constructing conditional predictive densities and confidence intervals for integrated volatility. In this paper, we propose nonparametric kernel estimators of the aforementioned quantities. The kernel functions used in our analysis are based on different realized volatility measures, which are constructed using the ex post variation of asset prices. A set of sufficient conditions under which the estimators are asymptotically equivalent to their unfeasible counterparts, based on the unobservable volatility process, is provided. Asymptotic normality is also established. The efficacy of the estimators is examined via Monte Carlo experimentation, and an empirical illustration based upon data from the New York Stock Exchange is provided. 
Keywords:  conditional confidence intervals, Diffusions, integrated volatility, kernels, microstructure noise, realized volatility measures 
JEL:  C14 C22 C53 
Date:  2006–09–22 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200616&r=ecm 
By:  Norman Swanson (Rutgers University); Nii Ayi Armah (Rutgers University) 
Abstract:  In this chapter we discuss model selection and predictive accuracy tests in the context of parameter and model uncertainty under recursive and rolling estimation schemes. We begin by summarizing some recent theoretical findings, with particular emphasis on the construction of valid bootstrap procedures for calculating the impact of parameter estimation error on the class of test statistics with limiting distributions that are functionals of Gaussian processes with covariance kernels that are dependent upon parameter and model uncertainty. We then provide an example of a particular test which falls in this class. Namely, we outline the so-called Corradi and Swanson (CS: 2002) test of (non)linear out-of-sample Granger causality. Thereafter, we carry out a series of Monte Carlo experiments examining the properties of the CS test and a variety of other related predictive accuracy and model selection tests. Finally, we present the results of an empirical investigation of the marginal predictive content of money for income, in the spirit of Stock and Watson (1989), Swanson (1998), Amato and Swanson (2001), and the references cited therein. We find that there is evidence of predictive causation when in-sample estimation periods are ended any time during the 1980s, but less evidence during the 1970s. Furthermore, recursive estimation windows yield better prediction models when prediction periods begin in the 1980s, while rolling estimation windows yield better models when prediction periods begin during the 1970s and 1990s. Interestingly, these two results can be combined into a coherent picture of what is driving our empirical results. Namely, when recursive estimation windows yield lower overall predictive MSEs, then bigger prediction models that include money are preferred, while smaller models without money are preferred when rolling models yield the lowest MSE predictors. 
Keywords:  block bootstrap, forecasting, nonlinear causality, recursive estimation scheme, rolling estimation scheme, model misspecification 
JEL:  C22 C51 
Date:  2006–09–22 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200619&r=ecm 
By:  Fabio Busetti (Bank of Italy); Andrew Harvey (Cambridge University) 
Abstract:  The paper examines various tests for assessing whether a time series model requires a slope component. We first consider the simple t-test on the mean of first differences and show that it achieves high power against the alternative hypothesis of a stochastic nonstationary slope as well as against a purely deterministic slope. The test may be modified, parametrically or nonparametrically, to deal with serial correlation. Using both local limiting power arguments and finite sample Monte Carlo results, we compare the t-test with the nonparametric tests of Vogelsang (1998) and with a modified stationarity test. Overall the t-test seems a good choice, particularly if it is implemented by fitting a parametric model to the data. When standardized by the square root of the sample size, the simple t-statistic, with no correction for serial correlation, has a limiting distribution if the slope is stochastic. We investigate whether it is a viable test for the null hypothesis of a stochastic slope and conclude that its value may be limited by an inability to reject a small deterministic slope. Empirical illustrations are provided using series of relative prices in the euro area and data on global temperature. 
Keywords:  Cramér-von Mises distribution, stationarity test, stochastic trend, unit root, unobserved component. 
JEL:  C22 C52 
URL:  http://d.repec.org/n?u=RePEc:bdi:wptemi:td_614_07&r=ecm 
By:  Valentina Corradi (Queen Mary, University of London); Norman Swanson (Rutgers University); Walter Distaso (Imperial College) 
Abstract:  The main objective of this paper is to propose a feasible, model-free estimator of the predictive density of integrated volatility. In this sense, we extend recent papers by Andersen, Bollerslev, Diebold and Labys (2003), and by Andersen, Bollerslev and Meddahi (2004, 2005), who address the issue of pointwise prediction of volatility via ARMA models, based on the use of realized volatility. Our approach is to use a realized volatility measure to construct a nonparametric (kernel) estimator of the predictive density of daily volatility. We show that, by choosing an appropriate realized measure, one can achieve consistent estimation, even in the presence of jumps and microstructure noise in prices. More precisely, we establish that four well-known realized measures, i.e. realized volatility, bipower variation, and two measures robust to microstructure noise, satisfy the conditions required for the uniform consistency of our estimator. Furthermore, we outline an alternative simulation-based approach to predictive density construction. Finally, we carry out a simulation experiment in order to assess the accuracy of our estimators, and provide an empirical illustration that underscores the importance of using microstructure-robust measures when using high-frequency data. 
Keywords:  Diffusions, integrated volatility, kernels, microstructure noise, realized volatility measures 
JEL:  C14 C22 C53 
Date:  2006–10–02 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200620&r=ecm 
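The kernel idea can be illustrated with a Nadaraya-Watson-style conditional density estimator. This minimal sketch uses a Gaussian kernel and hypothetical bandwidths `hx` and `hy`, and abstracts entirely from the realized-measure construction and asymptotics that are the paper's actual contribution:

```python
import math

def gauss_kernel(u):
    """Standard Gaussian kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def cond_density(x, y, X, Y, hx, hy):
    """Nadaraya-Watson-style estimate of the conditional density f(y | x)
    from pairs (X[t], Y[t]); in this setting X would be a current realized
    volatility measure and Y the next-period value."""
    num = sum(gauss_kernel((x - xi) / hx) * gauss_kernel((y - yi) / hy)
              for xi, yi in zip(X, Y))
    den = sum(gauss_kernel((x - xi) / hx) for xi in X)
    return num / (hy * den) if den > 0 else 0.0

X = [0.1 * i for i in range(20)]          # hypothetical conditioning series
Y = [0.1 * i + 0.05 for i in range(20)]   # hypothetical next-period values
f_at = cond_density(1.0, 1.0, X, Y, hx=0.3, hy=0.3)
```

For each fixed x the estimate integrates to one over y, so it is a proper density.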
By:  Kakamu, Kazuhiko (Graduate School of Economics, Osaka University, Osaka, Japan); Polasek, Wolfgang (Department of Economics and Finance, Institute for Advanced Studies, Vienna, Austria) 
Abstract:  We suggest a new class of cross-sectional space-time models based on local AR models and nearest neighbors using distances between observations. For the estimation we use a tightness prior for prediction of regional GDP forecasts. We extend the model to include exogenous variables and hierarchical priors. The approaches are demonstrated for a dynamic panel model for regional data in Central Europe. Finally, we find that an AR-NN(1,3) model with travel time data is best selected by marginal likelihood, and that the spatial correlation is usually stronger than the time correlation. 
Keywords:  Dynamic panel data, hierarchical models, marginal likelihoods, nearest neighbors, tightness prior, spatial econometrics 
JEL:  C11 C15 C21 R11 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:ihs:ihsesp:203&r=ecm 
By:  Yasuhiro Omori (Faculty of Economics, University of Tokyo) 
Abstract:  We consider Bayesian estimation of a sample selection model and propose a highly efficient Gibbs sampler using the additional scale transformation step to speed up the convergence to the posterior distribution. Numerical examples are given to show the efficiency of our proposed sampler. 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2007cf481&r=ecm 
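The mechanics of Gibbs sampling — drawing each block in turn from its full conditional — can be sketched on a toy target. This bivariate-normal example is purely illustrative and is not the paper's sample-selection sampler or its scale-transformation step:

```python
import random

def gibbs_bivariate_normal(rho, n_draws, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional is univariate normal: x | y ~ N(rho*y, 1 - rho^2),
    and symmetrically for y | x, so the chain alternates two scalar draws.
    """
    rng = random.Random(seed)
    sd = (1 - rho * rho) ** 0.5
    x, y = 0.0, 0.0
    draws = []
    for _ in range(n_draws):
        x = rng.gauss(rho * y, sd)
        y = rng.gauss(rho * x, sd)
        draws.append((x, y))
    return draws

draws = gibbs_bivariate_normal(rho=0.8, n_draws=20000)
```

The higher the correlation between blocks, the slower the chain mixes — which is exactly the problem that reparameterization steps such as the paper's scale transformation are designed to alleviate.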
By:  Jan R. Magnus (Department of Econometrics & OR, Tilburg University) 
Abstract:  We present an analytical closedform expression for the asymptotic variance matrix in the misspecified multivariate regression model. 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:tky:fseres:2007cf479&r=ecm 
By:  Giuliano De Rossi; Andrew Harvey 
Abstract:  A time-varying quantile can be fitted to a sequence of observations by formulating a time series model for the corresponding population quantile and iteratively applying a suitably modified state space signal extraction algorithm. It is shown that such time-varying quantiles satisfy the defining property of fixed quantiles in having the appropriate number of observations above and below. Expectiles are similar to quantiles except that they are defined by tail expectations. Like quantiles, time-varying expectiles can be estimated by a state space signal extraction algorithm and they satisfy properties that generalize the moment conditions associated with fixed expectiles. Time-varying quantiles and expectiles provide information on various aspects of a time series, such as dispersion and asymmetry, while estimates at the end of the series provide the basis for forecasting. Because the state space form can handle irregularly spaced observations, the proposed algorithms can be easily adapted to provide a viable means of computing spline-based nonparametric quantile and expectile regressions. 
Keywords:  Asymmetric least squares; cubic splines; dispersion; nonparametric regression; quantile regression; signal extraction; state space smoother. 
JEL:  C14 C22 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:0702&r=ecm 
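For a fixed (constant) expectile, the asymmetric least squares defining condition can be solved by a short fixed-point iteration. This sketch covers only the constant case and says nothing about the state space smoother the paper actually develops:

```python
def expectile(ys, tau, tol=1e-10, max_iter=200):
    """Fixed tau-expectile by iteratively reweighted least squares.

    Each observation gets weight tau if it lies above the current estimate
    and 1 - tau otherwise; the updated estimate is the weighted mean.
    At tau = 0.5 the expectile is the ordinary mean.
    """
    mu = sum(ys) / len(ys)
    for _ in range(max_iter):
        w = [tau if y > mu else 1 - tau for y in ys]
        new = sum(wi * y for wi, y in zip(w, ys)) / sum(w)
        if abs(new - mu) < tol:
            return new
        mu = new
    return mu

mu_90 = expectile([1.0, 2.0, 3.0, 4.0, 10.0], 0.9)
```

Expectiles are increasing in tau, which the paper exploits for dispersion and asymmetry measures.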
By:  George Kapetanios (Queen Mary, University of London); Andrew P. Blake (Bank of England) 
Abstract:  This paper develops theoretical results for the estimation of radial basis function neural network specifications, for dependent data, that do not require iterative estimation techniques. The results make use of the properties of regression-based boosting algorithms. Both consistency and rate results are derived. An application to nonparametric specification testing illustrates the usefulness of the results. 
Keywords:  Neural Networks, Boosting 
JEL:  C12 C13 C22 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp588&r=ecm 
By:  John C. Frain (Department of Economics, Trinity College) 
Abstract:  This paper is a Monte Carlo study of the small sample power of six tests of a normality hypothesis when the alternative is an alpha-stable distribution with parameter values similar to those estimated for monthly total returns on equity indices. In these circumstances a sample size of 200 is required to detect departures from normality. In most cases only small samples of consistent monthly data on such total returns are available, and these are not sufficient to differentiate between normal and alpha-stable distributions. 
JEL:  C12 C16 C46 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:tcd:tcduee:tep0207&r=ecm 
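One standard moment-based normality statistic of the kind such power studies examine is the Jarque-Bera statistic; whether it is among the paper's six tests is not stated here, so this is a generic illustration:

```python
import math

def jarque_bera(xs):
    """Jarque-Bera normality statistic: (n/6) * (S^2 + (K - 3)^2 / 4),
    where S is sample skewness and K sample kurtosis; asymptotically
    chi-squared with 2 degrees of freedom under normality."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    s = m3 / m2 ** 1.5          # skewness
    k = m4 / m2 ** 2            # kurtosis
    return n / 6.0 * (s * s + (k - 3.0) ** 2 / 4.0)
```

Heavy-tailed alpha-stable draws inflate the kurtosis term, but in small samples the statistic often stays below the chi-squared critical value — the low power the paper documents.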
By:  Fabio Busetti; Andrew Harvey 
Abstract:  Quantiles provide a comprehensive description of the properties of a variable, and tracking changes in quantiles over time using signal extraction methods can be informative. It is shown here how stationarity tests can be generalized to test the null hypothesis that a particular quantile is constant over time by using weighted indicators. Corresponding tests based on expectiles are also proposed; these might be expected to be more powerful for distributions that are not heavy-tailed. Tests for changing dispersion and asymmetry may be based on contrasts between particular quantiles or expectiles. We report Monte Carlo experiments investigating the effectiveness of the proposed tests and then move on to consider how to test for relative time invariance, based on residuals from fitting a time-varying level or trend. Empirical examples, using stock returns and U.S. inflation, provide an indication of the practical importance of the tests. 
Keywords:  Dispersion; expectiles; quantiles; skewness; stationarity tests; stochastic volatility; value at risk. 
JEL:  C12 C22 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:0701&r=ecm 
By:  Adrian Pagan; M. Hashem Pesaran 
Abstract:  This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah (1989), and shows that structural equations for which there are known permanent shocks must have no error correction terms present in them, thereby freeing up the latter to be used as instruments in estimating their parameters. The proposed approach is illustrated by a re-examination of the identification scheme used in a monetary model by Wickens and Motta (2001), and in a well known paper by Gali (1992) which deals with the construction of an IS-LM model with supply-side effects. We show that the latter imposes more short-run restrictions than are needed because of a failure to fully utilize the cointegration information. 
Keywords:  Permanent shocks, structural identification, error correction models, IS-LM models. 
JEL:  C30 C32 E10 
Date:  2007–01 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:0704&r=ecm 
By:  Elliott, Graham; Timmermann, Allan G 
Abstract:  Forecasts guide decisions in all areas of economics and finance and their value can only be understood in relation to, and in the context of, such decisions. We discuss the central role of the loss function in helping determine the forecaster's objectives and use this to present a unified framework for both the construction and evaluation of forecasts. Challenges arise from the explosion in the sheer volume of predictor variables under consideration and the forecaster's ability to entertain an endless array of functional forms and time-varying specifications, none of which may coincide with the `true' model. Methods for comparing the forecasting performance of pairs of models or evaluating the ability of the best of many models to beat a benchmark specification are also reviewed. 
Keywords:  economic forecasting; forecast evaluation; loss function 
JEL:  C53 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:6158&r=ecm 
By:  Heejoon Kang (Department of Business Economics and Public Policy, Indiana University Kelley School of Business) 
Abstract:  The empirical literature is abundant with detrended cointegration, where cointegration relationships are tested and estimated with deterministic trend terms. Cointegration is, however, critically dependent on whether the time series are detrended or not. A series of Monte Carlo experiments shows that inappropriately detrended time series tend to exhibit spurious cointegration: even when the true series are known not to be cointegrated, inappropriately detrended series tend to appear cointegrated. Foreign exchange rates are analyzed to demonstrate the relevance and importance of inappropriate detrending in cointegration analysis. 
Keywords:  Deterministic trend, Foreign exchange rates, Monte Carlo study, Stochastic trend 
JEL:  C22 C15 E31 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:iuk:wpaper:200614&r=ecm 
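The detrending step at issue is ordinary least squares regression on a constant and a linear time trend. A sketch of generating a pure random walk and detrending it — illustrative only, since the paper's experiments then apply cointegration tests to such series:

```python
import random

def random_walk(n, rng):
    """Cumulative sum of standard normal increments (a unit-root process)."""
    x, path = 0.0, []
    for _ in range(n):
        x += rng.gauss(0, 1)
        path.append(x)
    return path

def ols_detrend(y):
    """Residuals from regressing y on a constant and a linear time trend."""
    n = len(y)
    t = list(range(n))
    tbar = sum(t) / n
    ybar = sum(y) / n
    beta = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) \
           / sum((ti - tbar) ** 2 for ti in t)
    alpha = ybar - beta * tbar
    return [yi - alpha - beta * ti for ti, yi in zip(t, y)]

rng = random.Random(1)
resid = ols_detrend(random_walk(200, rng))
```

The residuals still contain a stochastic trend — detrending removes the fitted line, not the unit root, which is what makes subsequent cointegration tests unreliable.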
By:  Nicholas Longford 
Abstract:  We compare a set of empirical Bayes and composite estimators of the population means of the districts (small areas) of a country, and show that the natural modelling strategy of searching for a well-fitting empirical Bayes model and using it for estimation of the area-level means can be inefficient. 
Keywords:  Composite estimator, empirical Bayes models, mean squared error, small-area estimation 
JEL:  C15 
Date:  2006–11 
URL:  http://d.repec.org/n?u=RePEc:upf:upfgen:995&r=ecm 
By:  Eric Rasmusen (Department of Business Economics and Public Policy, Indiana University Kelley School of Business) 
Abstract:  This is an exposition of the BLP method of structural demand estimation using the random-coefficients logit model. 
JEL:  L0 
Date:  2006–03 
URL:  http://d.repec.org/n?u=RePEc:iuk:wpaper:200604&r=ecm 
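The core of the random-coefficients logit is averaging plain logit choice probabilities over simulated consumer taste draws. This sketch shows only that inner share computation, with hypothetical mean utilities `deltas`, prices, and taste dispersion `sigma` — not the BLP contraction mapping or GMM step:

```python
import math
import random

def logit_shares(deltas):
    """Plain logit choice probabilities over J products plus an outside
    good whose utility is normalized to zero."""
    expd = [math.exp(d) for d in deltas]
    denom = 1.0 + sum(expd)
    return [e / denom for e in expd]

def rc_logit_shares(deltas, prices, sigma, n_sim=2000, seed=0):
    """Random-coefficients logit shares: each simulated consumer draws an
    idiosyncratic price coefficient, and shares are averaged over consumers
    (Monte Carlo integration over the taste distribution)."""
    rng = random.Random(seed)
    J = len(deltas)
    acc = [0.0] * J
    for _ in range(n_sim):
        v = rng.gauss(0, 1)
        s = logit_shares([d + sigma * v * p for d, p in zip(deltas, prices)])
        for j in range(J):
            acc[j] += s[j]
    return [a / n_sim for a in acc]
```

Averaging over heterogeneous consumers breaks the restrictive substitution patterns (independence of irrelevant alternatives) of the plain logit.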
By:  Andrea Carriero (Queen Mary, University of London); Massimiliano Marcellino (IEP-Bocconi University, IGIER and CEPR) 
Abstract:  In this paper we provide an overview of recent developments in the methodology for the construction of composite coincident and leading indexes, and apply them to the UK. In particular, we evaluate the relative merits of factor-based models and Markov switching specifications for the construction of coincident and leading indexes. For the leading indexes we also evaluate the performance of probit models and pooling. The results indicate that alternative methods produce similar coincident indexes, while there are more marked differences in the leading indexes. 
Keywords:  Forecasting, Business cycles, Leading indicators, Coincident indicators, Turning points 
JEL:  E32 E37 C53 
Date:  2007–03 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp590&r=ecm 
By:  Razzak, Weshah 
Abstract:  I discuss econometric issues of high relevance to economists in central banks whose job is to interpret the permanency of shocks and provide policy advice to policymakers. Trend, unit root, and persistence are difficult to interpret. There are numerous econometric tests, which vary in their power and usefulness. I provide a set of strategies for dealing with macro time series. 
Keywords:  Unit root; trend; persistence; cointegration 
JEL:  F4 C22 C13 
Date:  2003 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:1970&r=ecm 
By:  López, Fernando; Chasco, Coro 
Abstract:  The purpose of this article is to analyze whether spatial dependence is a synchronic effect in the first-order spatial autoregressive model, SAR(1). In many socioeconomic phenomena, spatial dependence can be not only contemporaneous but also time-lagged. In this paper, we use three Moran-based space-time autocorrelation statistics to evaluate the simultaneity of this spatial effect. A simulation study sheds some light upon these issues, demonstrating the capacity of these tests to identify the structure of spatial dependence (contemporaneous only, time-lagged only, or both) in most cases. 
Keywords:  Space-time dependence; Spatial autoregressive models; Moran's I 
JEL:  C15 C51 C21 
Date:  2007–03–03 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:1985&r=ecm 
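Moran's I itself, the building block of these space-time statistics, is straightforward to compute. A minimal sketch on a four-region path with binary contiguity weights; the paper's statistics extend this with time-lagged cross-products, which are not shown:

```python
def morans_i(values, W):
    """Moran's I spatial autocorrelation statistic:
    I = (n / S0) * (z' W z) / (z' z), where z holds mean-deviated values,
    W is the spatial weights matrix, and S0 is the sum of all weights."""
    n = len(values)
    mean = sum(values) / n
    z = [v - mean for v in values]
    s0 = sum(sum(row) for row in W)
    num = sum(W[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
    den = sum(zi * zi for zi in z)
    return (n / s0) * (num / den)

# Path graph 0-1-2-3 with symmetric binary contiguity weights
W = [[0, 1, 0, 0],
     [1, 0, 1, 0],
     [0, 1, 0, 1],
     [0, 0, 1, 0]]
I_clustered = morans_i([1.0, 1.0, -1.0, -1.0], W)   # similar neighbors
```

Clustered values give positive I, alternating values give negative I.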
By:  Sumon Kumar Bhaumik (Brunel University); Ira N. Gang (Rutgers University); MyeongSu Yun (Tulane University) 
Abstract:  This paper decomposes differences in poverty incidence (head count ratio) using estimates from a regression equation, synthesizing the approaches proposed in World Bank (2003) and Yun (2004). A significance test is developed for characteristics and coefficients effects when decomposing differences in poverty incidence. The proposed method is implemented for studying differences in poverty incidence between Serbians and Albanians in Kosovo using the Living Standard Measurement Survey. 
Keywords:  poverty incidence, decomposition, headcount, probit 
JEL:  C20 I30 
Date:  2006–12–01 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200633&r=ecm 
By:  Figueiredo, Annibal; Gleria, Iram; Matsushita, Raul; Da Silva, Sergio 
Abstract:  This paper puts forward a technique based on the characteristic function to tackle the problem of the sum of stochastic variables. We consider independent processes whose reduced variables are identically distributed, including those that violate the conditions for the central limit theorem to hold. We also consider processes that are correlated and analyze the role of nonlinear autocorrelations in their convergence to a Gaussian. We demonstrate that nonidentity in independent processes is related to autocorrelations in nonindependent processes. We exemplify our approach with data from foreign exchange rates. 
Keywords:  econophysics; central limit theorem; characteristic function; reduced variables; autocorrelation 
JEL:  C1 
Date:  2006 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:1984&r=ecm 
By:  Nicholas Longford; D. B. Rubin 
Abstract:  We formulate performance assessment as a problem of causal analysis and outline an approach based on the missing data principle for its solution. It is particularly relevant in the context of so-called league tables for educational, healthcare and other public-service institutions. The proposed solution avoids comparisons of institutions that have substantially different clientele (intake). 
Keywords:  Caliper matching, causal analysis, multiple imputation, nonignorable assignment, performance indicators, potential outcomes 
JEL:  C14 
Date:  2006–11 
URL:  http://d.repec.org/n?u=RePEc:upf:upfgen:994&r=ecm 
By:  Gernot Doppelhofer; Jesus Crespo Cuaresma 
Abstract:  We propose a Bayesian Averaging of Thresholds (BAT) approach for assessing the existence and quantifying the effect of threshold effects in cross-country growth regressions in the presence of model uncertainty. The BAT method extends the Bayesian Averaging of Classical Estimates (BACE) approach proposed by Sala-i-Martin, Doppelhofer, and Miller (2004) by allowing for uncertainty over nonlinear threshold effects. We apply our method to a set of determinants of long-term economic growth in a cross-section of 88 countries. Our results suggest that when model uncertainty is taken into account there is no evidence for robust threshold effects caused by Initial Income, measured by GDP per capita in 1960, but that the Number of Years an Economy Has Been Open is an important source of nonlinear effects on growth. 
Keywords:  Model Uncertainty, Model Averaging, Threshold Estimation, Nonlinearities, Determinants of Economic Growth 
JEL:  C11 C15 O20 O50 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:cam:camdae:0706&r=ecm 
By:  Urzúa, Carlos M. (Tecnológico de Monterrey, Campus Ciudad de México) 
Abstract:  Vector autoregressive models are often used in Macroeconomics to draw conclusions about the effects of policy innovations. However, those results depend on the researcher’s priors about the particular ordering of the variables. As an alternative, this paper presents a simple rule based on the Maximum Entropy principle that can be used to find the “most likely” ordering. The proposal is illustrated in the case of a VAR model of the U.S. economy. It is found that monetary policy shocks are better represented by innovations in the federal funds rate rather than by innovations in nonborrowed reserves. 
Keywords:  VAR, impulse-response functions, varimin, maximum entropy, monetary policy shocks 
JEL:  C32 C51 E52 
Date:  2007–02 
URL:  http://d.repec.org/n?u=RePEc:ega:docume:200703&r=ecm 
By:  Bruce Mizrach (Rutgers University) 
Abstract:  This paper examines a variety of methods for extracting implied probability distributions from option prices and the underlying. The paper first explores nonparametric procedures for reconstructing densities directly from options market data. I then consider local volatility functions, both through implied volatility trees and volatility interpolation. I then turn to alternative specifications of the stochastic process for the underlying. I estimate a mixture-of-lognormals model, apply it to exchange rate data, and illustrate how to conduct forecast comparisons. I finally turn to the estimation of jump risk by extracting bipower variation. 
Keywords:  options, implied probability densities, volatility smile, jump risk, bipower variation 
JEL:  G12 G14 F31 
Date:  2007–01–19 
URL:  http://d.repec.org/n?u=RePEc:rut:rutres:200702&r=ecm 
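A mixture-of-lognormals density is just a weight-averaged sum of lognormal component densities. A minimal sketch with hypothetical weights and parameters — the paper's contribution is fitting such mixtures to option prices, which is not shown:

```python
import math

def lognorm_pdf(x, mu, sigma):
    """Density of a lognormal with log-mean mu and log-standard-deviation sigma."""
    if x <= 0:
        return 0.0
    return math.exp(-((math.log(x) - mu) ** 2) / (2 * sigma ** 2)) \
           / (x * sigma * math.sqrt(2 * math.pi))

def mixture_pdf(x, weights, mus, sigmas):
    """Mixture-of-lognormals density: weighted sum of component densities;
    the weights are assumed nonnegative and to sum to one."""
    return sum(w * lognorm_pdf(x, m, s)
               for w, m, s in zip(weights, mus, sigmas))

value = mixture_pdf(1.5, [0.6, 0.4], [0.0, 0.5], [0.2, 0.4])
```

With two or three components the mixture can reproduce the skew and fat tails (the "volatility smile") that a single lognormal cannot.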
By:  Everts, Martin 
Abstract:  In the following article the ideal band-pass filter is derived and explained in order to subsequently analyze the approximations by Baxter and King (1999) and Christiano and Fitzgerald (2003). It can be shown that the filters by Baxter and King and by Christiano and Fitzgerald differ primarily in two assumptions, namely the assumption about the spectral density of the analyzed variables and the assumption about the symmetry of the weights of the band-pass filter. It is then shown that the different assumptions lead to filters whose characteristics differ in three respects: in the accuracy of the approximation with respect to the length of the cycles considered, in the number of data points that can be computed towards the ends of the data series, and in the removal of the trend of the original time series. 
Keywords:  Business Cycle; Band-Pass Filter 
JEL:  C1 E3 
Date:  2006–01 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:2049&r=ecm 
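The Baxter-King construction — truncate the ideal band-pass weights at K leads and lags, then adjust them to sum to zero — can be sketched as follows; the conventional 6-32 quarter business-cycle band with K = 12 is used as a hypothetical illustration:

```python
import math

def baxter_king_weights(p_low, p_high, K):
    """Truncated ideal band-pass weights for cycles with periods between
    p_low and p_high, with the Baxter-King adjustment that forces the
    weights to sum to zero (zero gain at frequency zero, so the filter
    removes a deterministic trend)."""
    w1 = 2 * math.pi / p_high   # lower cutoff frequency
    w2 = 2 * math.pi / p_low    # upper cutoff frequency
    b = [(w2 - w1) / math.pi]   # central weight b_0
    for j in range(1, K + 1):
        b.append((math.sin(w2 * j) - math.sin(w1 * j)) / (math.pi * j))
    # Symmetric two-sided filter: b_K, ..., b_1, b_0, b_1, ..., b_K
    full = b[:0:-1] + b
    adj = sum(full) / len(full)
    return [bi - adj for bi in full]

w = baxter_king_weights(p_low=6, p_high=32, K=12)
```

Because the filter is two-sided with K leads and lags, the first and last K points of a filtered series cannot be computed — one of the three differences from the one-sided Christiano-Fitzgerald approach noted above.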