
on Econometric Time Series 
By:  Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck College); Donald Robertson; Stephen Wright (School of Economics, Mathematics & Statistics, Birkbeck College) 
Abstract:  Any nonstationary series can be decomposed into permanent (or "trend") and transitory (or "cycle") components. Typically some atheoretic prefiltering procedure is applied to extract the permanent component. This paper argues that analysis of the fundamental underlying stationary economic processes should instead be central to this process. We present a new derivation of multivariate Beveridge-Nelson permanent and transitory components, whereby the latter can be derived explicitly as a weighting of observable stationary processes. This allows far clearer economic interpretations. Different assumptions on the fundamental stationary processes result in distinctly different results; but this reflects deep economic uncertainty. We illustrate with an example using Garratt et al.'s (2003a) small VECM model of the UK economy. 
Keywords:  Multivariate Beveridge-Nelson, VECM, Economic Fundamentals, Decomposition. 
JEL:  C1 C32 E0 E32 E37 
Date:  2005–02 
URL:  http://d.repec.org/n?u=RePEc:bbk:bbkefp:0412&r=ets 
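To make the trend/cycle split concrete, here is a minimal univariate sketch of the Beveridge-Nelson decomposition when first differences follow an AR(1) — not the paper's multivariate derivation; the function name and the AR(1) setting are illustrative assumptions.

```python
import numpy as np

def bn_decompose_ar1(y, phi, mu):
    """Univariate Beveridge-Nelson decomposition assuming the first
    differences follow an AR(1): dy_t - mu = phi*(dy_{t-1} - mu) + eps_t.
    The BN trend is the long-horizon forecast net of deterministic drift:
        trend_t = y_t + phi/(1 - phi) * (dy_t - mu),
    and the cycle is the remainder y_t - trend_t."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    trend = y[1:] + (phi / (1.0 - phi)) * (dy - mu)
    cycle = y[1:] - trend
    return trend, cycle

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200))      # a pure random walk
trend, cycle = bn_decompose_ar1(y, phi=0.5, mu=0.0)
```

With phi = 0 the series has no predictable transitory movement, so the trend coincides with the series itself and the cycle is identically zero.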
By:  Anthony Garratt (School of Economics, Mathematics & Statistics, Birkbeck College); Shaun P Vahey 
Abstract:  We characterise the relationships between preliminary and subsequent measurements for 16 commonly-used UK macroeconomic indicators drawn from two existing real-time data sets and a new nominal variable database. Most preliminary measurements are biased predictors of subsequent measurements, with some revision series affected by multiple structural breaks. To illustrate how these findings facilitate real-time forecasting, we use a vector autoregression to generate real-time one-step-ahead probability event forecasts for 1990Q1 to 1999Q2. Ignoring the predictability in initial measurements understates considerably the probability of above trend output growth. 
Keywords:  real-time data, structural breaks, probability event forecasts 
JEL:  C22 C82 E00 
Date:  2005–02 
URL:  http://d.repec.org/n?u=RePEc:bbk:bbkefp:0413&r=ets 
By:  Jean-Marie Dufour 
Abstract:  The technique of Monte Carlo (MC) tests [Dwass (1957), Barnard (1963)] provides an attractive method of building exact tests from statistics whose finite-sample distribution is intractable but can be simulated (provided it does not involve nuisance parameters). We extend this method in two ways: first, by allowing for MC tests based on exchangeable, possibly discrete test statistics; second, by generalizing the method to statistics whose null distributions involve nuisance parameters (maximized MC tests, MMC). Simplified, asymptotically justified versions of the MMC method are also proposed, and it is shown that they provide a simple way of improving standard asymptotics and dealing with nonstandard asymptotics (e.g., unit-root asymptotics). Parametric bootstrap tests may be interpreted as a simplified version of the MMC method (without the general validity properties of the latter). 
Keywords:  Monte Carlo test, maximized Monte Carlo test, finite sample test, exact test, nuisance parameter, bounds, bootstrap, parametric bootstrap, simulated annealing, asymptotics, nonstandard asymptotic distribution 
JEL:  C12 C15 C2 C52 C22 
Date:  2005–02–01 
URL:  http://d.repec.org/n?u=RePEc:cir:cirwor:2005s02&r=ets 
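The basic MC test idea can be sketched in a few lines; this is the nuisance-parameter-free case only (not the MMC extension the paper develops), and the function name and toy statistic are assumptions for illustration.

```python
import numpy as np

def mc_pvalue(stat_obs, simulate_stat, n_rep=99, seed=None):
    """Monte Carlo p-value in the spirit of Dwass (1957) / Barnard (1963):
    with N statistics simulated under the null, the test based on
        p = (1 + #{sims >= observed}) / (N + 1)
    has exact level alpha whenever alpha*(N+1) is an integer and the null
    distribution involves no nuisance parameters."""
    rng = np.random.default_rng(seed)
    sims = np.array([simulate_stat(rng) for _ in range(n_rep)])
    return (1 + np.sum(sims >= stat_obs)) / (n_rep + 1)

# toy use: |t|-type test of H0: mean = 0 for an i.i.d. N(0,1) sample
x = np.random.default_rng(42).normal(0.8, 1.0, size=30)   # data off the null
t_obs = abs(x.mean()) * np.sqrt(len(x))
sim = lambda rng: abs(rng.normal(size=30).mean()) * np.sqrt(30)
p = mc_pvalue(t_obs, sim, n_rep=99, seed=0)
```

The "+1" in numerator and denominator counts the observed statistic among the replications, which is what delivers exactness at finite N rather than only as N grows.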
By:  Marie-Claude Beaulieu; Jean-Marie Dufour; Lynda Khalaf 
Abstract:  In this paper, we propose exact inference procedures for asset pricing models that can be formulated in the framework of a multivariate linear regression (CAPM), allowing for stable error distributions. The normality assumption on the distribution of stock returns is usually rejected in empirical studies, due to excess kurtosis and asymmetry. To model such data, we propose a comprehensive statistical approach which allows for alternative, possibly asymmetric, heavy-tailed distributions without the use of large-sample approximations. The methods suggested are based on Monte Carlo test techniques. Goodness-of-fit tests are formally incorporated to ensure that the error distributions considered are empirically sustainable; from these, exact confidence sets for the unknown tail area and asymmetry parameters of the stable error distribution are derived. Tests for the efficiency of the market portfolio (zero intercepts) which explicitly allow for the presence of (unknown) nuisance parameters in the stable error distribution are derived. The methods proposed are applied to monthly returns on 12 portfolios of the New York Stock Exchange over the period 1926-1995 (5-year subperiods). We find that stable, possibly skewed distributions provide a statistically significant improvement in goodness-of-fit and lead to fewer rejections of the efficiency hypothesis. 
Keywords:  capital asset pricing model; mean-variance efficiency; non-normality; multivariate linear regression; stable distribution; skewness; kurtosis; asymmetry; uniform linear hypothesis; exact test; Monte Carlo test; nuisance parameter; specification test; diagnostics 
JEL:  C3 C12 C33 C15 G1 G12 G14 
Date:  2005–02–01 
URL:  http://d.repec.org/n?u=RePEc:cir:cirwor:2005s03&r=ets 
By:  Angelini, Elena; Henry, Jérôme; Marcellino, Massimiliano 
Abstract:  Existing methods for data interpolation or backdating are either univariate or based on a very limited number of series, due to data and computing constraints that were binding until the recent past. Nowadays large datasets are readily available, and models with hundreds of parameters can be estimated quickly. We model these large datasets with a factor model, and develop an interpolation method that exploits the estimated factors as an efficient summary of all the available information. The method is compared with existing standard approaches from a theoretical point of view, by means of Monte Carlo simulations, and also when applied to actual macroeconomic series. The results indicate that our method is more robust to model misspecification, although traditional multivariate methods also work well while univariate approaches are systematically outperformed. When interpolated series are subsequently used in econometric analyses, biases can emerge, depending on the type of interpolation, but can again be reduced with multivariate approaches, including factor-based ones. 
Keywords:  Factor model; Interpolation; Kalman filter; spline 
JEL:  C32 C43 C82 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4533&r=ets 
By:  Pesavento, Elena; Rossi, Barbara 
Abstract:  Existing methods for constructing confidence bands for multivariate impulse response functions depend on auxiliary assumptions on the order of integration of the variables. Thus, they may have poor coverage at long lead times when variables are highly persistent. Solutions that have been proposed in the literature may be computationally challenging. The goal of this Paper is to propose a simple method for constructing confidence bands for impulse response functions that is not pointwise and that is robust to the presence of highly persistent processes. The method uses alternative approximations based on local-to-unity asymptotic theory and allows the lead time of the impulse response function to be a fixed fraction of the sample size. These devices provide better approximations in small samples. Monte Carlo simulations show that our method tends to have better coverage properties at long horizons than existing methods. We also investigate the properties of the various methods in terms of the length of their confidence bands. Finally, we show, with empirical applications, that our method may provide different economic interpretations of the data. Applications to real GDP and to nominal versus real sources of fluctuations in exchange rates are discussed. 
Keywords:  impulse response functions; local-to-unity asymptotics; persistence; VARs 
JEL:  C12 C32 F40 
Date:  2004–09 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4536&r=ets 
By:  Pesaran, M Hashem; Pettenuzzo, Davide; Timmermann, Allan G 
Abstract:  This Paper provides a novel approach to forecasting time series subject to discrete structural breaks. We propose a Bayesian estimation and prediction procedure that allows for the possibility of new breaks over the forecast horizon, taking account of the size and duration of past breaks (if any) by means of a hierarchical hidden Markov chain model. Predictions are formed by integrating over the hyperparameters from the meta distributions that characterize the stochastic break point process. In an application to US Treasury bill rates, we find that the method leads to better out-of-sample forecasts than alternative methods that ignore breaks, particularly at long horizons. 
Keywords:  Bayesian model averaging; forecasting; hierarchical hidden Markov Chain Model; structural breaks 
JEL:  C11 C15 C53 
Date:  2004–09 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4636&r=ets 
By:  Elliott, Graham; Timmermann, Allan G 
Abstract:  This Paper proposes a new forecast combination method that lets the combination weights be driven by regime switching in a latent state variable. An empirical application that combines forecasts from survey data and time series models finds that the proposed regime switching combination scheme performs well for a variety of macroeconomic variables. Monte Carlo simulations shed light on the type of data generating processes for which the proposed combination method can be expected to perform better than a range of alternative combination schemes. Finally, we show how time variations in the combination weights arise when the target variable and the predictors share a common factor structure driven by a hidden Markov process. 
Keywords:  forecast combination; Markov switching; survey data; time-varying combination weights 
JEL:  C53 
Date:  2004–10 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4649&r=ets 
By:  Frühwirth-Schnatter, Sylvia; Kaufmann, Sylvia 
Abstract:  We propose to exploit the attractiveness of pooling relatively short time series that display similar dynamics, without restricting the pooling to one group. We suggest estimating the appropriate grouping of time series simultaneously along with the group-specific model parameters. We cast estimation into the Bayesian framework and use Markov chain Monte Carlo simulation methods. We discuss model identification and base model selection on marginal likelihoods. A simulation study documents the efficiency gains in estimation and forecasting that are realized when appropriately grouping the time series of a panel. Two economic applications illustrate the usefulness of the method, also analysing extensions to Markov switching within clusters and to heterogeneity within clusters, respectively. 
Keywords:  clustering; Markov chain Monte Carlo; Markov Switching; mixture modelling; panel data 
JEL:  C11 C33 E32 
Date:  2004–09 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4650&r=ets 
By:  Kandel, Shmuel; Zilca, Shlomo 
Abstract:  Cochrane’s variance ratio is a leading tool for detection of deviations from random walks in financial asset prices. This Paper develops a variance-ratio-related regression model that can be used for prediction. We suggest a comprehensive framework for our model, including model identification, model estimation and selection, bias correction, model diagnostic checks, and an inference procedure. We use our model to study and model mean reversion in the NYSE index in the period 1825-2002. We demonstrate that in addition to mean reversion, our model can generate other characteristic properties of financial asset prices, such as short-term persistence and volatility clustering of unconditional returns. 
Keywords:  mean reversion; persistence; variance ratio 
Date:  2004–11 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:4729&r=ets 
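The variance ratio the abstract builds on can be sketched directly; this is the plain overlapping-returns estimator (not the paper's regression model), with names chosen for illustration.

```python
import numpy as np

def variance_ratio(returns, q):
    """Cochrane-style variance ratio VR(q): the variance of overlapping
    q-period returns divided by q times the one-period variance.
    VR(q) is near 1 for a random walk and below 1 under mean reversion."""
    r = np.asarray(returns, dtype=float)
    rq = np.convolve(r, np.ones(q), mode="valid")   # overlapping q-period sums
    return rq.var(ddof=1) / (q * r.var(ddof=1))

rng = np.random.default_rng(7)
vr_rw = variance_ratio(rng.normal(size=50000), q=10)   # random walk: near 1
level = np.zeros(50000)                                 # mean-reverting price level
for t in range(1, 50000):
    level[t] = 0.8 * level[t - 1] + rng.normal()
vr_mr = variance_ratio(np.diff(level), q=10)            # well below 1
```

Values of VR(q) below one at long horizons are precisely the mean-reversion signature the paper turns into a predictive regression.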
By:  Konstantin A. Kholodilin; Wension Vincent Yao 
Abstract:  This paper develops dynamic factor models with regime switching to account for the decreasing volatility of the U.S. economy observed since the mid-1980s. Apart from the Markov switching capturing the cyclical fluctuations, an additional type of regime switching is introduced to allow variances to switch between distinct regimes. The resulting four-regime models extend the univariate analysis currently used in the literature on the structural break in conditional volatility to multivariate time series. Besides the dynamic factor model using data with a single (monthly) frequency, we employ the additional information in mixed-frequency data, which include not only the monthly component series but also such an important quarterly series as real GDP. The evaluation of six different nonlinear models suggests that the probabilities derived from all the models comply with NBER business cycle dating and detect a one-time shift from the high-variance to the low-variance state in February 1984. In addition, we find that mixed-frequency models outperform single-frequency models; restricted models outperform unrestricted models; and four-regime switching models outperform two-regime switching models. 
Keywords:  Volatility; Structural break; Composite coincident indicator; Dynamic factor model; Markov switching; Mixed-frequency data 
JEL:  E32 C10 
Date:  2004–09–15 
URL:  http://d.repec.org/n?u=RePEc:ctl:louvir:2004024&r=ets 
By:  Campagnoli Patrizia (University of Pavia, Italy); Muliere Pietro (University of Bocconi, Italy); Petrone Sonia (Department of Economics, University of Insubria, Italy) 
Abstract:  In this paper we consider a class of conditionally Gaussian state space models and discuss how they can provide a flexible and fairly simple tool for modelling financial time series, even in the presence of different components in the series, or of stochastic volatility. Estimates can be computed by recursive equations, which provide the optimal solution under rather mild assumptions. In more general models, the filter equations can still provide approximate solutions. We also discuss how some models traditionally employed for analysing financial time series can be regarded in the state-space framework. Finally, we illustrate the models with two examples on real data sets. 
Keywords:  dynamic linear models; conditionally Gaussian models; Kalman filter; stochastic regressors; stochastic volatility; GARCH models. 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0003&r=ets 
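The recursive filtering equations the abstract refers to reduce, in the simplest linear-Gaussian case, to a few lines; this local-level sketch is a minimal member of that family, not the authors' conditionally Gaussian specification, and the function name and default prior are assumptions.

```python
import numpy as np

def local_level_filter(y, var_eps, var_eta, a0=0.0, p0=1e7):
    """Minimal Kalman filter for the local-level model
        y_t = mu_t + eps_t,    mu_t = mu_{t-1} + eta_t,
    using the standard predict/update recursions; p0 large approximates a
    diffuse prior on the initial state."""
    a, p = a0, p0
    out = np.empty(len(y))
    for t, yt in enumerate(y):
        f = p + var_eps              # one-step prediction variance of y_t
        k = p / f                    # Kalman gain
        a = a + k * (yt - a)         # filtered state mean
        p = p * (1.0 - k) + var_eta  # state variance propagated to t+1
        out[t] = a
    return out

filtered = local_level_filter(np.full(200, 5.0), var_eps=1.0, var_eta=0.1)
```

On a constant series the filtered state locks onto the level almost immediately, which is the "optimal solution under rather mild assumptions" behaviour the abstract describes for the exact linear-Gaussian case.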
By:  Paruolo Paolo (Department of Economics, University of Insubria, Italy) 
Abstract:  This paper considers likelihood ratio (LR) cointegration rank tests in vector autoregressive (VAR) models; the local power of the most widely used LR 'trace' test is compared with that of the LR 'lambda max' test. It is found that neither test uniformly dominates the other. Moreover, it is shown that the asymptotic properties of the estimator of the cointegration rank based on the trace test are shared by a similar estimator based on the lambda max test. These results indicate that both tests are admissible. 
Keywords:  Cointegration, Likelihood Ratio, Unit roots, Local Power 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0004&r=ets 
By:  Mira Antonietta (Department of Economics, University of Insubria, Italy) 
Abstract:  The class of Metropolis-Hastings algorithms can be modified by delaying the rejection of proposed moves. The new samplers are proved to perform better than the original ones in terms of asymptotic variance of the estimates on a sweep-by-sweep basis. The delayed rejection algorithms also allow some scope for local adaptation of the proposal distribution. We give an iterative formula for the acceptance probability at the i-th stage of the delaying process. A special case is discussed in detail: the delayed rejection algorithm with symmetric proposal distribution. 
Keywords:  Markov chain Monte Carlo methods, Metropolis-Hastings algorithm, Asymptotic variance, Peskun ordering 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0005&r=ets 
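The symmetric-proposal special case mentioned above can be sketched as a two-stage step; the function name, scales, and Gaussian target are illustrative assumptions, and the second-stage acceptance probability follows the detailed-balance-preserving form for two random-walk proposals.

```python
import numpy as np

def dr_step(x, log_target, s1, s2, rng):
    """One delayed-rejection Metropolis step with Gaussian random-walk
    proposals: scale s1 at stage one, then (if rejected) scale s2 at
    stage two, with the two-stage acceptance rule preserving detailed
    balance."""
    lx = log_target(x)
    y1 = x + s1 * rng.normal()
    ly1 = log_target(y1)
    a1 = min(1.0, np.exp(ly1 - lx))
    if rng.random() < a1:
        return y1
    # second-stage proposal after y1 was rejected
    y2 = x + s2 * rng.normal()
    ly2 = log_target(y2)
    a1_rev = min(1.0, np.exp(ly1 - ly2))   # prob. of accepting y1 from y2
    # ratio of stage-1 proposal densities q1(y2 -> y1) / q1(x -> y1)
    qr = np.exp(-((y1 - y2) ** 2 - (y1 - x) ** 2) / (2.0 * s1 ** 2))
    a2 = min(1.0, np.exp(ly2 - lx) * qr * (1.0 - a1_rev) / (1.0 - a1))
    return y2 if rng.random() < a2 else x

# sample a standard normal target
rng = np.random.default_rng(3)
log_norm = lambda z: -0.5 * z * z
chain = np.empty(20000)
x = 0.0
for i in range(len(chain)):
    x = dr_step(x, log_norm, s1=2.5, s2=0.5, rng=rng)
    chain[i] = x
```

A bold first proposal paired with a cautious second try is a typical use of the extra stage: rejected large moves get a cheap local retry instead of a wasted sweep.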
By:  Paruolo Paolo (Department of Economics, University of Insubria, Italy) 
Abstract:  This paper provides asymptotic standard errors for the moving average (MA) impact matrix for the second differences of a vector autoregressive (VAR) process integrated of order 2, I(2). Standard errors of the row space of the MA impact matrix are also provided; bases of this row space define the common I(2) trends linear combinations. These standard errors are then used to formulate Wald-type tests. The MA impact matrix is shown to be linked to impact factors which measure the total effect of disequilibrium errors on the growth rate of the system. Most of the relevant limit distributions are Gaussian, and we report artificial regressions that can be used to calculate the estimators of the asymptotic variances. The use of the techniques proposed in the paper is illustrated on UK money data. 
Keywords:  Cointegration, Common trends, VAR, I(2), ML, 2SI2 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0007&r=ets 
By:  Paruolo Paolo (Department of Economics, University of Insubria, Italy) 
Abstract:  This paper considers the asymptotic analysis of the likelihood ratio (LR), cointegration (CI) rank test in vector autoregressive models (VAR) when some CI vectors are known and fixed. It is shown that the limit law is free of nuisance parameters. In the case of LR tests against the alternative of completely unrestricted CI space, the limit law can be expressed as the convolution of known distributions. This deconvolution is employed to approximate the quantiles of the distribution, without resorting to new simulations. 
Keywords:  cointegration, likelihood ratio, unit roots 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0106&r=ets 
By:  Paruolo Paolo (Department of Economics, University of Insubria, Italy) 
Abstract:  This paper derives standard errors for Monte Carlo (MC) estimators of (relative) power of tests when the critical values under the null have also been estimated. This situation is common e.g. in unit root and cointegration tests. The associated issue of MC design is discussed. The results are illustrated on likelihood based tests for cointegration rank determination. 
Keywords:  Monte Carlo, design of experiments, (local) power, cointegration, likelihood ratio, unit roots 
URL:  http://d.repec.org/n?u=RePEc:ins:quaeco:qf0112&r=ets 
By:  Todd E. Clark; Kenneth D. West 
Abstract:  We consider using out-of-sample mean squared prediction errors (MSPEs) to evaluate the null that a given series follows a zero-mean martingale difference against the alternative that it is linearly predictable. Under the null of no predictability, the population MSPE of the null "no change" model equals that of the linear alternative. We show analytically and via simulations that despite this equality, the alternative model's sample MSPE is expected to be greater than the null's. For rolling regression estimators of the alternative model's parameters, we propose and evaluate an asymptotically normal test that properly accounts for the upward shift of the sample MSPE of the alternative model. Our simulations indicate that our proposed procedure works well. 
JEL:  C22 C53 
Date:  2005–01 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberte:0305&r=ets 
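The upward-shift adjustment can be sketched as follows; this is a simplified statistic in the spirit of the test described above (one-shot rather than rolling estimation), and the function name and the form of the adjustment term are assumptions for illustration.

```python
import numpy as np

def adjusted_mspe_stat(y, yhat_alt):
    """MSPE comparison against the zero-forecast ('no change') null.
    The term yhat_alt**2 removes the upward shift in the alternative
    model's sample MSPE that comes from estimating parameters which are
    zero under the null; the statistic is compared with one-sided
    standard normal critical values."""
    y = np.asarray(y, dtype=float)
    e_null = y                                   # null forecast is zero
    e_alt = y - np.asarray(yhat_alt, dtype=float)
    f = e_null ** 2 - (e_alt ** 2 - yhat_alt ** 2)   # adjusted loss differential
    return f.mean() / (f.std(ddof=1) / np.sqrt(len(f)))

# under the null: noise forecasts carry no signal, so the statistic is ~N(0,1)
rng = np.random.default_rng(11)
y = rng.normal(size=2000)
yhat = 0.1 * rng.normal(size=2000)               # uninformative forecasts
stat = adjusted_mspe_stat(y, yhat)
```

Without the adjustment term, the alternative's sample MSPE would systematically exceed the null's even when the forecasts are pure noise, which is exactly the distortion the paper documents.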
By:  Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay); Josep Perello; Jaume Masoliver 
Abstract:  Financial time series exhibit two different types of nonlinear correlations: (i) volatility autocorrelations that have a very long-range memory, on the order of years, and (ii) asymmetric return-volatility (or 'leverage') correlations that are much shorter ranged. Different stochastic volatility models have been proposed in the past to account for both of these correlations. However, in these models the decay of the correlations is exponential, with a single time scale for both the volatility and the leverage correlations, at variance with observations. We extend the linear Ornstein-Uhlenbeck stochastic volatility model by assuming that the mean-reverting level is itself random. We find that the resulting three-dimensional diffusion process can account for different correlation time scales. We show that the results are in good agreement with a century of Dow Jones index daily returns (1900-2000), with the exception of crash days. 
JEL:  G10 
URL:  http://d.repec.org/n?u=RePEc:sfi:sfiwpa:50001&r=ets 
By:  Jean-Philippe Bouchaud (Science & Finance, Capital Fund Management; CEA Saclay); Marc Mezard (Universite Paris Sud (Orsay)); Irene Giardina 
Abstract:  We propose a general interpretation for long-range correlation effects in the activity and volatility of financial markets. This interpretation is based on the fact that the choice between 'active' and 'inactive' strategies is subordinated to random-walk-like processes. We numerically demonstrate our scenario in the framework of simplified market models, such as the Minority Game model with an inactive strategy, or a more sophisticated version that includes some price dynamics. We show that real market data can be surprisingly well accounted for by these simple models. 
JEL:  G10 
URL:  http://d.repec.org/n?u=RePEc:sfi:sfiwpa:500024&r=ets 
By:  Atsushi Inoue (North Carolina State University); Mototsugu Shintani (Department of Economics, Vanderbilt University) 
Abstract:  This paper establishes that the bootstrap provides asymptotic refinements for the generalized method of moments estimator of overidentified linear models when autocorrelation structures of moment functions are unknown. When moment functions are uncorrelated after finite lags, Hall and Horowitz (1996) showed that errors in the rejection probabilities of the symmetrical t test and the test of overidentifying restrictions based on the bootstrap are O(T^-1). In general, however, such a parametric rate cannot be obtained with the heteroskedasticity and autocorrelation consistent (HAC) covariance matrix estimator, since it converges at a nonparametric rate that is slower than T^1/2. By taking into account the HAC covariance matrix estimator in the Edgeworth expansion, we show that the bootstrap provides asymptotic refinements when kernels whose characteristic exponent is greater than two are used. Moreover, we find that the order of the bootstrap approximation error can be made arbitrarily close to o(T^-1) provided moment conditions are satisfied. The bootstrap approximation thus improves upon the first-order asymptotic approximation even when there is general autocorrelation. A Monte Carlo experiment shows that the bootstrap improves the accuracy of inference on regression parameters in small samples. We apply our bootstrap method to inference about the parameters in the monetary policy reaction function. 
Keywords:  Asymptotic refinements, block bootstrap, dependent data, Edgeworth expansions, instrumental variables 
JEL:  C12 C22 C32 
Date:  2001–12 
URL:  http://d.repec.org/n?u=RePEc:van:wpaper:0129&r=ets 
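The dependent-data resampling underlying these refinement results can be sketched with a moving-block scheme; this is a generic resampler, not the authors' full GMM procedure, and the function name and block length are illustrative.

```python
import numpy as np

def moving_block_resample(x, block_len, rng):
    """Draw one moving-block bootstrap resample of a dependent series:
    overlapping blocks of length block_len are drawn with replacement
    and concatenated, preserving within-block short-range dependence."""
    x = np.asarray(x)
    n = len(x)
    n_blocks = -(-n // block_len)                    # ceil(n / block_len)
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

rng = np.random.default_rng(5)
series = rng.normal(size=500)
boot = moving_block_resample(series, block_len=25, rng=rng)
```

Repeating the draw many times and recomputing the statistic of interest on each resample gives the bootstrap distribution whose refinement properties the paper studies.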
By:  Xiaohong Chen (Department of Economics, New York University); Yanqin Fan (Department of Economics, Vanderbilt University) 
Abstract:  In this paper, we develop a general approach for constructing simple tests for correct density forecasts or, equivalently, for i.i.d. uniformity of appropriately transformed random variables. It is based on nesting a series of i.i.d. uniform random variables into a class of copula-based stationary Markov processes. As such, it can be used to test for i.i.d. uniformity against alternative processes that exhibit a wide variety of marginal properties and temporal dependence properties, including skewed and fat-tailed marginal distributions, asymmetric dependence, and positive tail dependence. In addition, we develop tests for the dependence structure of the forecasting model that are robust to possible misspecification of the marginal distribution. 
Keywords:  Density forecasts, Gaussian copula, probability integral transform, nonlinear time series 
JEL:  C22 C52 C53 
Date:  2002–10 
URL:  http://d.repec.org/n?u=RePEc:van:wpaper:0225&r=ets 