
NEP: New Economics Papers on Econometric Time Series 
By:  Edoardo Otranto 
Abstract:  Financial time series are often characterized by similar volatility structures, frequently represented by GARCH processes. Detecting clusters of series displaying similar behavior can help to understand the differences in the estimated processes, without having to study and compare the estimated parameters across all the series. This is particularly relevant when dealing with many series, as in financial applications. The volatility of a time series can be characterized in terms of the underlying GARCH process. Using Wald tests and the AR metrics to measure the distance between GARCH processes, it is possible to develop a clustering algorithm, which can provide three classifications (of increasing depth) based on the heteroskedastic patterns of the time series. The number of clusters is detected automatically, not fixed a priori or a posteriori. The procedure is evaluated by simulations and applied to the sector indexes of the Italian market. 
Keywords:  Agglomerative algorithm, AR metrics, Cluster analysis, GARCH models, Wald test 
JEL:  C02 C19 C22 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:cns:cnscwp:200801&r=ets 
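The AR metric mentioned in the abstract can be made concrete: for a GARCH(1,1) with parameters (alpha, beta), the squared residuals follow an ARMA(1,1) whose AR(inf) weights are pi_j = alpha * beta^(j-1), so the distance between two fitted processes is the Euclidean distance between their truncated weight sequences. A minimal Python sketch follows; the parameter values and the simple threshold-based grouping are illustrative assumptions, not the paper's full three-level algorithm (which also uses Wald tests):

```python
import numpy as np

def pi_weights(alpha, beta, k=50):
    """AR(inf) weights of the ARMA(1,1) representation of the squared
    residuals of a GARCH(1,1): pi_j = alpha * beta**(j-1)."""
    j = np.arange(1, k + 1)
    return alpha * beta ** (j - 1)

def ar_distance(p1, p2, k=50):
    """Piccolo-style AR metric: Euclidean distance between truncated
    AR(inf) coefficient sequences of two GARCH(1,1) processes."""
    return np.sqrt(np.sum((pi_weights(*p1, k) - pi_weights(*p2, k)) ** 2))

# hypothetical (alpha, beta) pairs for four series
params = [(0.05, 0.90), (0.06, 0.89), (0.20, 0.60), (0.21, 0.61)]

n = len(params)
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = ar_distance(params[i], params[j])

# naive agglomerative grouping: join a series to the first cluster
# containing a close enough member (illustrative threshold)
threshold = 0.05
clusters = []
for i in range(n):
    for c in clusters:
        if any(D[i, j] < threshold for j in c):
            c.append(i)
            break
    else:
        clusters.append([i])
print(clusters)
```

With these values the first two and last two processes form separate clusters, mirroring how similar heteroskedastic patterns group together.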
By:  Anindya Banerjee; Massimiliano Marcellino 
Abstract:  This paper brings together several important strands of the econometrics literature: error correction, cointegration and dynamic factor models. It introduces the Factor-augmented Error Correction Model (FECM), where the factors estimated from a large set of variables in levels are jointly modelled with a few key economic variables of interest. With respect to the standard ECM, the FECM protects, at least in part, against omitted variable bias and the dependence of cointegration analysis on the specific limited set of variables under analysis. It may also in some cases be a refinement of the standard Dynamic Factor Model (DFM), since it allows us to include the error correction terms in the equations and, by allowing for cointegration, prevents the errors from being non-invertible moving average processes. In addition, the FECM is a natural generalization of the factor-augmented VARs (FAVAR) considered by Bernanke, Boivin and Eliasz (2005), inter alia, which are specified in first differences and are therefore misspecified in the presence of cointegration. The FECM has a vast range of applicability. A set of Monte Carlo experiments and two detailed empirical examples highlight its merits in finite samples relative to standard ECM and FAVAR models. The analysis is conducted primarily within an in-sample framework, although the out-of-sample implications are also explored. 
Keywords:  Dynamic Factor Models, Error Correction Models, Cointegration, Factor-augmented Error Correction Models, VAR, FAVAR 
JEL:  C32 E17 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/15&r=ets 
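An FECM-style estimation step can be sketched in a few lines: extract a factor from a panel in levels by principal components, then include the lagged deviation of a variable of interest from its long-run relation with the factor as an error-correction term. The simulated data, the single factor, and the two-step least-squares estimation below are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 30

# a single common stochastic trend drives a large panel observed in levels
f = np.cumsum(rng.normal(size=T))                 # I(1) factor
X = np.outer(f, rng.normal(1.0, 0.2, N)) + rng.normal(size=(T, N))

# step 1: estimate the factor by the first principal component of the LEVELS
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
fhat = U[:, 0] * s[0]

# step 2: FECM-style equation for one variable of interest, y = X[:, 0]:
# change in y regressed on the lagged error-correction term (deviation of y
# from its long-run relation with the factor) plus lagged differences
y = X[:, 0]
gamma = np.polyfit(fhat, y, 1)[0]                 # long-run loading on fhat
ecm = (y - gamma * fhat)[:-1]                     # lagged EC term
dy, df = np.diff(y), np.diff(fhat)
Z = np.column_stack([np.ones(T - 2), ecm[1:], dy[:-1], df[:-1]])
coef, *_ = np.linalg.lstsq(Z, dy[1:], rcond=None)
print(coef[1])    # adjustment coefficient: negative if y error-corrects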
By:  Massimiliano Marcellino; Christian Schumacher 
Abstract:  This paper compares different ways to estimate the current state of the economy using factor models that can handle unbalanced datasets. Due to the different release lags of business cycle indicators, data unbalancedness often emerges at the end of multivariate samples, which is sometimes referred to as the "ragged edge" of the data. Using a large monthly dataset of the German economy, we compare the performance of different factor models in the presence of the ragged edge: static and dynamic principal components based on realigned data, the Expectation-Maximisation (EM) algorithm, and the Kalman smoother in a state-space model context. The monthly factors are used to estimate current-quarter GDP, called the "nowcast", using different versions of what we call factor-based mixed-data sampling (Factor-MIDAS) approaches. We compare all possible combinations of factor estimation methods and Factor-MIDAS projections with respect to nowcast performance. Additionally, we compare the performance of the nowcast factor models with the performance of quarterly factor models based on time-aggregated and thus balanced data, which neglect the most timely observations of business cycle indicators at the end of the sample. Our empirical findings show that the factor estimation methods do not differ much in nowcasting accuracy. Concerning the projections, the most parsimonious MIDAS projection performs best overall. Finally, quarterly models are in general outperformed by the nowcast factor models that can exploit ragged-edge data. 
Keywords:  nowcasting, business cycle, large factor models, mixed-frequency data, missing values, MIDAS 
JEL:  E37 C53 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/16&r=ets 
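The "ragged edge" and a parsimonious Factor-MIDAS-style nowcast can be sketched as follows. The data-generating process, publication lags, and the skip-sampled projection are illustrative assumptions, and vertical realignment is used as the simplest of the factor-estimation options discussed:

```python
import numpy as np

rng = np.random.default_rng(1)
Tm, N = 240, 20

# monthly common factor (AR(1)) and a panel of business-cycle indicators
f = np.zeros(Tm)
for t in range(1, Tm):
    f[t] = 0.7 * f[t - 1] + rng.normal()
X = np.outer(f, rng.normal(1.0, 0.3, N)) + rng.normal(size=(Tm, N))

# ragged edge: the last k_i months of series i are not yet released
pub_lag = rng.integers(0, 3, N)
X_ragged = X.copy()
for i, k in enumerate(pub_lag):
    if k:
        X_ragged[-k:, i] = np.nan

# vertical realignment: shift each series so all columns end in the same month
X_bal = np.column_stack([np.roll(X_ragged[:, i], pub_lag[i]) for i in range(N)])
X_bal = X_bal[pub_lag.max():]        # drop rows contaminated by the wrap-around

# static principal-component factor from the realigned, standardized panel
Z = (X_bal - X_bal.mean(0)) / X_bal.std(0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
fhat = U[:, 0] * s[0]

# parsimonious MIDAS-style projection: regress quarterly GDP growth on the
# factor in the last month of each quarter, then nowcast the final quarter
nq = len(fhat) // 3
fq = fhat[len(fhat) - 3 * nq:].reshape(nq, 3)[:, -1]
gdp = f[-3 * nq:].reshape(nq, 3).sum(1) + 0.5 * rng.normal(size=nq)
A = np.column_stack([np.ones(nq - 1), fq[:-1]])
b, *_ = np.linalg.lstsq(A, gdp[:-1], rcond=None)
nowcast = b[0] + b[1] * fq[-1]
print(nowcast)
```

The point of the realignment step is that the nowcast can use the most timely monthly observations, which a balanced quarterly model would discard.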
By:  Anindya Banerjee; Massimiliano Marcellino; Igor Masten 
Abstract:  We conduct a detailed simulation study of the forecasting performance of diffusion index-based methods in short samples with structural change. We consider several data generation processes, to mimic different types of structural change, and compare the relative forecasting performance of factor models and more traditional time series methods. We find that changes in the loading structure of the factors into the variables of interest are extremely important in determining the performance of factor models. We complement the analysis with an empirical evaluation of forecasts for the key macroeconomic variables of the Euro area and Slovenia, for which relatively short samples are officially available and structural changes are likely. The results are consistent with the findings of the simulation exercise, and confirm the relatively good performance of factor-based forecasts even in short samples with structural change. 
Keywords:  Factor models, forecasts, time series models, structural change, short samples, parameter uncertainty 
JEL:  C53 C32 E37 
Date:  2008 
URL:  http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/17&r=ets 
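A compact Monte Carlo in the spirit of the paper's central finding, that changes in the loadings of the target variable matter most, can be sketched as follows (the DGP, sample sizes, and the sign-flip break are illustrative assumptions, not the paper's designs):

```python
import numpy as np

rng = np.random.default_rng(2)

def one_rep(loading_break):
    T, N = 120, 30
    f = rng.normal(size=T + 1)                          # common factor
    X = np.outer(f, rng.normal(1.0, 0.2, N)) + rng.normal(size=(T + 1, N))
    beta = np.ones(T + 1)
    if loading_break:
        beta[T // 2:] = -1.0     # mid-sample sign flip in the target loading
    y = beta * f + 0.3 * rng.normal(size=T + 1)
    # principal-component factor on the first T observations
    Xc = X[:T] - X[:T].mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    fh = U[:, 0] * s[0]
    slope, intercept = np.polyfit(fh, y[:T], 1)
    fh_next = (X[T] - X[:T].mean(0)) @ Vt[0]            # factor at T+1
    return (y[T] - (slope * fh_next + intercept)) ** 2

reps = 200
mse_stable = np.mean([one_rep(False) for _ in range(reps)])
mse_break = np.mean([one_rep(True) for _ in range(reps)])
print(mse_stable, mse_break)
```

With stable loadings the factor forecast tracks the target well; after the break the full-sample projection coefficient is a blend of the two regimes and the forecast deteriorates sharply.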
By:  Tucker S. McElroy; Thomas M. Trimbur 
Abstract:  This paper sets out the theoretical foundations for continuous-time signal extraction in econometrics. Continuous-time modeling gives an effective strategy for treating stock and flow data, irregularly spaced data, and changing frequency of observation. We rigorously derive the optimal continuous-lag filter when the signal component is nonstationary, and provide several illustrations, including a new class of continuous-lag Butterworth filters for trend and cycle estimation. 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:200768&r=ets 
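A discrete-time analogue of trend extraction with a low-pass Butterworth filter, one of the paper's illustrations, can be sketched as follows (the simulated components, filter order, and cutoff are illustrative assumptions; the paper itself works in continuous time):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
t = np.arange(400)
trend = 0.02 * t + 5 * np.sin(2 * np.pi * t / 400)   # slowly evolving trend
cycle = np.sin(2 * np.pi * t / 20)                   # cyclical component
y = trend + cycle + 0.5 * rng.normal(size=len(t))

# low-pass Butterworth filter (order 4, cutoff well below the cycle
# frequency), applied forward and backward for a zero-phase trend estimate
b, a = signal.butter(4, 0.02)
trend_hat = signal.filtfilt(b, a, y)
print(np.mean((trend_hat - trend) ** 2))
```

Raising the cutoff toward the cycle frequency would instead pass the cycle through, which is how the same filter family separates trend and cycle estimation.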
By:  Abdou Kâ Diongue (Université Gaston Berger et School of Economics and Finance); Dominique Guegan (Centre d'Economie de la Sorbonne et Paris School of Economics) 
Abstract:  In this paper, we discuss the parameter estimation for a k-factor generalized long memory process with conditionally heteroskedastic noise. Two estimation methods are proposed. The first method is based on the conditional distribution of the process and the second is obtained as an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite sample performance of these estimation techniques. 
Keywords:  Long memory, Gegenbauer polynomial, heteroskedasticity, conditional sum of squares, Whittle estimation. 
JEL:  C53 
Date:  2008–01 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:b08004&r=ets 
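The Whittle-type method can be sketched for a one-factor (k = 1) Gegenbauer process: simulate from the truncated MA(inf) representation, then estimate the memory parameter d by minimizing a concentrated Whittle objective built on the periodogram. The parameter values, truncation length, and treating the Gegenbauer frequency as known are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)
d_true, nu = 0.25, 2.0          # memory parameter and Gegenbauer frequency
T, J = 3000, 3000
eta = np.cos(nu)

# MA(inf) weights of (1 - 2*eta*L + L^2)^(-d) via the Gegenbauer recursion
c = np.zeros(J)
c[0], c[1] = 1.0, 2 * d_true * eta
for j in range(2, J):
    c[j] = (2 * eta * (j + d_true - 1) * c[j - 1]
            - (j + 2 * d_true - 2) * c[j - 2]) / j

e = rng.normal(size=T + J)
x = np.convolve(e, c)[J:J + T]   # truncated MA(inf) simulation

# concentrated Whittle objective: the innovation variance is profiled out,
# leaving a one-dimensional criterion in d, minimized on a grid
lam = 2 * np.pi * np.arange(1, T // 2) / T
I = np.abs(np.fft.fft(x)[1:T // 2]) ** 2 / (2 * np.pi * T)  # periodogram

def whittle(d):
    g = np.abs(2 * (np.cos(lam) - eta)) ** (-2 * d)   # spectral shape
    return np.log(np.mean(I / g)) + np.mean(np.log(g))

grid = np.linspace(0.05, 0.45, 81)
d_hat = grid[int(np.argmin([whittle(d) for d in grid]))]
print(d_hat)
```

In practice the Gegenbauer frequency and any GARCH parameters of the noise would also be estimated; here the grid search over d alone keeps the sketch short.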
By:  Dominique Guegan (Centre d'Economie de la Sorbonne et Paris School of Economics) 
Abstract:  The detection of chaotic behaviors in commodities, stock markets and weather data is usually complicated by the large noise perturbation inherent to the underlying system. It is well known that predictions from purely deterministic chaotic systems can be accurate mainly in the short term. Thus, it is important to be able to reconstruct in a robust way the attractor in which the data evolve, if such an attractor exists. In chaos theory, deconvolution methods have been widely studied, and different competitive and complementary approaches exist. In this work, we apply two methods: the singular value method and the wavelet approach. The latter has received little attention for filtering chaotic systems. Using very large Monte Carlo simulations, we demonstrate the ability of this deconvolution method. We then use the denoised data set to forecast, and discuss in depth the possibility of long-term forecasting with chaotic systems. 
Keywords:  Deconvolution, chaos, SVD, state space method, Wavelets method. 
JEL:  C02 C32 C45 C53 
Date:  2008–01 
URL:  http://d.repec.org/n?u=RePEc:mse:cesdoc:b08008&r=ets 
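The singular value approach can be sketched as follows: delay-embed a noisy chaotic series into a trajectory matrix, truncate its SVD, and average the anti-diagonals to recover a denoised series. The Lorenz system, noise level, window length, and rank below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# simulate the Lorenz system, a standard chaotic benchmark, with RK4
def lorenz(u, s=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = u
    return np.array([s * (y - x), x * (r - z) - y, x * y - b * z])

dt, n = 0.01, 4000
u = np.array([1.0, 1.0, 1.0])
xs = np.empty(n)
for i in range(n):
    k1 = lorenz(u)
    k2 = lorenz(u + dt / 2 * k1)
    k3 = lorenz(u + dt / 2 * k2)
    k4 = lorenz(u + dt * k3)
    u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    xs[i] = u[0]
xs = xs[1000:]                          # drop the transient
noisy = xs + 2.0 * rng.normal(size=len(xs))

# SVD denoising: delay-embed the series, keep the leading singular
# directions, then average the anti-diagonals back into a series
m, k = 25, 4
N = len(noisy) - m + 1
H = np.array([noisy[i:i + m] for i in range(N)])      # trajectory matrix
U, s, Vt = np.linalg.svd(H, full_matrices=False)
Hk = (U[:, :k] * s[:k]) @ Vt[:k]

den = np.zeros(len(noisy))
cnt = np.zeros(len(noisy))
for i in range(N):
    den[i:i + m] += Hk[i]
    cnt[i:i + m] += 1
den /= cnt
print(np.mean((noisy - xs) ** 2), np.mean((den - xs) ** 2))
```

The denoised series, rather than the raw one, is then the natural input for short-term prediction of the reconstructed attractor.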
By:  McCAUSLAND, William J.; MILLER, Shirley; PELLETIER, Denis 
Abstract:  We introduce a new method for drawing state variables in Gaussian state space models from their conditional distribution given parameters and observations. Unlike standard methods, our method does not involve Kalman filtering. We show that for some important cases, our method is computationally more efficient than standard methods in the literature. We consider two applications of our method. 
Keywords:  State space models, Stochastic volatility, Count data 
Date:  2007 
URL:  http://d.repec.org/n?u=RePEc:mtl:montde:200706&r=ets 
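The key idea, drawing the states from their conditional distribution by exploiting the banded structure of the posterior precision matrix rather than running a Kalman filter, can be sketched for a local level model, where the precision is tridiagonal and a band Cholesky factorization gives O(T) computation. The model and parameter values are illustrative assumptions:

```python
import numpy as np
from scipy.linalg import cholesky_banded, cho_solve_banded, solve_banded

rng = np.random.default_rng(6)
T, sig_e, sig_h = 400, 1.0, 0.5

# simulate a local level model: y_t = a_t + e_t,  a_t = a_{t-1} + h_t
a = np.cumsum(sig_h * rng.normal(size=T))
y = a + sig_e * rng.normal(size=T)

# the posterior precision of the states given y is TRIDIAGONAL:
# Omega = I/sig_e^2 + D'D/sig_h^2, with D the first-difference matrix
# (diffuse treatment of a_1), so a band Cholesky suffices -- no Kalman filter
main = np.full(T, 1 / sig_e**2 + 2 / sig_h**2)
main[0] = main[-1] = 1 / sig_e**2 + 1 / sig_h**2
off = np.full(T - 1, -1 / sig_h**2)

ab = np.zeros((2, T))                  # upper banded storage
ab[0, 1:], ab[1, :] = off, main
U = cholesky_banded(ab, lower=False)   # Omega = U'U, factor is still banded

b = y / sig_e**2
mean = cho_solve_banded((U, False), b)                     # posterior mean
draw = mean + solve_banded((0, 1), U, rng.normal(size=T))  # one exact draw
print(np.mean((mean - a) ** 2))
```

The draw has exactly the posterior covariance Omega^{-1}, since solving U x = z with the Cholesky factor gives cov(x) = (U'U)^{-1}; this is the band-matrix trick that replaces forward filtering and backward simulation.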
By:  Ioannis Kasparis 
Abstract:  We examine the limit properties of the Nonlinear Least Squares (NLS) estimator under functional form misspecification in regression models with a unit root. Our theoretical framework is the same as that of Park and Phillips (Econometrica, 2001). We show that the limit behaviour of the NLS estimator is largely determined by the relative order of magnitude of the true and fitted models. If the estimated model is of a different order of magnitude than the true model, the estimator converges to boundary points. When the pseudo-true value is on a boundary, standard methods for obtaining rates of convergence and limit distribution results are not applicable. We provide convergence rates and limit distribution results when the pseudo-true value is an interior point. If functional form misspecification is committed in the presence of stochastic trends, the convergence rates can be slower and the limit distribution different from that obtained under correct specification. 
Keywords:  Functional Form, Pseudo-true value, Unit root 
Date:  2008–02 
URL:  http://d.repec.org/n?u=RePEc:ucy:cypeua:22008&r=ets 
By:  Gregor W. Smith (Queen's University) 
Abstract:  The new Keynesian Phillips curve (NKPC) restricts multivariate forecasts. I estimate and test it entirely within a panel of professional forecasts, thus using the time-series, cross-forecaster, and cross-horizon dimensions of the panel. Estimation uses 13,193 observations on quarterly US inflation forecasts since 1981. The main finding is a significantly larger weight on expected future inflation than on past inflation, a weight that is also estimated with much more precision than in the standard approach. Inflation dynamics are also stable over time, with no decline in inflation inertia from the 1980s to the 2000s. But, as in historical data, identifying the output gap is difficult. 
Keywords:  forecast survey, new Keynesian Phillips curve 
JEL:  E31 E37 C23 
Date:  2008–02 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1155&r=ets 
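The hybrid NKPC regression underlying such an exercise can be sketched on simulated data: pooled least squares of inflation on expected future inflation, past inflation, and an output-gap measure. The panel below is synthetic, with hypothetical weights chosen to mimic the forward-looking dominance reported in the abstract; it is not the paper's survey data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 13000                            # panel size, similar order to the paper's
gam_f, gam_b, lam = 0.7, 0.3, 0.1    # hypothetical NKPC weights

# simulated forecast panel: expected future inflation, lagged inflation and
# an output-gap forecast, correlated to mimic persistent macro series
z = rng.multivariate_normal([2.0, 2.0, 0.0],
                            [[1.0, 0.6, 0.3],
                             [0.6, 1.0, 0.3],
                             [0.3, 0.3, 1.0]], size=n)
Epi_next, pi_lag, gap = z.T
pi = gam_f * Epi_next + gam_b * pi_lag + lam * gap + 0.2 * rng.normal(size=n)

# pooled OLS across the panel dimensions
X = np.column_stack([np.ones(n), Epi_next, pi_lag, gap])
coef, *_ = np.linalg.lstsq(X, pi, rcond=None)
print(coef[1:])
```

With a panel this large, the weights are estimated very precisely, which illustrates why pooling time-series, cross-forecaster, and cross-horizon variation sharpens inference relative to a single time series.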