nep-ets New Economics Papers
on Econometric Time Series
Issue of 2008-02-16
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Clustering Heteroskedastic Time Series by Model-Based Procedures By Edoardo Otranto
  2. Factor-augmented Error Correction Models By Anindya Banerjee; Massimiliano Marcellino
  3. Factor-MIDAS for Now- and Forecasting with Ragged-Edge Data: A Model Comparison for German GDP By Massimiliano Marcellino; Christian Schumacher
  4. Forecasting Macroeconomic Variables Using Diffusion Indexes in Short Samples with Structural Change By Anindya Banerjee; Massimiliano Marcellino; Igor Masten
  5. Continuous time extraction of a nonstationary signal with illustrations in continuous low-pass and band-pass filtering By Tucker S. McElroy; Thomas M. Trimbur
  6. Estimation of k-factor GIGARCH process: a Monte Carlo study. By Abdou Kâ Diongue; Dominique Guegan
  7. Effect of noise filtering on predictions: on the routes of chaos. By Dominique Guegan
  8. A New Approach to Drawing States in State Space Models By McCAUSLAND, William J.; MILLER, Shirley; PELLETIER, Denis
  9. Functional Form Misspecification in Regressions with a Unit Root By Ioannis Kasparis
  10. US Inflation Dynamics 1981-2007: 13,193 Quarterly Observations By Gregor W. Smith

  1. By: Edoardo Otranto
    Abstract: Financial time series are often characterized by similar volatility structures, often represented by GARCH processes. Detecting clusters of series with similar behavior can help in understanding the differences between the estimated processes, without having to study and compare the estimated parameters across all the series. This is particularly relevant when dealing with many series, as in financial applications. The volatility of a time series can be characterized in terms of the underlying GARCH process. Using Wald tests and the AR metric to measure the distance between GARCH processes, it is possible to develop a clustering algorithm which provides three classifications (of increasing depth) based on the heteroskedastic patterns of the time series. The number of clusters is detected automatically, rather than being fixed a priori or a posteriori. The procedure is evaluated by simulations and applied to the sector indexes of the Italian market.
    Keywords: Agglomerative algorithm, AR metrics, Cluster analysis, GARCH models, Wald test
    JEL: C02 C19 C22
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:cns:cnscwp:200801&r=ets
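    For intuition, the AR metric used above has a closed form for GARCH(1,1) processes: the squared innovations follow an ARMA(1,1) with AR coefficient alpha+beta and MA coefficient -beta, so the AR(infinity) weights are pi_j = alpha*beta^(j-1). The sketch below (Python, with hypothetical parameter estimates; the paper's Wald-test step and automatic cluster-number rule are omitted) clusters series from that distance alone.

```python
# AR-metric clustering of GARCH(1,1) series -- a minimal sketch.
# The squared innovations of a GARCH(1,1) with parameters (alpha, beta)
# follow an ARMA(1,1) whose AR(infinity) weights are pi_j = alpha * beta**(j-1),
# which gives the AR distance in closed form.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def ar_distance(p1, p2):
    """Closed-form AR metric between two GARCH(1,1) processes."""
    a1, b1 = p1
    a2, b2 = p2
    return np.sqrt(a1**2 / (1 - b1**2) + a2**2 / (1 - b2**2)
                   - 2 * a1 * a2 / (1 - b1 * b2))

# Hypothetical (alpha, beta) estimates for six series.
params = [(0.05, 0.90), (0.06, 0.88), (0.20, 0.70),
          (0.18, 0.72), (0.10, 0.85), (0.02, 0.95)]

n = len(params)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = ar_distance(params[i], params[j])

# Agglomerative clustering on the distance matrix; the cut level here is
# a tuning choice, not the paper's automatic rule.
Z = linkage(squareform(D), method="average")
print(fcluster(Z, t=0.15, criterion="distance"))
```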
  2. By: Anindya Banerjee; Massimiliano Marcellino
    Abstract: This paper brings together several important strands of the econometrics literature: error correction, cointegration and dynamic factor models. It introduces the Factor-augmented Error Correction Model (FECM), in which the factors estimated from a large set of variables in levels are jointly modelled with a few key economic variables of interest. With respect to the standard ECM, the FECM protects, at least in part, against omitted variable bias and against the dependence of cointegration analysis on the specific limited set of variables under analysis. In some cases it may also be a refinement of the standard Dynamic Factor Model (DFM), since it allows us to include the error correction terms in the equations and, by allowing for cointegration, prevents the errors from being non-invertible moving average processes. In addition, the FECM is a natural generalization of the factor-augmented VARs (FAVAR) considered by Bernanke, Boivin and Eliasz (2005), inter alia, which are specified in first differences and are therefore misspecified in the presence of cointegration. The FECM has a vast range of applicability. A set of Monte Carlo experiments and two detailed empirical examples highlight its merits in finite samples relative to standard ECM and FAVAR models. The analysis is conducted primarily within an in-sample framework, although the out-of-sample implications are also explored.
    Keywords: Dynamic Factor Models, Error Correction Models, Cointegration, Factor-augmented Error Correction Models, VAR, FAVAR
    JEL: C32 E17
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/15&r=ets
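    A minimal two-step sketch of the FECM idea on simulated data (not the paper's exact estimator): extract a factor from a panel in levels by principal components, then estimate an ECM for one variable of interest with a factor-based error-correction term.

```python
# Schematic two-step FECM on simulated data:
# (1) extract a factor from a large panel in levels by principal components,
# (2) estimate an ECM for one variable with a factor-based EC term.
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 50
f = np.cumsum(rng.normal(size=T))                 # common I(1) factor
lam = rng.normal(size=N)
X = np.outer(f, lam) + rng.normal(size=(T, N))    # panel in levels

# Step 1: first principal component of the standardized levels.
Z = (X - X.mean(0)) / X.std(0)
F = Z @ np.linalg.svd(Z, full_matrices=False)[2][0]   # factor, up to sign/scale

# Step 2: ECM for y = X[:, 0]; EC term = residual of the cointegrating
# regression of y on the estimated factor.
y = X[:, 0]
g = np.linalg.lstsq(np.column_stack([np.ones(T), F]), y, rcond=None)[0]
ec = y - g[0] - g[1] * F

dy, dF = np.diff(y), np.diff(F)
R = np.column_stack([np.ones(T - 2), ec[1:-1], dy[:-1], dF[:-1]])
beta = np.linalg.lstsq(R, dy[1:], rcond=None)[0]
print("EC-term loading (should be negative):", beta[1])
```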
  3. By: Massimiliano Marcellino; Christian Schumacher
    Abstract: This paper compares different ways to estimate the current state of the economy using factor models that can handle unbalanced datasets. Due to the different release lags of business cycle indicators, data unbalancedness often emerges at the end of multivariate samples, which is sometimes referred to as the "ragged edge" of the data. Using a large monthly dataset of the German economy, we compare the performance of different factor models in the presence of the ragged edge: static and dynamic principal components based on realigned data, the Expectation-Maximisation (EM) algorithm, and the Kalman smoother in a state-space model context. The monthly factors are used to estimate current-quarter GDP, called the "nowcast", using different versions of what we call factor-based mixed-data sampling (Factor-MIDAS) approaches. We compare all possible combinations of factor estimation methods and Factor-MIDAS projections with respect to nowcast performance. Additionally, we compare the performance of the nowcast factor models with that of quarterly factor models based on time-aggregated, and thus balanced, data, which neglect the most timely observations of business cycle indicators at the end of the sample. Our empirical findings show that the factor estimation methods do not differ much in nowcasting accuracy. Concerning the projections, the most parsimonious MIDAS projection performs best overall. Finally, quarterly models are in general outperformed by the nowcast factor models that can exploit ragged-edge data.
    Keywords: nowcasting, business cycle, large factor models, mixed-frequency data, missing values, MIDAS
    JEL: E37 C53
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/16&r=ets
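    The following sketch illustrates an unrestricted Factor-MIDAS projection on simulated data: quarterly GDP growth is regressed on skip-sampled values of a monthly factor. All numbers are made up; in the paper the factor comes from a large monthly panel and the missing end-of-sample months are filled by realignment, the EM algorithm, or the Kalman smoother.

```python
# Unrestricted Factor-MIDAS sketch on simulated data: project quarterly GDP
# growth on skip-sampled values of a monthly factor.
import numpy as np

rng = np.random.default_rng(1)
n_q = 80                          # quarters
fm = rng.normal(size=3 * n_q)     # monthly factor, 3 months per quarter
fm = np.convolve(fm, [1, 0.5], mode="same")   # a little persistence

# GDP growth loads on the three months of the quarter plus noise.
Fq = fm.reshape(n_q, 3)           # columns: months 1..3 of each quarter
gdp = Fq @ np.array([0.2, 0.3, 0.5]) + 0.3 * rng.normal(size=n_q)

# Unrestricted MIDAS regression on the training sample ...
Xtr = np.column_stack([np.ones(n_q - 1), Fq[:-1]])
b = np.linalg.lstsq(Xtr, gdp[:-1], rcond=None)[0]

# ... and a nowcast for the last quarter. With a ragged edge, missing
# months at the end would first be filled by realignment, EM, or the
# Kalman smoother.
print("nowcast:", b[0] + Fq[-1] @ b[1:], "actual:", gdp[-1])
```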
  4. By: Anindya Banerjee; Massimiliano Marcellino; Igor Masten
    Abstract: We conduct a detailed simulation study of the forecasting performance of diffusion index-based methods in short samples with structural change. We consider several data generating processes to mimic different types of structural change, and compare the relative forecasting performance of factor models and more traditional time series methods. We find that changes in the loading structure of the factors into the variables of interest are extremely important in determining the performance of factor models. We complement the analysis with an empirical evaluation of forecasts for the key macroeconomic variables of the Euro area and Slovenia, for which relatively short samples are officially available and structural changes are likely. The results are consistent with the findings of the simulation exercise, and confirm the relatively good performance of factor-based forecasts even in short samples with structural change.
    Keywords: Factor models, forecasts, time series models, structural change, short samples, parameter uncertainty
    JEL: C53 C32 E37
    Date: 2008
    URL: http://d.repec.org/n?u=RePEc:eui:euiwps:eco2008/17&r=ets
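    A toy version of the paper's experimental design, assuming a single factor whose loadings break mid-sample, with a diffusion-index forecast that ignores the break:

```python
# Sketch of the kind of experiment in the paper: a factor DGP whose loadings
# break mid-sample, and a PCA-factor forecast estimated over the full sample.
import numpy as np

rng = np.random.default_rng(2)
T, N, Tb = 120, 30, 60            # sample size, panel width, break date
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.7 * f[t - 1] + rng.normal()

lam0, lam1 = rng.normal(size=N), rng.normal(size=N)
X = np.empty((T, N))
X[:Tb] = np.outer(f[:Tb], lam0) + rng.normal(size=(Tb, N))
X[Tb:] = np.outer(f[Tb:], lam1) + rng.normal(size=(T - Tb, N))  # loadings break

y = 0.8 * f + rng.normal(size=T)  # target driven by the factor

# PCA factor ignoring the break, then a one-step diffusion-index forecast.
Z = (X - X.mean(0)) / X.std(0)
F = Z @ np.linalg.svd(Z, full_matrices=False)[2][0]
b = np.linalg.lstsq(np.column_stack([np.ones(T - 1), F[:-1]]), y[1:], rcond=None)[0]
print("forecast for T+1:", b[0] + b[1] * F[-1])
```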
  5. By: Tucker S. McElroy; Thomas M. Trimbur
    Abstract: This paper sets out the theoretical foundations for continuous-time signal extraction in econometrics. Continuous-time modeling gives an effective strategy for treating stock and flow data, irregularly spaced data, and changing frequency of observation. We rigorously derive the optimal continuous-lag filter when the signal component is nonstationary, and provide several illustrations, including a new class of continuous-lag Butterworth filters for trend and cycle estimation.
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2007-68&r=ets
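    A discrete-time analogue of the trend-extraction problem can be illustrated with scipy's standard Butterworth design (the paper derives the continuous-lag filters rigorously; the cutoff below is an arbitrary tuning choice, not a value from the paper):

```python
# Low-pass Butterworth trend extraction from a noisy series -- a discrete-time
# analogue of the paper's continuous-lag filtering.
import numpy as np
from scipy.signal import butter, filtfilt

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 500)
x = np.sin(0.5 * t) + 0.1 * t            # slow "trend" component
y = x + 0.3 * rng.normal(size=t.size)    # observed = trend + noise

# 2nd-order low-pass Butterworth; cutoff in fractions of the Nyquist frequency.
b, a = butter(2, 0.05)
trend = filtfilt(b, a, y)                # zero-phase filtering
print("RMSE of extracted trend:", np.sqrt(np.mean((trend - x) ** 2)))
```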
  6. By: Abdou Kâ Diongue (Université Gaston Berger and School of Economics and Finance); Dominique Guegan (Centre d'Economie de la Sorbonne and Paris School of Economics)
    Abstract: In this paper, we discuss the parameter estimation for a k-factor generalized long memory process with conditionally heteroskedastic noise. Two estimation methods are proposed. The first method is based on the conditional distribution of the process and the second is obtained as an extension of Whittle's estimation approach. For comparison purposes, Monte Carlo simulations are used to evaluate the finite sample performance of these estimation techniques.
    Keywords: Long memory, Gegenbauer polynomial, heteroskedasticity, conditional sum of squares, Whittle estimation.
    JEL: C53
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:b08004&r=ets
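    As an illustration of the underlying process (k = 1, with parameter values chosen arbitrarily), the sketch below simulates a Gegenbauer long-memory filter applied to GARCH(1,1) noise, using the standard recursion for the coefficients of (1 - 2uz + z^2)^(-d):

```python
# Simulating a one-factor Gegenbauer process with GARCH(1,1) noise -- a
# GIGARCH-style DGP for a Monte Carlo setting like the paper's. The
# MA(infinity) filter (1 - 2u L + L^2)^(-d) is truncated at `trunc`.
import numpy as np

def gegenbauer_weights(d, u, trunc):
    """Coefficients of (1 - 2uz + z^2)^(-d) via the standard recursion."""
    c = np.empty(trunc)
    c[0], c[1] = 1.0, 2.0 * d * u
    for j in range(2, trunc):
        c[j] = (2 * u * (j + d - 1) * c[j - 1] - (j + 2 * d - 2) * c[j - 2]) / j
    return c

rng = np.random.default_rng(4)
T, trunc = 1000, 300
d, u = 0.3, np.cos(2 * np.pi / 12)   # long memory at a seasonal-type frequency
omega, alpha, beta = 0.1, 0.1, 0.8   # GARCH(1,1) noise parameters

# GARCH(1,1) innovations.
eps = np.zeros(T + trunc)
h = omega / (1 - alpha - beta)
for t in range(T + trunc):
    eps[t] = np.sqrt(h) * rng.normal()
    h = omega + alpha * eps[t] ** 2 + beta * h

# Truncated MA(infinity) representation of the Gegenbauer filter.
c = gegenbauer_weights(d, u, trunc)
x = np.array([c @ eps[t - trunc + 1:t + 1][::-1]
              for t in range(trunc - 1, T + trunc - 1)])
print(x.shape)  # (T,) simulated GIGARCH-type path
```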
  7. By: Dominique Guegan (Centre d'Economie de la Sorbonne and Paris School of Economics)
    Abstract: The detection of chaotic behavior in commodities, stock markets and weather data is usually complicated by the large noise perturbation inherent to the underlying system. It is well known that predictions from purely deterministic chaotic systems can be accurate mainly in the short term. It is therefore important to be able to reconstruct, in a robust way, the attractor in which the data evolve, if such an attractor exists. In chaos theory, deconvolution methods have been studied extensively, and the different existing approaches are competitive and complementary. In this work we apply two such methods: the singular value decomposition (SVD) method and the wavelet approach. The latter has not been investigated much for filtering chaotic systems. Using very large Monte Carlo simulations, we show the ability of this wavelet-based deconvolution method. We then use the de-noised data to forecast, and discuss in depth the possibility of long-term forecasting with chaotic systems.
    Keywords: Deconvolution, chaos, SVD, state space method, Wavelets method.
    JEL: C02 C32 C45 C53
    Date: 2008–01
    URL: http://d.repec.org/n?u=RePEc:mse:cesdoc:b08008&r=ets
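    The wavelet route can be sketched on a noisy logistic-map series, a standard chaotic toy system (the paper's SVD approach and its Monte Carlo design are not reproduced; the PyWavelets package is assumed):

```python
# Wavelet de-noising of a noisy chaotic series -- one of the two deconvolution
# approaches compared in the paper. Requires the PyWavelets package.
import numpy as np
import pywt

rng = np.random.default_rng(5)
T = 1024
x = np.empty(T)
x[0] = 0.4
for t in range(1, T):                 # logistic map, a standard chaotic system
    x[t] = 3.99 * x[t - 1] * (1 - x[t - 1])
y = x + 0.05 * rng.normal(size=T)     # noisy observations

# Soft-threshold the detail coefficients (universal threshold).
coeffs = pywt.wavedec(y, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise scale, finest level
thr = sigma * np.sqrt(2 * np.log(T))
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
x_hat = pywt.waverec(coeffs, "db4")[:T]

print("RMSE before/after de-noising:",
      np.sqrt(np.mean((y - x) ** 2)), np.sqrt(np.mean((x_hat - x) ** 2)))
```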
  8. By: McCAUSLAND, William J.; MILLER, Shirley; PELLETIER, Denis
    Abstract: We introduce a new method for drawing state variables in Gaussian state space models from their conditional distribution given parameters and observations. Unlike standard methods, our method does not involve Kalman filtering. We show that for some important cases, our method is computationally more efficient than standard methods in the literature. We consider two applications of our method.
    Keywords: State space models, Stochastic volatility, Count data
    Date: 2007
    URL: http://d.repec.org/n?u=RePEc:mtl:montde:2007-06&r=ets
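    The key idea is that, conditional on parameters and data, the states of a Gaussian state space model have a banded (here tridiagonal) precision matrix, so all states can be drawn at once from its Cholesky factor, with no Kalman recursions. A dense-algebra sketch for a local level model (in practice the speed-up comes from banded Cholesky routines):

```python
# Drawing all states at once in a local level model via the tridiagonal
# conditional precision matrix, shown with dense algebra for clarity.
import numpy as np
from scipy.linalg import cholesky, solve_triangular

rng = np.random.default_rng(6)
T, s_eps, s_eta = 200, 1.0, 0.3
x_true = np.cumsum(s_eta * rng.normal(size=T))
y = x_true + s_eps * rng.normal(size=T)

# Precision of x | y:  Omega = I/s_eps^2 + D'D/s_eta^2, with D the first-
# difference matrix (a diffuse prior on x_1 is implicit).
D = np.diff(np.eye(T), axis=0)                    # (T-1, T) difference matrix
Omega = np.eye(T) / s_eps**2 + D.T @ D / s_eta**2
c = y / s_eps**2                                  # Omega @ mean = c

L = cholesky(Omega, lower=True)                   # Omega = L L'
mean = solve_triangular(L.T, solve_triangular(L, c, lower=True))
draw = mean + solve_triangular(L.T, rng.normal(size=T))  # one posterior draw
print("posterior mean RMSE:", np.sqrt(np.mean((mean - x_true) ** 2)))
```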
  9. By: Ioannis Kasparis
    Abstract: We examine the limit properties of the Non-linear Least Squares (NLS) estimator under functional form misspecification in regression models with a unit root. Our theoretical framework is the same as that of Park and Phillips (Econometrica, 2001). We show that the limit behaviour of the NLS estimator is largely determined by the relative order of magnitude of the true and fitted models. If the estimated model is of a different order of magnitude than the true model, the estimator converges to boundary points. When the pseudo-true value is on a boundary, standard methods for obtaining rates of convergence and limit distribution results are not applicable. We provide convergence rates and limit distribution results when the pseudo-true value is an interior point. If functional form misspecification is committed in the presence of stochastic trends, the convergence rates can be slower and the limit distribution different from that obtained under correct specification.
    Keywords: Functional Form, Pseudo-true value, Unit root
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:ucy:cypeua:2-2008&r=ets
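    A toy simulation of the order-of-magnitude point, assuming a cube-root true regression function fitted by a (misspecified) linear model: with an integrated regressor, the fitted slope drifts towards the boundary point zero as the sample grows.

```python
# Misspecified least squares with an integrated regressor: the true link is
# x^(1/3), the fitted model is linear, and the slope estimate shrinks towards
# zero as T grows -- a toy version of the paper's boundary-point case.
import numpy as np

rng = np.random.default_rng(7)
for T in (200, 2000, 20000):
    x = np.cumsum(rng.normal(size=T))            # unit-root regressor
    y = np.cbrt(x) + rng.normal(size=T)          # true model: x^(1/3)
    X = np.column_stack([np.ones(T), x])         # fitted (wrong) model: linear
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    print(f"T={T:6d}  fitted slope={b[1]: .4f}")
```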
  10. By: Gregor W. Smith (Queen's University)
    Abstract: The new Keynesian Phillips curve (NKPC) restricts multivariate forecasts. I estimate and test it entirely within a panel of professional forecasts, thus using the time-series, cross-forecaster, and cross-horizon dimensions of the panel. Estimation uses 13,193 observations on quarterly US inflation forecasts since 1981. The main finding is a significantly larger weight on expected future inflation than on past inflation, a finding that is also estimated with much more precision than in the standard approach. Inflation dynamics are also stable over time, with no decline in inflation inertia from the 1980s to the 2000s. But, as in historical data, identifying the output gap is difficult.
    Keywords: forecast survey, new Keynesian Phillips curve
    JEL: E31 E37 C23
    Date: 2008–02
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1155&r=ets
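    The estimating equation has the hybrid NKPC form pi_t = gamma_f*E[pi_{t+1}] + gamma_b*pi_{t-1} + lambda*gap_t, pooled across forecasters and horizons. The sketch below is purely illustrative: the forecast panel is simulated so that the restriction holds by construction, and only the shape of the pooled regression mirrors the paper.

```python
# Pooled regression in the spirit of the paper: the hybrid NKPC applied
# across horizons of a (here simulated) panel of forecasts.
import numpy as np

rng = np.random.default_rng(8)
gf, gb, lam = 0.6, 0.4, 0.1          # true NKPC weights in the simulation
n_resp, n_t, H = 30, 100, 5          # forecasters, survey dates, horizons

rows_y, rows_X = [], []
for i in range(n_resp):
    for t in range(n_t):
        pi = 2 + 0.5 * rng.normal(size=H + 2)  # forecast path, horizons 0..H+1
        gap = 0.5 * rng.normal(size=H)
        for h in range(1, H):
            # impose the NKPC along the forecast path, plus noise
            yy = gf * pi[h + 1] + gb * pi[h - 1] + lam * gap[h] \
                 + 0.1 * rng.normal()
            rows_y.append(yy)
            rows_X.append([pi[h + 1], pi[h - 1], gap[h]])

b = np.linalg.lstsq(np.array(rows_X), np.array(rows_y), rcond=None)[0]
print("gamma_f, gamma_b, lambda:", b)
```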

This nep-ets issue is ©2008 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.