nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒07‒13
thirteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Automated model selection in finance: General-to-specific modelling of the mean and volatility specifications By Alvaro Escribano; Genaro Sucarrat
  2. Bootstrap LR tests of stationarity, common trends and cointegration By Fabio Busetti; Silvestro di Sanzo
  3. Bayesian Adaptive Hamiltonian Monte Carlo with an Application to High-Dimensional BEKK GARCH Models By Martin Burda; John Maheu
  4. Common Correlated Effects Estimation of Dynamic Panels with Cross-Sectional Dependence By T. DE GROOTE; G. EVERAERT
  5. Band Spectrum Regressions using Wavelet Analysis By Andersson, Fredrik N. G.
  6. Cycle Extraction: A Comparison of the Phase-Average Trend Method, the Hodrick-Prescott and Christiano-Fitzgerald Filters By Ronny Nilsson; Gyorgy Gyomai
  7. Limit Theorems for Empirical Processes Based on Dependent Data By Patrizia Berti; Luca Pratelli; Pietro Rigo
  8. Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation By Caporin, M.; McAleer, M.J.
  9. Monte Carlo Maximum Likelihood Estimation for Generalized Long-Memory Time Series Models By Geert Mesters; Siem Jan Koopman; Marius Ooms
  10. Sufficient information in structural VARs By Mario Forni; Luca Gambetti
  11. A goodness-of-fit test based on ranks for ARMA models. By Ferretti, Nélida E.; Kelmansky, Diana M.; Yohai, Victor J.
  12. Interpolation, outliers and inverse autocorrelations. By Peña, Daniel; Maravall, Agustín
  13. Testing for Multivariate Cointegration in the Presence of Structural Breaks: p-Values and Critical Values By David E. Giles; Ryan T. Godwin

  1. By: Alvaro Escribano (Universidad Carlos III de Madrid); Genaro Sucarrat (BI Norwegian School of Management)
    Abstract: General-to-Specific (GETS) modelling has witnessed major advances over the last decade thanks to the automation of multi-path GETS specification search. However, several scholars have argued that the estimation complexity associated with financial models constitutes an obstacle to multi-path GETS modelling in finance. Making use of a recent result on log-GARCH Models, we provide and study simple but general and flexible methods that automate financial multi-path GETS modelling. Starting from a general model where the mean specification can contain autoregressive (AR) terms and explanatory variables, and where the exponential volatility specification can include log-ARCH terms, asymmetry terms, volatility proxies and other explanatory variables, the algorithm we propose returns parsimonious mean and volatility specifications. The finite sample properties of the methods are studied by means of extensive Monte Carlo simulations, and two empirical applications suggest the methods are very useful in practice.
    Keywords: general-to-specific; specification search; model selection; finance; volatility
    JEL: C32 C51 C52 E44
    Date: 2011–06–23
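    The multi-path GETS search described in the abstract can be illustrated, in heavily simplified single-path form, as backward elimination on a linear mean equation: start general, repeatedly drop the least significant regressor, stop when every remaining |t|-ratio clears a threshold. Everything below (data, variable names, the threshold) is an illustrative assumption, not code from the paper.

```python
# Single-path general-to-specific (GETS) selection for an OLS mean equation.
import numpy as np

def gets_select(y, X, names, t_crit=2.0):
    """Backward-elimination GETS on columns of X (column 0 = intercept, never dropped)."""
    keep = list(range(X.shape[1]))            # original column indices still in the model
    while True:
        Xk = X[:, keep]
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        resid = y - Xk @ beta
        sigma2 = resid @ resid / (len(y) - len(keep))
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xk.T @ Xk)))
        tstats = np.abs(beta / se)
        # candidate deletions: every retained column except the intercept
        cand = [(t, i) for i, t in enumerate(tstats) if keep[i] != 0]
        t_min, i_min = min(cand)
        if t_min >= t_crit:                   # all remaining terms significant: stop
            return [names[j] for j in keep], beta
        del keep[i_min]                       # drop the least significant regressor

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.standard_normal((n, 3))])
y = X @ np.array([1.0, 2.0, 0.0, 0.0]) + rng.standard_normal(n)
kept, beta = gets_select(y, X, ["const", "x1", "x2", "x3"])
```

The paper's algorithm extends this idea with multi-path search (trying several deletion orders and arbitrating among terminal models) and applies it jointly to the mean and the log-GARCH volatility specification.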
  2. By: Fabio Busetti (Bank of Italy); Silvestro di Sanzo (Confcommercio)
    Abstract: The paper considers likelihood ratio (LR) tests of stationarity, common trends and cointegration for multivariate time series. As the distribution of these tests is not known, a bootstrap version is proposed via a state space representation. The bootstrap samples are obtained from the Kalman filter innovations under the null hypothesis. Monte Carlo simulations for the Gaussian univariate random walk plus noise model show that the bootstrap LR test achieves higher power for medium-sized deviations from the null hypothesis than a locally optimal and one-sided LM test, which has a known asymptotic distribution. The power gains of the bootstrap LR test are significantly larger for testing the hypothesis of common trends and cointegration in multivariate time series, as the alternative asymptotic procedure (obtained as an extension of the LM test of stationarity) does not possess properties of optimality. Finally, it is shown that the (pseudo) LR tests maintain good size and power properties also for non-Gaussian series. As an empirical illustration, we find evidence of two common stochastic trends in the volatility of the US dollar exchange rate against European and Asian/Pacific currencies.
    Keywords: Kalman filter, state-space models, unit roots
    JEL: C12 C22
    Date: 2011–03
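    The parametric-bootstrap logic behind the paper's LR test (fit the null model, re-simulate under it, recompute the statistic, compare) can be sketched generically. The sketch below substitutes a simple KPSS-type partial-sum statistic for the paper's Kalman-filter-based LR statistic, so it only illustrates the bootstrap mechanics, not the actual test; all names and data are illustrative.

```python
# Bootstrap p-value sketch: the null here is level-stationarity, and the
# KPSS-type statistic is location- and scale-invariant, so simulating
# i.i.d. standard-normal noise is a valid null resampling scheme.
import numpy as np

def kpss_stat(y):
    """Partial-sum stationarity statistic (level-stationary null)."""
    u = y - y.mean()
    s = np.cumsum(u)
    return (s @ s) / (len(y)**2 * u.var())

def bootstrap_pvalue(y, stat=kpss_stat, B=999, seed=0):
    """Fraction of null-simulated statistics at least as large as the observed one."""
    rng = np.random.default_rng(seed)
    observed = stat(y)
    sims = np.array([stat(rng.standard_normal(len(y))) for _ in range(B)])
    return (1 + np.sum(sims >= observed)) / (B + 1)

rng = np.random.default_rng(3)
p_stationary = bootstrap_pvalue(rng.standard_normal(300))     # white noise: null true
p_rw = bootstrap_pvalue(np.cumsum(rng.standard_normal(300)))  # random walk: null false
```

The random-walk series produces a large statistic relative to the bootstrap distribution, so its p-value is tiny, while the white-noise series does not.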
  3. By: Martin Burda; John Maheu
    Abstract: Hamiltonian Monte Carlo (HMC) is a recent statistical procedure to sample from complex distributions. Distant proposal draws are taken in a sequence of steps following the Hamiltonian dynamics of the underlying parameter space, often yielding superior mixing properties of the resulting Markov chain. However, its performance can deteriorate sharply with the degree of irregularity of the underlying likelihood, due to its lack of local adaptability in the parameter space. Riemann Manifold HMC (RMHMC), a locally adaptive version of HMC, alleviates this problem, but at a substantially increased computational cost that can become prohibitive in high-dimensional scenarios. In this paper we propose Adaptive HMC (AHMC), an alternative inferential method based on HMC that is both fast and locally adaptive, combining the advantages of both HMC and RMHMC. The benefits become more pronounced with higher dimensionality of the parameter space and with the degree of irregularity of the underlying likelihood surface. We show that AHMC satisfies detailed balance for a valid MCMC scheme and provide a comparison with RMHMC in terms of effective sample size, highlighting substantial efficiency gains of AHMC. Simulation examples and an application to the BEKK GARCH model show the usefulness of the new posterior sampler.
    Keywords: High-dimensional joint sampling; Markov chain Monte Carlo; Multivariate GARCH
    JEL: C01 C11 C15 C32
    Date: 2011–06–21
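    For readers unfamiliar with the baseline algorithm, a minimal plain-HMC sampler can be sketched: leapfrog integration of the Hamiltonian dynamics followed by a Metropolis accept/reject step, which is what preserves detailed balance. This is generic HMC on a toy Gaussian target, not the paper's adaptive AHMC; step size, path length and target are illustrative choices.

```python
# Plain Hamiltonian Monte Carlo on a 2-D standard-normal target.
import numpy as np

def hmc(logp_grad, logp, x0, n_samples=2000, eps=0.2, L=20, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    out = []
    for _ in range(n_samples):
        p = rng.standard_normal(x.shape)              # momentum refresh
        x_new, p_new = x.copy(), p.copy()
        # leapfrog integration: half momentum step, L position steps, half momentum step
        p_new += 0.5 * eps * logp_grad(x_new)
        for _ in range(L - 1):
            x_new += eps * p_new
            p_new += eps * logp_grad(x_new)
        x_new += eps * p_new
        p_new += 0.5 * eps * logp_grad(x_new)
        # Metropolis correction on the total Hamiltonian preserves detailed balance
        log_accept = (logp(x_new) - 0.5 * p_new @ p_new) - (logp(x) - 0.5 * p @ p)
        if np.log(rng.uniform()) < log_accept:
            x = x_new
        out.append(x.copy())
    return np.array(out)

logp = lambda x: -0.5 * x @ x   # standard normal, up to a constant
grad = lambda x: -x
samples = hmc(grad, logp, np.zeros(2))
```

RMHMC replaces the fixed mass matrix implicit in the momentum draw above with a position-dependent metric, which is where the extra cost that AHMC avoids comes from.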
  4. By: T. DE GROOTE; G. EVERAERT
    Abstract: We study estimation of dynamic panel data models with error cross-sectional dependence generated by an unobserved common factor. We show that for a temporally dependent factor, the standard within groups (WG) estimator is inconsistent even as both N and T tend to infinity. Next we investigate the properties of the common correlated effects pooled (CCEP) estimator of Pesaran [Econometrica, 2006], which eliminates the cross-sectional dependence using cross-sectional averages of the data. In contrast to the static case, the CCEP estimator is only consistent if, next to N, also T tends to infinity. It is shown that for the most relevant parameter settings, the asymptotic bias of the CCEP estimator is larger than that of the infeasible WG estimator, which includes the common factors as regressors. Restricting the CCEP estimator results in a somewhat smaller asymptotic bias. The small sample properties of the various estimators are analysed using Monte Carlo experiments. The simulation results suggest that the CCEP estimator can be used to estimate dynamic panel data models provided T is not too small. The size of N is of less importance.
    Keywords: Cross-Sectional Dependence; Dynamic Panel; Common Correlated Effects
    JEL: C13 C15 C23
    Date: 2011–06
  5. By: Andersson, Fredrik N. G. (Department of Economics, Lund University)
    Abstract: In economics it is common to distinguish between different time horizons (i.e. short run, medium run, and long run). Engle (1974) proposed combining the discrete Fourier transform with a band spectrum regression to estimate models that separate different time horizons. In this paper we discuss the possibilities and challenges of using the maximal overlap discrete wavelet transform instead of the Fourier transform when estimating band spectrum regressions.
    Keywords: band spectrum regression; wavelet transform; frequency domain; economic modeling
    JEL: C14 C32 C51
    Date: 2011–06–29
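    Engle's Fourier-based band spectrum regression, which the paper takes as its starting point, can be sketched in a few lines: transform regressand and regressor to the frequency domain, keep only the Fourier coordinates in a chosen band, and run least squares on them. The band edges and simulated series below are illustrative assumptions, not from the paper.

```python
# Band spectrum OLS via the real FFT: the slope can differ across
# frequency bands, which is exactly what the band restriction recovers.
import numpy as np

def band_spectrum_ols(y, x, low, high):
    """OLS of y on x using only Fourier frequencies in [low, high) cycles/sample."""
    n = len(y)
    freqs = np.fft.rfftfreq(n)                 # frequencies in cycles per sample
    mask = (freqs >= low) & (freqs < high)
    Y = np.fft.rfft(y)[mask]
    X = np.fft.rfft(x)[mask]
    # complex least squares over the band: Re(sum conj(X)*Y) / sum |X|^2
    return np.real(np.vdot(X, Y)) / np.real(np.vdot(X, X))

rng = np.random.default_rng(1)
n = 1024
t = np.arange(n)
x_slow = np.cos(2 * np.pi * t / 128)           # low-frequency component
x_fast = np.cos(2 * np.pi * t / 8)             # high-frequency component
x = x_slow + x_fast
# y loads on the slow component with coefficient 3 and on the fast with 0.5
y = 3 * x_slow + 0.5 * x_fast + 0.1 * rng.standard_normal(n)
beta_low = band_spectrum_ols(y, x, 0.0, 0.05)   # band containing 1/128
beta_high = band_spectrum_ols(y, x, 0.05, 0.5)  # band containing 1/8
```

The two estimates recover the band-specific slopes (about 3 and 0.5). The maximal overlap discrete wavelet transform discussed in the paper replaces the sharp Fourier band split above with scale-by-scale decompositions.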
  6. By: Ronny Nilsson; Gyorgy Gyomai
    Abstract: This paper reports on revision properties of different de-trending and smoothing methods (cycle estimation methods), including PAT with MCD smoothing, a double Hodrick-Prescott (HP) filter and the Christiano-Fitzgerald (CF) filter. The different cycle estimation methods are rated on their revision performance in a simulated real-time experiment. Our goal is to find a robust method that gives early and stable turning point signals. The revision performance of the methods has been evaluated according to bias, overall revision size and signal stability measures. In a second phase, we investigate whether revision performance is improved using stabilizing forecasts or by changing the cycle estimation window from the baseline 6 and 96 months (i.e. filtering out high-frequency noise with a cycle length shorter than 6 months and removing trend components with a cycle length longer than 96 months) to 12 and 120 months. The results show that, for all tested time series, the PAT de-trending method is outperformed by both the HP and CF filters. In addition, the results indicate that the HP filter outperforms the CF filter in turning point signal stability but has a weaker performance in absolute numerical precision. Short-horizon stabilizing forecasts tend to improve the revision characteristics of both methods, and the changed filter window also delivers more robust turning point estimates.
    Date: 2011–05–27
  7. By: Patrizia Berti (Department of Mathematics, University of Modena and Reggio Emilia); Luca Pratelli (Accademia Navale di Livorno); Pietro Rigo (Department of Economics and Quantitative Methods, University of Pavia)
    Abstract: Empirical processes for non-ergodic data are investigated under uniform distance. Some CLTs, both uniform and non-uniform, are proved. In particular, conditions are given for Bn = n^(1/2) (µn - bn) and Cn = n^(1/2) (µn - an) to converge in distribution, where µn is the empirical measure, an the predictive measure, and bn = (1/n) sum_{i=0}^{n-1} ai. Such conditions can be applied to any adapted sequence of random variables. Various examples and a characterization of conditionally identically distributed sequences are given as well.
    Keywords: Conditional identity in distribution, empirical process, exchangeability, predictive measure, stable convergence.
    Date: 2010–11
  8. By: Caporin, M.; McAleer, M.J.
    Abstract: In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models, namely BEKK, DCC, Corrected DCC (cDCC) of Aielli (2008), CCC, Exponentially Weighted Moving Average, and covariance shrinking, using historical data of 89 US equities. Our methods follow part of the approach described in Patton and Sheppard (2009), and the paper contributes to the literature in several directions. First, we consider a wide range of models, including the recent cDCC model and covariance shrinking. Second, we use a range of tests and approaches for direct and indirect model comparison, including the Weighted Likelihood Ratio test of Amisano and Giacomini (2007). Third, we examine how the model rankings are influenced by the cross-sectional dimension of the problem.
    Keywords: covariance forecasting; model confidence set; model ranking; MGARCH; model comparison
    Date: 2011–05–31
  9. By: Geert Mesters (Netherlands Institute for the Study of Crime and Law Enforcement); Siem Jan Koopman (VU University Amsterdam); Marius Ooms (VU University Amsterdam)
    Abstract: An exact maximum likelihood method is developed for the estimation of parameters in a non-Gaussian nonlinear log-density function that depends on a latent Gaussian dynamic process with long-memory properties. Our method relies on the method of importance sampling and on a linear Gaussian approximating model from which the latent process can be simulated. Given the presence of a latent long-memory process, we require a modification of the importance sampling technique. In particular, the long-memory process needs to be approximated by a finite dynamic linear process. Two possible approximations are discussed and compared. We show that an autoregression obtained from minimizing mean squared prediction errors leads to an effective and feasible method. In our empirical study we analyze ten log-return series from the S&P 500 stock index by univariate and multivariate long-memory stochastic volatility models.
    Keywords: Fractional Integration; Importance Sampling; Kalman Filter; Latent Factors; Stochastic Volatility
    JEL: C33 C43
    Date: 2011–06–27
  10. By: Mario Forni; Luca Gambetti
    Abstract: We derive necessary and sufficient conditions under which a set of variables is informationally sufficient, i.e. contains enough information to estimate the structural shocks with a VAR model. Based on such conditions, we provide a procedure to test for informational sufficiency. If sufficiency is rejected, we propose a strategy to amend the VAR. Our method can be applied to FAVAR models and can be used to determine how many factors to include in such models. We apply our procedure to a VAR including TFP, unemployment and per-capita hours worked. We find that the three variables are not informationally sufficient. When adding missing information, the effects of technology shocks change dramatically.
    Keywords: Structural VAR; non-fundamentalness; information; FAVAR models; technology shocks
    JEL: C32 E32 E62
    Date: 2011–06
  11. By: Ferretti, Nélida E.; Kelmansky, Diana M.; Yohai, Victor J.
    Abstract: In this paper we introduce a goodness-of-fit test based on ranks for ARMA models. The classical portmanteau statistic is generalized to a class of estimators based on ranks. The asymptotic distributions of the proposed statistics are derived. Simulation results suggest that the proposed statistics have good robustness properties for an adequate choice of the score functions.
    Keywords: ARMA models; Ranks; Goodness-of-fit;
  12. By: Peña, Daniel; Maravall, Agustín
    Abstract: The paper addresses the problem of estimating missing observations in linear, possibly nonstationary, stochastic processes when the model is known. The general case of any possible distribution of missing observations in the time series is considered, and analytical expressions for the optimal estimators and their associated mean squared errors are obtained. These expressions involve solely the elements of the inverse or dual autocorrelation function of the series. This optimal estimator (the conditional expectation of the missing observations given the available ones) is equal to the estimator that results from filling the missing values in the series with arbitrary numbers, treating these numbers as additive outliers, and removing the outlier effects from the invented numbers using intervention analysis.
    Keywords: Missing observations; Outliers; Intervention analysis; ARIMA models; Inverse autocorrelation function;
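    The result can be checked numerically in the simplest case: a single missing value in a known AR(1) process, where the inverse autocorrelations imply the closed-form interpolator phi*(x_{t-1} + x_{t+1})/(1 + phi^2) with interpolation error variance sigma^2/(1 + phi^2). The simulation below is an illustrative check under those textbook AR(1) assumptions, not code from the paper.

```python
# Optimal (MMSE) interpolation of a missing AR(1) value from its neighbours,
# x_t = phi * x_{t-1} + e_t with e_t standard normal.
import numpy as np

def ar1_interpolate(x_prev, x_next, phi):
    """Conditional expectation of a missing AR(1) value given the rest of the series."""
    return phi * (x_prev + x_next) / (1.0 + phi**2)

rng = np.random.default_rng(2)
phi, n = 0.8, 100_000
e = rng.standard_normal(n)
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

# Treat every interior point in turn as "missing" and interpolate it; the
# interpolation error variance should be close to sigma^2 / (1 + phi^2).
xhat = ar1_interpolate(x[:-2], x[2:], phi)
err = x[1:-1] - xhat
print(err.var())   # approximately 1 / (1 + 0.8**2), about 0.61
```

For higher-order or nonstationary models the two-neighbour formula no longer suffices, which is where the paper's general expressions in terms of the inverse autocorrelation function come in.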
  13. By: David E. Giles (Department of Economics, University of Victoria); Ryan T. Godwin
    Abstract: Testing for multivariate cointegration when the data exhibit structural breaks is a problem that is encountered frequently in empirical economic analysis. The standard tests must be modified in this situation, and the asymptotic distributions of the test statistics change accordingly. We supply code that allows practitioners to easily calculate both p-values and critical values for the trace tests of Johansen et al. (2000). Access is also provided to tables of critical values for a broad selection of situations.
    Keywords: Cointegration, structural breaks, trace test, p-values, critical values
    JEL: C12 C32 C87
    Date: 2011–07–04

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.