nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒12‒11
eleven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Vector Autoregressive Moving Average Identification for Macroeconomic Modeling: Algorithms and Theory By D.S. Poskitt
  2. Recursive linear estimation for discrete time systems in the presence of different multiplicative observation noises By Carlos Sánchez-González; Tere M. García-Muñoz
  3. Inconsistency of a Unit Root Test against Stochastic Unit Root Processes By Daisuke Nagakura
  4. Forecasting Macroeconomic Time Series With Locally Adaptive Signal Extraction By Giordani, Paolo; Villani, Mattias
  5. How do you make a time series sing like a choir? Using the Hilbert-Huang transform to extract embedded frequencies from economic or financial time series By Crowley, Patrick M
  6. Forecasting Realized Volatility with Linear and Nonlinear Models By McAleer, M.; Medeiros, M.C.
  7. A Robust Criterion for Determining the Number of Factors in Approximate Factor Models By Lucia Alessi; Matteo Barigozzi; Marco Capasso
  8. Nuisance parameters, composite likelihoods and a panel of GARCH models By Cavit Pakel; Neil Shephard; Kevin Sheppard
  9. Asymptotic behaviour of the CUSUM of squares test under stochastic and deterministic time trends By Jouni Sohkanen; B. Nielsen
  10. Test for cointegration rank in general vector autoregressions By B. Nielsen
  11. Estimation and forecasting in large datasets with conditionally heteroskedastic dynamic common factors By Lucia Alessi; Matteo Barigozzi; Marco Capasso

  1. By: D.S. Poskitt
    Abstract: This paper develops a new methodology for identifying the structure of VARMA time series models. The analysis proceeds by examining the echelon canonical form and presents a fully automatic data driven approach to model specification using a new technique to determine the Kronecker invariants. A novel feature of the inferential procedures developed here is that they work in terms of a canonical scalar ARMAX representation in which the exogenous regressors are given by predetermined contemporaneous and lagged values of other variables in the VARMA system. This feature facilitates the construction of algorithms which, from the perspective of macroeconomic modeling, are efficacious in that they do not use AR approximations at any stage. Algorithms that are applicable to both asymptotically stationary and unit-root, partially nonstationary (cointegrated) time series models are presented. A sequence of lemmas and theorems show that the algorithms are based on calculations that yield strongly consistent estimates.
    Keywords: Algorithms, asymptotically stationary and cointegrated time series, echelon canonical form
    JEL: C32 C52 C63 C87
    Date: 2009–11–12
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2009-12&r=ets
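    Sketch: the Python fragment below only fits a small VARMA model with statsmodels' VARMAX, to illustrate the model class the paper deals with; the echelon-form / Kronecker-invariant identification algorithms of the paper are not implemented, and the simulated data are purely illustrative.
      import numpy as np
      from statsmodels.tsa.statespace.varmax import VARMAX

      rng = np.random.default_rng(0)
      T, k = 400, 2
      A = np.array([[0.5, 0.1], [0.0, 0.3]])      # illustrative VAR(1) dynamics
      y = np.zeros((T, k))
      for t in range(1, T):
          y[t] = A @ y[t - 1] + rng.normal(size=k)

      # Fit a VARMA(1,1); statsmodels warns that unrestricted VARMA(p,q) estimation
      # is not generically identified, which is the problem the paper addresses.
      res = VARMAX(y, order=(1, 1), trend="c").fit(disp=False)
      print(res.summary())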
  2. By: Carlos Sánchez-González (Department of Economic Theory and Economic History, University of Granada.); Tere M. García-Muñoz (Department of Economic Theory and Economic History, University of Granada.)
    Abstract: This paper describes a design for a least mean square error estimator in discrete time systems where the components of the state vector, in the measurement equation, are corrupted by different multiplicative noises in addition to observation noise. We show how known results can be considered a particular case of the algorithm stated in this paper.
    Keywords: State estimation, multiplicative noise, uncertain observations
    Date: 2009–11–27
    URL: http://d.repec.org/n?u=RePEc:gra:wpaper:09/09&r=ets
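    Sketch: the abstract does not spell out the algorithm, so the Python fragment below implements only a classical special case: a Kalman-type least-mean-square recursion for a scalar state observed through a Bernoulli multiplicative noise ("uncertain observations"), with all parameter values chosen for illustration.
      import numpy as np

      rng = np.random.default_rng(1)
      T = 200
      A, Q = 0.9, 1.0   # state transition and state-noise variance
      H, R = 1.0, 0.5   # observation coefficient and additive-noise variance
      p = 0.8           # mean of the Bernoulli multiplicative noise

      # Simulate x_t = A x_{t-1} + w_t and y_t = gamma_t H x_t + v_t.
      x, y = np.zeros(T), np.zeros(T)
      for t in range(1, T):
          x[t] = A * x[t - 1] + np.sqrt(Q) * rng.normal()
          y[t] = rng.binomial(1, p) * H * x[t] + np.sqrt(R) * rng.normal()

      # Recursive estimator: usual prediction step; the update uses the first two
      # moments of the multiplicative noise (E[gamma] = p, Var[gamma] = p(1-p)).
      xhat, P = 0.0, 1.0
      for t in range(1, T):
          xpred, Ppred = A * xhat, A * P * A + Q
          nu = y[t] - p * H * xpred
          S = p * H * Ppred * H + p * (1 - p) * (H * xpred) ** 2 + R
          K = p * Ppred * H / S
          xhat, P = xpred + K * nu, Ppred - K * S * K
      print("final state estimate:", xhat)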
  3. By: Daisuke Nagakura (Institute for Monetary and Economic Studies, Bank of Japan (E-mail: daisuke.nagakura@boj.or.jp))
    Abstract: In this paper, we develop the asymptotic theory of Hwang and Basawa (2005) for explosive random coefficient autoregressive (ERCA) models. Applying the theory, we prove that a locally best invariant (LBI) test in McCabe and Tremayne (1995), which is for the null of a unit root (UR) process against the alternative of a stochastic unit root (STUR) process, is inconsistent against a class of ERCA models. This class includes a class of STUR processes as special cases. We show, however, that the well-known Dickey-Fuller (DF) UR tests and an LBI test of Lee (1998) are consistent against a particular case of this class of ERCA models.
    Keywords: Locally Best Invariant Test, Consistency, Dickey-Fuller Test, LBI, RCA, STUR
    JEL: C12
    Date: 2009–10
    URL: http://d.repec.org/n?u=RePEc:ime:imedps:09-e-23&r=ets
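    Sketch: the Python fragment below simulates a stochastic-unit-root (random-coefficient AR(1)) path and applies a Dickey-Fuller test via statsmodels; the data-generating design is an illustrative assumption, not the paper's.
      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      rng = np.random.default_rng(2)
      T = 500
      # STUR-type process: y_t = (1 + delta_t) y_{t-1} + eps_t with E[delta_t] = 0,
      # so the autoregressive root equals one "on average".
      delta = 0.05 * rng.normal(size=T)
      eps = rng.normal(size=T)
      y = np.zeros(T)
      for t in range(1, T):
          y[t] = (1.0 + delta[t]) * y[t - 1] + eps[t]

      stat, pvalue, *_ = adfuller(y, regression="c")
      print(f"ADF statistic = {stat:.3f}, p-value = {pvalue:.3f}")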
  4. By: Giordani, Paolo (Research Department, Central Bank of Sweden); Villani, Mattias (Research Department, Central Bank of Sweden)
    Abstract: We introduce a non-Gaussian dynamic mixture model for macroeconomic forecasting. The Locally Adaptive Signal Extraction and Regression (LASER) model is designed to capture relatively persistent AR processes (signal) contaminated by high frequency noise. The distribution of the innovations in both noise and signal is robustly modeled using mixtures of normals. The mean of the process and the variances of the signal and noise are allowed to shift suddenly or gradually at unknown locations and number of times. The model is then capable of capturing movements in the mean and conditional variance of a series as well as in the signal-to-noise ratio. Four versions of the model are used to forecast six quarterly US and Swedish macroeconomic series. We conclude that (i) allowing for infrequent and large shifts in mean while imposing normal iid errors often leads to erratic forecasts, (ii) such shift/break versions of the model can forecast well if robustified by allowing for non-normal errors and time varying variances, (iii) infrequent and large shifts in error variances outperform smooth and continuous shifts substantially when it comes to interval coverage, (iv) for point forecasts, robust time varying specifications improve slightly upon fixed parameter specifications on average, but the relative performances can differ sizably in various sub-samples, and (v) for interval forecasts, robust versions that allow for infrequent shifts in variances perform substantially and consistently better than time invariant specifications.
    Keywords: Bayesian inference; Forecast evaluation; Regime switching; State-space modeling; Dynamic mixture models
    JEL: C11 C53
    Date: 2009–10–01
    URL: http://d.repec.org/n?u=RePEc:hhs:rbnkwp:0234&r=ets
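    Sketch: the Python fragment below only simulates the kind of series LASER targets (a persistent AR(1) signal plus high-frequency noise, mixture-of-normals innovations, one abrupt level shift); all parameter values are assumptions and the paper's Bayesian estimation procedure is not reproduced.
      import numpy as np

      rng = np.random.default_rng(3)
      T = 300

      def mixture_normal(size, p_outlier=0.05, scale_outlier=5.0):
          """Two-component normal mixture: occasional large shocks."""
          outlier = rng.random(size) < p_outlier
          return np.where(outlier, scale_outlier, 1.0) * rng.normal(size=size)

      signal = np.zeros(T)
      for t in range(1, T):
          signal[t] = 0.8 * signal[t - 1] + 0.3 * mixture_normal(1)[0]

      shift = np.where(np.arange(T) >= 200, 2.0, 0.0)   # one level shift in the mean
      noise = 0.5 * mixture_normal(T)                   # high-frequency noise
      y = shift + signal + noise                        # observed series
      print(y[:5])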
  5. By: Crowley, Patrick M (College of Business, Texas A&M University)
    Abstract: The Hilbert-Huang transform (HHT) was developed late last century but has yet to be introduced to the vast majority of economists. The HHT is a way of extracting the frequency mode features of cycles embedded in any time series using an adaptive data method that can be applied without making any assumptions about stationarity or linear data-generating properties. This paper introduces economists to the two constituent parts of the HHT, namely empirical mode decomposition (EMD) and Hilbert spectral analysis. Illustrative applications using HHT are also made to two financial and three economic time series.
    Keywords: business cycles; growth cycles; Hilbert-Huang transform (HHT); empirical mode decomposition (EMD); economic time series; non-stationarity; spectral analysis
    JEL: C49 E00
    Date: 2009–11–21
    URL: http://d.repec.org/n?u=RePEc:hhs:bofrdp:2009_032&r=ets
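    Sketch: the Python fragment below shows the two HHT ingredients in bare-bones form: one EMD sifting pass (subtracting the mean of the extrema envelopes) and the Hilbert transform giving an instantaneous frequency. Stopping rules, boundary handling and full IMF extraction are omitted, and the test signal is illustrative.
      import numpy as np
      from scipy.interpolate import CubicSpline
      from scipy.signal import argrelextrema, hilbert

      rng = np.random.default_rng(4)
      t = np.arange(512)
      y = (np.sin(2 * np.pi * t / 16) + 0.5 * np.sin(2 * np.pi * t / 128)
           + 0.1 * rng.normal(size=t.size))

      def sift_once(x):
          """One sifting iteration: subtract the mean of the extrema envelopes."""
          imax = argrelextrema(x, np.greater)[0]
          imin = argrelextrema(x, np.less)[0]
          grid = np.arange(x.size)
          upper = CubicSpline(imax, x[imax])(grid)
          lower = CubicSpline(imin, x[imin])(grid)
          return x - 0.5 * (upper + lower)

      imf1 = sift_once(y)                        # rough proxy for the first IMF
      phase = np.unwrap(np.angle(hilbert(imf1)))
      inst_freq = np.diff(phase) / (2 * np.pi)   # cycles per observation
      print("mean instantaneous frequency:", inst_freq.mean())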
  6. By: McAleer, M.; Medeiros, M.C. (Erasmus Econometric Institute)
    Abstract: In this paper we consider a nonlinear model based on neural networks as well as linear models to forecast the daily volatility of the S&P 500 and FTSE 100 indexes. As a proxy for daily volatility, we consider a consistent and unbiased estimator of the integrated volatility that is computed from high frequency intra-day returns. We also consider a simple algorithm based on bagging (bootstrap aggregation) in order to specify the models analyzed in the paper.
    Keywords: financial econometrics;volatility forecasting;neural networks;nonlinear models;realized volatility;bagging
    Date: 2009–11–24
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765017303&r=ets
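    Sketch: the Python fragment below computes realized volatility as the daily sum of squared intraday returns and produces a bagged linear one-step forecast of log-RV; the returns are simulated placeholders and the paper's neural-network specifications are not reproduced.
      import numpy as np

      rng = np.random.default_rng(5)
      n_days, n_intraday = 250, 78
      intraday_returns = 0.01 * rng.normal(size=(n_days, n_intraday))  # placeholder data

      # Realized volatility: sum of squared intraday returns, day by day.
      rv = (intraday_returns ** 2).sum(axis=1)
      log_rv = np.log(rv)

      # Linear AR(1) forecast of log-RV, combined by bagging (bootstrap aggregation).
      x, y = log_rv[:-1], log_rv[1:]
      preds = []
      for _ in range(200):
          idx = rng.integers(0, x.size, size=x.size)     # bootstrap resample
          beta = np.polyfit(x[idx], y[idx], deg=1)       # OLS on the resample
          preds.append(np.polyval(beta, log_rv[-1]))     # one-step-ahead forecast
      print("bagged one-step log-RV forecast:", np.mean(preds))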
  7. By: Lucia Alessi; Matteo Barigozzi; Marco Capasso
    Abstract: We modify the criterion by Bai and Ng (2002) for determining the number of factors in approximate factor models. As in the original criterion, for any given number of factors we estimate the common and idiosyncratic components of the model by applying principal component analysis. We select the true number of factors as the number that minimizes the variance explained by the idiosyncratic component. In order to avoid overparametrization, minimization is subject to penalization. At this step, we modify the original procedure by multiplying the penalty function by a positive real number, which allows us to tune its penalizing power, by analogy with the method used by Hallin and Liška (2007) in the frequency domain. The contribution of this paper is twofold. First, our criterion retains the asymptotic properties of the original criterion, but corrects its tendency to overestimate the true number of factors. Second, we provide a computationally easy way to implement the new method by iteratively applying the original criterion. Monte Carlo simulations show that our criterion is in general more robust than the original one. A better performance is achieved in particular in the case of large idiosyncratic disturbances. These conditions are the most difficult for detecting a factor structure but are not unusual in the empirical context. Two applications on a macroeconomic and a financial dataset are also presented.
    Keywords: Approximate factor models, information criterion, model selection.
    JEL: C52
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2009_023&r=ets
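    Sketch: the Python function below implements the core modification described in the abstract, a Bai-Ng-type information criterion whose penalty is multiplied by a constant c; the penalty used is one of the standard Bai-Ng choices, and the paper's full procedure for choosing c (stability across subsamples) is omitted.
      import numpy as np

      def ic_number_of_factors(X, kmax, c=1.0):
          """Number of factors minimising log V(k) + c * k * penalty(N, T)."""
          T, N = X.shape
          X = X - X.mean(axis=0)
          _, _, Vt = np.linalg.svd(X, full_matrices=False)
          penalty = (N + T) / (N * T) * np.log(min(N, T))
          best_k, best_ic = 0, np.inf
          for k in range(1, kmax + 1):
              common = X @ Vt[:k].T @ Vt[:k]      # common component from k PCs
              V = np.mean((X - common) ** 2)      # idiosyncratic variance
              ic = np.log(V) + c * k * penalty
              if ic < best_ic:
                  best_k, best_ic = k, ic
          return best_k

      # Example on simulated data with three true factors.
      rng = np.random.default_rng(6)
      F, L = rng.normal(size=(200, 3)), rng.normal(size=(100, 3))
      X = F @ L.T + rng.normal(size=(200, 100))
      print(ic_number_of_factors(X, kmax=10, c=1.0))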
  8. By: Cavit Pakel (Department of Economics and Oxford-Man Institute, University of Oxford, Oxford); Neil Shephard (Oxford-Man Institute and Department of Economics, University of Oxford, Oxford); Kevin Sheppard (Oxford-Man Institute and Department of Economics, University of Oxford, Oxford)
    Abstract: We investigate the properties of the composite likelihood (CL) method for (T ×N_T ) GARCH panels. The defining feature of a GARCH panel with time series length T is that, while nuisance parameters are allowed to vary across N_T series, other parameters of interest are assumed to be common. CL pools information across the panel instead of using information available in a single series only. Simulations and empirical analysis illustrate that, for reasonably large T, CL performs well. However, due to the estimation error introduced through nuisance parameter estimation, CL is subject to the “incidental parameter” problem for small T.
    Keywords: ARCH models; composite likelihood; nuisance parameters; panel data
    JEL: C01 C14 C32
    Date: 2009–10–02
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0912&r=ets
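    Sketch: the Python fragment below pools Gaussian GARCH(1,1) log-likelihoods across a small panel, keeping (alpha, beta) common and treating the intercepts as series-specific nuisance parameters; it is a simplified illustration of the pooling idea, not the paper's estimator, and the data are placeholders.
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(7)
      N, T = 5, 300
      returns = rng.normal(size=(N, T))      # placeholder panel of returns

      def pooled_negloglik(params, panel):
          alpha, beta = params[:2]
          omegas = params[2:]                # one nuisance intercept per series
          total = 0.0
          for omega, y in zip(omegas, panel):
              h = np.var(y)                  # initialise the conditional variance
              for t in range(1, y.size):
                  h = omega + alpha * y[t - 1] ** 2 + beta * h
                  total += 0.5 * (np.log(2 * np.pi * h) + y[t] ** 2 / h)
          return total

      x0 = np.concatenate(([0.05, 0.90], np.full(N, 0.05)))
      bounds = [(1e-4, 0.3), (0.5, 0.999)] + [(1e-6, None)] * N
      res = minimize(pooled_negloglik, x0, args=(returns,), bounds=bounds,
                     method="L-BFGS-B")
      print("common (alpha, beta):", res.x[:2])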
  9. By: Jouni Sohkanen (Dept of Economics, University of Oxford, Oxford); B. Nielsen (Nuffield College, University of Oxford, Oxford.)
    Abstract: We undertake a generalization of the cumulative sum of squares (CUSQ) test to the case of non-stationary autoregressive distributed lag models with quite general deterministic time trends. The test may be validly implemented with either ordinary least squares residuals or standardized forecast errors. Simulations suggest that there is little at stake in the choice between the two in the unit root case under Gaussian innovations, and that there is only very modest variation in the finite sample distribution across the parameter space.
    Date: 2009–08–31
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0909&r=ets
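    Sketch: the Python fragment below computes the CUSUM-of-squares statistic from the residuals of a fitted autoregression (the maximal deviation of the scaled cumulative sum of squared residuals from its linear trend); the critical values and the trend/unit-root asymptotics studied in the paper are not reproduced.
      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      rng = np.random.default_rng(8)
      y = np.cumsum(rng.normal(size=400))        # illustrative unit-root series

      e = AutoReg(y, lags=1, trend="c").fit().resid
      cusq = np.cumsum(e ** 2) / np.sum(e ** 2)  # scaled cumulative sum of squares
      t = np.arange(1, e.size + 1) / e.size
      statistic = np.max(np.abs(cusq - t))       # maximal deviation from the trend line
      print("CUSQ statistic:", statistic)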
  10. By: B. Nielsen (Dept of Economics and Nuffield College, University of Oxford, Oxford.)
    Abstract: Johansen derived the asymptotic theory for his cointegration rank test statistic for a vector autoregression where the parameters are restricted so that the process is integrated of order one. It is investigated to what extent these parameter restrictions are binding. The eigenvalues of Johansen’s eigenvalue problem are shown to have the same consistency rates across the parameter space. The test statistic is shown to have the usual asymptotic distribution as long as the possibilities of additional unit roots and of singular explosiveness are ruled out. To prove the results the convergence of stochastic integrals with respect to singular explosive processes is considered.
    Date: 2009–09–22
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:0910&r=ets
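    Sketch: the Python fragment below computes the standard Johansen trace statistics, whose behaviour across the parameter space the paper analyses, on a simulated bivariate system with one cointegrating relation; the simulated design is illustrative only.
      import numpy as np
      from statsmodels.tsa.vector_ar.vecm import coint_johansen

      rng = np.random.default_rng(9)
      T = 400
      trend = np.cumsum(rng.normal(size=T))       # one shared stochastic trend
      y1 = trend + rng.normal(size=T)
      y2 = 0.5 * trend + rng.normal(size=T)
      data = np.column_stack([y1, y2])            # two series, one cointegrating relation

      res = coint_johansen(data, det_order=0, k_ar_diff=1)
      print("trace statistics:", res.lr1)         # compare with res.cvt critical values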
  11. By: Lucia Alessi (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Matteo Barigozzi (European Center for the Advanced Research in Economics and Statistics (ECARES), Université libre de Bruxelles, Belgium.); Marco Capasso (Utrecht School of Economics, Utrecht University, P.O. Box 80.115, 3508 TC Utrecht, The Netherlands.)
    Abstract: We propose a new method for multivariate forecasting which combines Dynamic Factor and multivariate GARCH models. The information contained in large datasets is captured by few dynamic common factors, which we assume to be conditionally heteroskedastic. After presenting the model, we propose a multi-step estimation technique which combines asymptotic principal components and multivariate GARCH. We also prove consistency of the estimated conditional covariances. We present simulation results in order to assess the finite sample properties of the estimation technique. Finally, we carry out two empirical applications respectively on macroeconomic series, with a particular focus on different measures of inflation, and on financial asset returns. Our model outperforms the benchmarks in forecasting the inflation level, its conditional variance and the volatility of returns. Moreover, we are able to predict all the conditional covariances among the observable series.
    Keywords: Dynamic Factor Models, Multivariate GARCH, Conditional Covariance, Inflation Forecasting, Volatility Forecasting.
    JEL: C52 C53
    Date: 2009–11
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20091115&r=ets
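    Sketch: the Python fragment below shows a much simplified two-step version of the idea: principal-component factors are extracted from a simulated placeholder panel and a univariate GARCH(1,1) is fitted to each factor using the third-party arch package, whereas the paper models the factors with multivariate GARCH.
      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(10)
      T, N, r = 500, 50, 2
      F, L = rng.normal(size=(T, r)), rng.normal(size=(N, r))
      X = F @ L.T + rng.normal(size=(T, N))        # placeholder large panel

      # Step 1: factors by principal components on the T x N panel.
      X = X - X.mean(axis=0)
      _, _, Vt = np.linalg.svd(X, full_matrices=False)
      factors = X @ Vt[:r].T

      # Step 2: conditional variance of each factor by GARCH(1,1).
      for j in range(r):
          fit = arch_model(factors[:, j], vol="GARCH", p=1, q=1).fit(disp="off")
          print(f"factor {j}: last conditional variance =",
                fit.conditional_volatility[-1] ** 2)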

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.