nep-ets New Economics Papers
on Econometric Time Series
Issue of 2023‒01‒16
seven papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. Estimating Time-Varying Networks for High-Dimensional Time Series By Chen, J.; Li, D.; Li, Y.; Linton, O. B.
  2. Boosting the HP Filter for Trending Time Series with Long Range Dependence By Eva Biswas; Farzad Sabzikar; Peter C. B. Phillips
  3. Estimating the non-Gaussian Dimension in Structural Linear Systems By Miguel Cabello
  4. Optimal Forecasts in the Presence of Discrete Structural Breaks under Long Memory By Mboya, Mwasi; Sibbertsen, Philipp
  5. Specification tests for non-Gaussian structural vector autoregressions By Dante Amengual; Gabriele Fiorentini; Enrique Sentana
  6. Tensor Principal Component Analysis By Andrii Babii; Eric Ghysels; Junsu Pan
  7. Bayesian Forecasting in the 21st Century: A Modern Review By Gael M. Martin; David T. Frazier; Worapree Maneesoonthorn; Ruben Loaiza-Maya; Florian Huber; Gary Koop; John Maheu; Didier Nibbering; Anastasios Panagiotelis

  1. By: Chen, J.; Li, D.; Li, Y.; Linton, O. B.
    Abstract: We explore time-varying networks for high-dimensional locally stationary time series, using the large VAR model framework with both the transition and (error) precision matrices evolving smoothly over time. Two types of time-varying graphs are investigated: one containing directed edges of Granger causality linkages, and the other containing undirected edges of partial correlation linkages. Under the sparse structural assumption, we propose a penalised local linear method with time-varying weighted group LASSO to jointly estimate the transition matrices and identify their significant entries, and a time-varying CLIME method to estimate the precision matrices. The estimated transition and precision matrices are then used to determine the time-varying network structures. Under some mild conditions, we derive the theoretical properties of the proposed estimates including the consistency and oracle properties. In addition, we extend the methodology and theory to cover highly-correlated large-scale time series, for which the sparsity assumption becomes invalid and we allow for common factors before estimating the factor-adjusted time-varying networks. We provide extensive simulation studies and an empirical application to a large U.S. macroeconomic dataset to illustrate the finite-sample performance of our methods.
    Keywords: CLIME, Factor model, Granger causality, lasso, local linear smoothing, partial correlation, time-varying network, VAR
    JEL: C13 C14 C32 C38
    Date: 2022–12–14
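The local estimation idea in this abstract can be sketched in a few lines: weight the observations with a kernel centred on the target time point and solve a lasso-penalised least-squares problem for the transition matrix at that point. A minimal sketch, not the authors' implementation (it assumes a VAR(1), a Gaussian kernel, a local constant rather than local linear fit, and a plain rather than weighted group lasso; `local_var_lasso` and all tuning values are illustrative):

```python
import numpy as np

def local_var_lasso(X, t0, h=0.1, lam=10.0, n_iter=500):
    # Kernel-weighted lasso estimate of the VAR(1) transition matrix A(t0).
    # Simplifications vs. the paper: local constant (not local linear) fit,
    # plain (not group) lasso, solved by proximal gradient (ISTA).
    T, d = X.shape
    u = (np.arange(1, T) / T - t0) / h       # rescaled distance from t0
    w = np.exp(-0.5 * u ** 2)                # Gaussian kernel weights
    Y, Z = X[1:], X[:-1]                     # regress X_t on X_{t-1}
    W = w[:, None]
    step = 1.0 / (np.linalg.norm(np.sqrt(W) * Z, 2) ** 2 + 1e-12)
    A = np.zeros((d, d))
    for _ in range(n_iter):
        grad = -(W * (Y - Z @ A.T)).T @ Z    # gradient of the weighted LS loss
        B = A - step * grad
        A = np.sign(B) * np.maximum(np.abs(B) - step * lam, 0.0)  # soft-threshold
    return A

# Toy data: diagonal VAR(1) whose coefficient drifts smoothly over time.
rng = np.random.default_rng(0)
T, d = 400, 3
X = np.zeros((T, d))
for t in range(1, T):
    a = 0.5 * np.cos(np.pi * t / T)
    X[t] = a * X[t - 1] + rng.standard_normal(d)
A_hat = local_var_lasso(X, t0=0.25)
```

The soft-thresholding step is what identifies the significant entries of the transition matrix: small off-diagonal coefficients are set exactly to zero, giving the sparse network of Granger-causal edges.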
  2. By: Eva Biswas (Department of Statistics, Iowa State University); Farzad Sabzikar (Department of Statistics, Iowa State University); Peter C. B. Phillips (Cowles Foundation, Yale University)
    Abstract: This paper extends recent asymptotic theory developed for the Hodrick-Prescott (HP) filter and boosted HP (bHP) filter to long range dependent time series that have fractional Brownian motion (fBM) limit processes after suitable standardization. Under general conditions it is shown that the asymptotic form of the HP filter is a smooth curve, analogous to the finding in Phillips and Jin (2021) for integrated time series and series with deterministic drifts. Boosting the filter using the iterative procedure suggested in Phillips and Shi (2021) leads, under well-defined rate conditions, to a consistent estimate of the fBM limit process, or of the fBM limit process with an accompanying deterministic drift when one is present. A stopping criterion is used to automate the boosting algorithm, giving a data-determined method for practical implementation. The theory is illustrated in simulations and two real data examples that highlight the differences between simple HP filtering and the use of boosting. The analysis is assisted by employing a uniformly and almost surely convergent trigonometric series representation of fBM.
    Date: 2022–08
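The mechanics of boosting the HP filter are simple to illustrate: filter once, then repeatedly re-filter the residual cycle and add the extracted trend back. A minimal sketch, assuming a quadratic HP penalty with the standard quarterly smoothing parameter; the paper's data-determined stopping criterion is replaced here by a crude tolerance rule, which is an assumption of this sketch:

```python
import numpy as np

def hp_trend(y, lam=1600.0):
    # One pass of the HP filter: minimize sum (y - tau)^2 + lam * sum (d^2 tau)^2,
    # solved exactly via the normal equations (I + lam * D'D) tau = y.
    T = len(y)
    D = np.diff(np.eye(T), n=2, axis=0)      # (T-2) x T second-difference matrix
    return np.linalg.solve(np.eye(T) + lam * D.T @ D, y)

def boosted_hp(y, lam=1600.0, max_iter=10, tol=1e-3):
    # Boosted HP filter sketch: iterate the filter on the residual cycle and
    # accumulate the extracted trend, stopping when the increment is small.
    trend = hp_trend(y, lam)
    for _ in range(max_iter):
        increment = hp_trend(y - trend, lam)
        trend = trend + increment
        if np.max(np.abs(increment)) < tol * np.max(np.abs(y)):
            break
    return trend

# Toy example: smooth trend plus white noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
true_trend = 5.0 * t ** 2
y = true_trend + rng.standard_normal(200)
tau = boosted_hp(y)
```

Each boosting pass recovers trend components that a single HP pass leaves in the residual, which is why boosting helps for strongly persistent (long range dependent) series.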
  3. By: Miguel Cabello
    Abstract: Statistical identification of structural vector autoregressive moving-average (SVARMA) models requires the structural shocks to be an independent process with mutually independent components, each of which is non-Gaussian distributed. Taking the first two conditions as given, common procedures for testing joint Gaussianity of the structural error vector are not sufficient to validate the last requirement, because rejection of the null hypothesis only implies the existence of at least one non-Gaussian structural shock. It is therefore necessary to estimate the number of non-Gaussian components in the structural disturbance vector. This work addresses that problem with a sequential testing procedure that generalizes current proposals, which are designed only for fundamental SVAR models, to possibly non-fundamental SVARMA models. Under our setup, current procedures are invalid because the reduced-form errors are a possibly infinite linear combination of present, past and future values of the structural errors, and are only serially uncorrelated, not independent. Our approach employs the third- and fourth-order cumulant spectra to construct arrays whose rank equals the number of non-Gaussian structural errors. Monte Carlo simulations show that our approach estimates the number of non-Gaussian components satisfactorily.
    Date: 2022–12
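The quantity being estimated here, the non-Gaussian dimension, can be made concrete with a much cruder device than the paper's cumulant-spectrum rank procedure: test each shock series marginally for Gaussianity and count rejections. This shortcut is only an illustrative stand-in (it requires mutually and serially independent shocks, exactly the setting the paper goes beyond), and `count_non_gaussian` is a hypothetical helper, not the authors' method:

```python
import numpy as np

def jarque_bera(x):
    # Jarque-Bera statistic: n/6 * (skewness^2 + excess_kurtosis^2 / 4),
    # asymptotically chi-square(2) under Gaussianity.
    n = len(x)
    z = (x - x.mean()) / x.std()
    skew = np.mean(z ** 3)
    ex_kurt = np.mean(z ** 4) - 3.0
    return n / 6.0 * (skew ** 2 + ex_kurt ** 2 / 4.0)

def count_non_gaussian(E, crit=5.99):
    # Count components rejecting Gaussianity at the 5% level
    # (5.99 is the chi-square(2) critical value). Illustrative only:
    # valid just for i.i.d. mutually independent shocks.
    return sum(jarque_bera(E[:, j]) > crit for j in range(E.shape[1]))

# Toy shock matrix: one Gaussian, one platykurtic, one leptokurtic component.
rng = np.random.default_rng(2)
n = 2000
E = np.column_stack([
    rng.standard_normal(n),     # Gaussian
    rng.uniform(-1.0, 1.0, n),  # excess kurtosis -1.2
    rng.laplace(size=n),        # excess kurtosis +3
])
k = count_non_gaussian(E)
```

The paper's point is precisely that this component-by-component logic breaks down for (possibly non-fundamental) SVARMA models, where reduced-form errors mix past, present and future structural shocks; hence the move to cumulant spectra.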
  4. By: Mboya, Mwasi; Sibbertsen, Philipp
    Abstract: We develop methods to obtain optimal forecasts under long memory in the presence of a discrete structural break, based on different weighting schemes for the observations. We observe significant changes in the forecasts when long-range dependence is taken into account. Using Monte Carlo simulations, we confirm that our methods substantially improve forecasting performance under long memory. We further present an empirical application to inflation rates that emphasizes the importance of our methods.
    Keywords: Long memory; Forecasting; Structural break; Optimal weight; ARFIMA model
    JEL: C12 C22
    Date: 2022–12
  5. By: Dante Amengual (CEMFI, Centro de Estudios Monetarios y Financieros); Gabriele Fiorentini (Università di Firenze); Enrique Sentana (CEMFI, Centro de Estudios Monetarios y Financieros)
    Abstract: We propose specification tests for independent component analysis and structural vector autoregressions that assess the assumed cross-sectional independence of the non-Gaussian shocks. Our tests effectively compare their joint cumulative distribution with the product of their marginals at discrete or continuous grids of values for its arguments, the latter yielding a consistent test. We explicitly consider the sampling variability from using consistent estimators to compute the shocks. We study the finite sample size of our tests in several simulation exercises, with special attention to resampling procedures. We also show that they have non-negligible power against a variety of empirically plausible alternatives.
    Keywords: Consistent tests, copulas, finite normal mixtures, independence tests, pseudo maximum likelihood estimators.
    JEL: C32 C52
    Date: 2022–12
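The core comparison in these tests, joint CDF versus product of marginals on a grid, is easy to illustrate. A minimal sketch under strong simplifying assumptions: it ignores the sampling variability from estimating the shocks (a central concern of the paper) and uses a naive permutation null instead of the paper's resampling procedures; `independence_pvalue` and its tuning values are illustrative:

```python
import numpy as np

def cdf_independence_stat(e, grid):
    # Max discrepancy over grid points between the empirical joint CDF of the
    # columns of e and the product of their empirical marginal CDFs.
    n, k = e.shape
    stat = 0.0
    for g in grid:  # each g is one k-vector of evaluation points
        joint = np.mean(np.all(e <= g, axis=1))
        prod = np.prod([(e[:, j] <= g[j]).mean() for j in range(k)])
        stat = max(stat, abs(joint - prod))
    return stat

def independence_pvalue(e, n_grid=5, n_perm=200, seed=0):
    # Permutation version of the CDF-based independence test (sketch only):
    # shuffling one column destroys cross-sectional dependence, mimicking H0.
    rng = np.random.default_rng(seed)
    qs = np.linspace(0.2, 0.8, n_grid)
    grid = np.stack([np.quantile(e[:, j], qs) for j in range(e.shape[1])], axis=1)
    stat = cdf_independence_stat(e, grid)
    count = 0
    for _ in range(n_perm):
        perm = e.copy()
        perm[:, 0] = rng.permutation(perm[:, 0])
        count += cdf_independence_stat(perm, grid) >= stat
    return (1 + count) / (1 + n_perm)

# Strongly dependent columns should be rejected.
rng = np.random.default_rng(4)
z = rng.standard_normal(500)
dep = np.column_stack([z, z + 0.2 * rng.standard_normal(500)])
p_dep = independence_pvalue(dep)
```

Evaluating on a finite grid gives a test against a fixed set of discrepancies; the abstract's continuous-grid version is what delivers consistency against all alternatives.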
  6. By: Andrii Babii; Eric Ghysels; Junsu Pan
    Abstract: In this paper, we develop new methods for analyzing high-dimensional tensor datasets. A tensor factor model describes a high-dimensional dataset as the sum of a low-rank component and idiosyncratic noise, generalizing traditional factor models for panel data. We propose an estimation algorithm, called tensor principal component analysis (PCA), which generalizes the traditional PCA applicable to panel data. The algorithm involves unfolding the tensor into a sequence of matrices along different dimensions and applying PCA to the unfolded matrices. We provide theoretical results on the consistency and asymptotic distribution of the tensor PCA estimators of loadings and factors. The algorithm demonstrates good performance in Monte Carlo experiments and is applied to sorted portfolios.
    Date: 2022–12
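The unfold-then-PCA idea in the abstract can be sketched directly in NumPy: matricize the tensor along each mode, take the leading eigenvectors of the unfolded data as that mode's loadings, and project to recover the factor core. This is a generic illustration of the technique; the paper's exact normalizations and estimators may differ:

```python
import numpy as np

def unfold(X, mode):
    # Matricize tensor X along `mode`: mode fibers become the rows' columns.
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def tensor_pca(X, ranks):
    # For each mode, PCA on the unfolding (top eigenvectors of M M')
    # estimates that mode's loading matrix; the core factor tensor is
    # then obtained by projecting X onto the loadings in every mode.
    loadings = []
    for mode, r in enumerate(ranks):
        M = unfold(X, mode)
        _, eigvecs = np.linalg.eigh(M @ M.T)   # ascending eigenvalue order
        loadings.append(eigvecs[:, ::-1][:, :r])
    F = X
    for mode, U in enumerate(loadings):
        F = np.moveaxis(np.tensordot(U.T, np.moveaxis(F, mode, 0), axes=1), 0, mode)
    return loadings, F

# Toy example: a rank-(1,1,1) signal tensor plus small idiosyncratic noise.
rng = np.random.default_rng(5)
a, b, c = rng.standard_normal(10), rng.standard_normal(12), rng.standard_normal(8)
X = np.einsum('i,j,k->ijk', a, b, c) + 0.01 * rng.standard_normal((10, 12, 8))
loadings, F = tensor_pca(X, ranks=(1, 1, 1))

# Reconstruct the low-rank component from the estimated loadings and core.
X_hat = F
for mode, U in enumerate(loadings):
    X_hat = np.moveaxis(np.tensordot(U, np.moveaxis(X_hat, mode, 0), axes=1), 0, mode)
```

With a three-way panel (e.g. assets x characteristics x time), each mode's loadings play the role that factor loadings play in ordinary panel PCA.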
  7. By: Gael M. Martin; David T. Frazier; Worapree Maneesoonthorn; Ruben Loaiza-Maya; Florian Huber; Gary Koop; John Maheu; Didier Nibbering; Anastasios Panagiotelis
    Abstract: The Bayesian statistical paradigm provides a principled and coherent approach to probabilistic forecasting. Uncertainty about all unknowns that characterize any forecasting problem -- model, parameters, latent states -- is factored into the forecast distribution, with forecasts conditioned only on what is known or observed. Allied with the elegance of the method, Bayesian forecasting is now underpinned by the burgeoning field of Bayesian computation, which enables Bayesian forecasts to be produced for virtually any problem, no matter how large or complex. The current state of play in Bayesian forecasting is the subject of this review. The aim is to provide readers with an overview of modern approaches to the field, set in some historical context. Whilst our primary focus is on applications in the fields of economics and finance, and their allied disciplines, sufficient general details about implementation are provided to aid and inform all investigators.
    Date: 2022–12

This nep-ets issue is ©2023 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.