nep-ets New Economics Papers
on Econometric Time Series
Issue of 2017‒11‒26
six papers chosen by
Yong Yin
SUNY at Buffalo

  1. Sparse Bayesian vector autoregressions in huge dimensions By Gregor Kastner; Florian Huber
  2. Financial Time Series Prediction Using Deep Learning By Ariel Navon; Yosi Keller
  3. A New Approach Toward Detecting Structural Breaks in Vector Autoregressive Models By Florian Huber; Gregor Kastner; Martin Feldkircher
  4. Common Factors, Trends, and Cycles in Large Datasets By Matteo Barigozzi; Matteo Luciani
  5. A Bias-Corrected Method of Moments Approach to Estimation of Dynamic Short-T Panels By Chudik, Alexander; Pesaran, M. Hashem
  6. Spurious Principal Components By Franses, Ph.H.B.F.; Janssens, E.

  1. By: Gregor Kastner; Florian Huber
    Abstract: We develop a Bayesian vector autoregressive (VAR) model that is capable of handling vast dimensional information sets. Three features are introduced to permit reliable estimation of the model. First, we assume that the reduced-form errors in the VAR feature a factor stochastic volatility structure, allowing for conditional equation-by-equation estimation. Second, we apply a Dirichlet-Laplace prior to the VAR coefficients to cure the curse of dimensionality. Finally, since simulation-based methods are needed to draw from the joint posterior distribution, we utilize recent advances in sampling from high-dimensional multivariate Gaussian distributions that improve upon existing algorithms by large margins. In the empirical exercise we apply the model to US data and evaluate its forecasting capabilities.
    Date: 2017–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1704.03239&r=ets
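    A minimal Python sketch of the equation-by-equation idea, on placeholder data and with simple ridge shrinkage standing in for the Dirichlet-Laplace prior (the paper's actual MCMC sampler is not reproduced here):

        import numpy as np

        rng = np.random.default_rng(0)
        T, M, p = 200, 10, 2                  # observations, variables, lags
        Y = rng.standard_normal((T, M))       # placeholder data

        # Stack lagged regressors X_t = [y_{t-1}', ..., y_{t-p}']
        X = np.hstack([Y[p - l:-l or None] for l in range(1, p + 1)])
        Ylead = Y[p:]

        lam = 10.0                            # shrinkage intensity (hypothetical)
        K = X.shape[1]
        B = np.empty((K, M))
        for i in range(M):                    # one small regression per equation
            B[:, i] = np.linalg.solve(X.T @ X + lam * np.eye(K),
                                      X.T @ Ylead[:, i])
        print(B.shape)                        # (M*p, M) coefficient matrix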
  2. By: Ariel Navon; Yosi Keller
    Abstract: In this work we present a data-driven, end-to-end deep learning approach for time series prediction, applied to financial time series. A deep learning scheme is derived to predict the temporal trends of stocks and ETFs listed on the NYSE or NASDAQ. Our approach is based on a neural network (NN) that is applied to raw financial data inputs and is trained to predict the temporal trends of stocks and ETFs. In order to handle commission-based trading, we derive an investment strategy that utilizes the probabilistic outputs of the NN and optimizes the average return. The proposed scheme is shown to provide statistically significant, accurate predictions of financial market trends, and the investment strategy is shown to be profitable under this challenging setup. The performance compares favorably with contemporary benchmarks over two years of back-testing.
    Date: 2017–11
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1711.04174&r=ets
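    A hedged Python sketch of the windowed trend-classification setup with a commission-aware decision rule; the network size, window length, and commission level below are illustrative assumptions, not the paper's:

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(1)
        r = 0.01 * rng.standard_normal(1000)       # placeholder daily returns
        W = 20                                     # lookback window (assumed)
        X = np.lib.stride_tricks.sliding_window_view(r[:-1], W)
        y = (r[W:] > 0).astype(int)                # next-day up/down label

        clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500,
                            random_state=0)
        clf.fit(X[:800], y[:800])

        p_up = clf.predict_proba(X[800:])[:, 1]    # probabilistic NN output
        cost = 0.001                               # round-trip commission (assumed)
        # Trade only when the predicted edge exceeds the commission.
        trade = np.where(np.abs(p_up - 0.5) > cost,
                         np.sign(p_up - 0.5), 0.0)
        print("fraction of days traded:", (trade != 0).mean())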
  3. By: Florian Huber; Gregor Kastner; Martin Feldkircher
    Abstract: Incorporating structural changes into time series models is crucial during turbulent economic periods. In this paper, we propose a flexible means of estimating vector autoregressions with time-varying parameters (TVP-VARs) by introducing a threshold process that is driven by the absolute size of parameter changes. This enables us to detect whether a given regression coefficient is constant or time-varying. When applied to a medium-scale macroeconomic US dataset, our model yields precise density and turning point predictions, especially during economic downturns, and provides new insights into the changing effects of increases in short-term interest rates over time.
    Date: 2016–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1607.04532&r=ets
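    A minimal Python sketch of the thresholding mechanism on simulated data: proposed random-walk steps in a coefficient are kept only when their absolute size exceeds a threshold, so the parameter is effectively constant in quiet periods (the paper embeds this mechanism in a full TVP-VAR estimated by MCMC; the numbers below are assumptions):

        import numpy as np

        rng = np.random.default_rng(2)
        T, sigma, thresh = 250, 0.05, 0.08   # horizon, step sd, threshold (assumed)
        beta = np.empty(T)
        beta[0] = 0.5
        for t in range(1, T):
            step = sigma * rng.standard_normal()
            # Keep the parameter change only if it is "large"; else stay constant.
            beta[t] = beta[t - 1] + (step if abs(step) > thresh else 0.0)
        print("share of periods with a parameter move:",
              np.mean(np.abs(np.diff(beta)) > 0))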
  4. By: Matteo Barigozzi; Matteo Luciani
    Abstract: This paper considers a non-stationary dynamic factor model for large datasets to disentangle long-run from short-run co-movements. We first propose a new Quasi Maximum Likelihood estimator of the model based on the Kalman Smoother and the Expectation Maximisation algorithm. The asymptotic properties of the estimator are discussed. Then, we show how to separate trends and cycles in the factors by means of an eigenanalysis of the estimated non-stationary factors. Finally, we apply our methodology to a panel of US quarterly macroeconomic indicators to estimate aggregate real output, or Gross Domestic Output, and the output gap.
    Keywords: EM Algorithm ; Gross Domestic Output ; Kalman Smoother ; Non-stationary Approximate Dynamic Factor Model ; Output Gap ; Quasi Maximum Likelihood ; Trend-Cycle Decomposition
    JEL: C32 C38 E00
    Date: 2017–11–13
    URL: http://d.repec.org/n?u=RePEc:fip:fedgfe:2017-111&r=ets
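    A hedged Python sketch, using plain principal components as a stand-in for the paper's QML/Kalman-smoother estimator, of how an eigenanalysis of estimated factors can separate a common stochastic trend from stationary cycles:

        import numpy as np

        rng = np.random.default_rng(3)
        T, N, r = 200, 50, 3
        trend = np.cumsum(rng.standard_normal((T, 1)), axis=0)  # common I(1) trend
        cycles = rng.standard_normal((T, 2))                    # stationary factors
        F = np.hstack([trend, cycles])
        Lam = rng.standard_normal((N, r))
        X = F @ Lam.T + 0.5 * rng.standard_normal((T, N))       # observed panel

        Xc = X - X.mean(axis=0)
        _, eigvec = np.linalg.eigh(Xc.T @ Xc / T)
        Fhat = Xc @ eigvec[:, -r:]                              # PCA factor estimates

        # Eigenanalysis of the estimated factors: the dominant eigenvalue
        # should be driven by the stochastic trend, the rest by the cycles.
        val, _ = np.linalg.eigh(Fhat.T @ Fhat / T)
        print("factor-space eigenvalues (ascending):", np.round(val, 2))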
  5. By: Chudik, Alexander (Federal Reserve Bank of Dallas); Pesaran, M. Hashem (University of Southern California)
    Abstract: This paper contributes to the GMM literature by introducing the idea of self-instrumenting target variables instead of searching for instruments that are uncorrelated with the errors, in cases where the correlation between the target variables and the errors can be derived. The advantage of the proposed approach lies in the fact that, by construction, the instruments have maximum correlation with the target variables, and the weak-instrument problem is thus avoided. The proposed approach can be applied to the estimation of a variety of models, such as spatial and dynamic panel data models. In this paper we focus on the latter and consider both univariate and multivariate panel data models with a short time dimension. Simple Bias-Corrected Method of Moments (BMM) estimators are proposed and shown to be consistent and asymptotically normal under very general conditions on the initialization of the processes, the individual-specific effects, and the error variances, allowing for heteroscedasticity over time as well as cross-sectionally. Monte Carlo evidence documents the BMM estimators' good small-sample performance across different experimental designs and sample sizes, including in experiments where system GMM estimators are inconsistent. We also find that the proposed estimator does not suffer from size distortions and has satisfactory power compared to other estimators.
    JEL: C12 C13 C23
    Date: 2017–09–01
    URL: http://d.repec.org/n?u=RePEc:fip:feddgw:327&r=ets
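    A stylized version of the self-instrumenting idea for the simplest panel AR(1) case, y_{it} = \alpha_i + \varphi y_{i,t-1} + u_{it} (the paper's general results cover multivariate models and heteroscedastic errors): first-differencing removes \alpha_i, and using \Delta y_{i,t-1} as its own instrument yields a derivable, nonzero moment that is corrected for rather than avoided,

        E\left[\Delta y_{i,t-1}\,\Delta u_{it}\right] = -\sigma_{t-1}^{2},
        \qquad
        \frac{1}{N}\sum_{i=1}^{N}\Delta y_{i,t-1}\left(\Delta y_{it}
            - \hat{\varphi}\,\Delta y_{i,t-1}\right) + \hat{\sigma}_{t-1}^{2} = 0,

    so the BMM estimate \hat{\varphi} solves the bias-corrected sample moment condition on the right, given an estimate \hat{\sigma}_{t-1}^{2} of the error variance.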
  6. By: Franses, Ph.H.B.F.; Janssens, E.
    Abstract: Principal Component Regression (PCR) is often used to forecast macroeconomic variables when there are many predictors. In this letter, we argue that it makes sense to pre-whiten the predictors before including them in a PCR. With simulation experiments, we show that without such pre-whitening, spurious principal components can appear, and that these can become spuriously significant in a PCR. With an illustration using annual inflation rates for five African countries, we show that non-spurious principal components can be genuinely relevant in empirical forecasting models.
    Keywords: Principal Component Regression, Pre-whitening, Spurious Regressions
    JEL: C52
    Date: 2017–11–01
    URL: http://d.repec.org/n?u=RePEc:ems:eureir:102704&r=ets
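    A minimal Python sketch of the spurious-PC phenomenon: independent random walks share no common factor, yet PCA on their levels typically yields an inflated first component; pre-whitening (here simple first-differencing, though the letter's pre-whitening step may use fitted AR filters) removes it:

        import numpy as np

        rng = np.random.default_rng(4)
        T, N = 200, 10
        X = np.cumsum(rng.standard_normal((T, N)), axis=0)  # independent random walks

        def top_pc_share(Z):
            """Variance share of the first principal component."""
            Zc = Z - Z.mean(axis=0)
            ev = np.linalg.eigvalsh(Zc.T @ Zc / len(Z))
            return ev[-1] / ev.sum()

        print("PC1 share, levels:     ", round(top_pc_share(X), 2))  # inflated
        print("PC1 share, differenced:", round(top_pc_share(np.diff(X, axis=0)), 2))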

This nep-ets issue is ©2017 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.