nep-ets New Economics Papers
on Econometric Time Series
Issue of 2021‒07‒12
eight papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. Macroeconomic Forecasting with Large Stochastic Volatility in Mean VARs By Jamie L. Cross; Chenghan Hou; Gary Koop
  2. Comparison of the accuracy in VaR forecasting for commodities using different methods of combining forecasts By Szymon Lis; Marcin Chlebus
  3. Generalized Spatial and Spatiotemporal ARCH Models By Philipp Otto; Wolfgang Schmid
  4. Adjustment coefficients and exact rational expectations in cointegrated vector autoregressive models By Søren Johansen; Anders Ryghn Swensen
  5. Pooled Bewley Estimator of Long-Run Relationships in Dynamic Heterogenous Panels By Alexander Chudik; M. Hashem Pesaran; Ron P. Smith
  6. Volatility Bursts: A discrete-time option model with multiple volatility components By Francesca Lilla
  7. Detecting multiple generalized change-points by isolating single ones By Anastasiou, Andreas; Fryzlewicz, Piotr
  8. Estimation of Common Factors for Microstructure Noise and Efficient Price in a High-frequency Dual Factor Model By Li, Y-N.; Chen, J.; Linton, O.

  1. By: Jamie L. Cross; Chenghan Hou; Gary Koop
    Abstract: Vector autoregressions with stochastic volatility in both the conditional mean and variance are commonly used to estimate the macroeconomic effects of uncertainty shocks. Despite their popularity, the intensive computational demands of estimating such models have made out-of-sample forecasting exercises impractical, particularly when working with large data sets. In this article, we propose an efficient Markov chain Monte Carlo (MCMC) algorithm for posterior and predictive inference in such models that makes these exercises feasible. The key insight underlying the algorithm is that the (log-)conditional densities of the log-volatilities possess Hessian matrices that are banded. This enables us to build upon recent advances in band and sparse matrix algorithms for state space models. In a simulation exercise, we evaluate the new algorithm numerically and establish its computational and statistical efficiency over a conventional particle filter based algorithm. Using macroeconomic data for the US, we find that such models generally deliver more accurate point and density forecasts than a conventional benchmark in which stochastic volatility enters only the variance of the model.
    Keywords: Bayesian VARs, Macroeconomic Forecasting, Stochastic Volatility in Mean, State Space Models, Uncertainty
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:bny:wpaper:0100&r=
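The banded-Hessian insight above can be illustrated with a toy precision sampler: when the precision matrix Q of a Gaussian (here standing in for the conditional posterior of the log-volatilities) is banded, a banded Cholesky factorization makes both the mean computation and the draw O(n) rather than O(n^3). The sketch below is a generic illustration of this class of band-matrix algorithms, not the authors' implementation; the function name and storage conventions are assumptions.

```python
import numpy as np
from scipy.linalg import cholesky_banded, cho_solve_banded, solve_banded

def precision_sampler(ab, b, rng):
    """Draw x ~ N(Q^{-1} b, Q^{-1}) for a banded SPD precision Q.
    `ab` stores Q in SciPy's upper banded form: shape (u+1, n),
    last row = main diagonal. Returns (mean, draw)."""
    u = ab.shape[0] - 1                     # number of superdiagonals
    cb = cholesky_banded(ab)                # Q = U'U, U upper banded
    mu = cho_solve_banded((cb, False), b)   # mean: solve Q mu = b
    z = rng.standard_normal(ab.shape[1])
    # Solving U w = z gives Cov(w) = (U'U)^{-1} = Q^{-1}
    x = mu + solve_banded((0, u), cb, z)
    return mu, x
```

Because every step works on the banded storage only, the cost scales linearly in the state dimension, which is what makes large-model MCMC of this kind practical.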
  2. By: Szymon Lis (Faculty of Economic Sciences, University of Warsaw); Marcin Chlebus (Faculty of Economic Sciences, University of Warsaw)
    Abstract: No model dominates existing VaR forecasting comparisons. This problem may be addressed by combining forecasts. This study investigates daily volatility forecasting for commodities (gold, silver, oil, gas, copper) over 2000-2020 and, using the Model Confidence Set (MCS), identifies the source of performance improvements between individual GARCH models and forecast-combination methods (mean, the lowest, the highest, CQOM, quantile regression with elastic net or LASSO regularization, random forests, gradient boosting, neural networks). Results indicate that individual models achieve more accurate VaR forecasts at the 0.975 confidence level, but combined forecasts are more precise at 0.99. In most cases simple combining methods (the mean or the lowest VaR) are best. This evidence demonstrates that combining forecasts is important for getting better results from existing models. The study shows that combining forecasts allows for more accurate VaR forecasting, although it is difficult to find accurate, complex methods.
    Keywords: Combining forecasts, Econometric models, Finance, Financial markets, GARCH models, Neural networks, Regression, Time series, Risk, Value-at-Risk, Machine learning, Model Confidence Set
    JEL: C51 C52 C53 G32 Q01
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:war:wpaper:2021-11&r=
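The simple combination schemes the study finds most effective (the mean and the lowest VaR across models) can be sketched in a few lines. This is an illustrative reading of those schemes, not the authors' code; the function name and the convention that VaR is stored as a (negative) return quantile, so "lowest" is the most conservative forecast, are assumptions.

```python
import numpy as np

def combine_var_forecasts(var_matrix, method="mean"):
    """Combine daily VaR forecasts from several models.
    var_matrix: (T, k) array, one column per individual model,
    with VaR stored as negative return quantiles."""
    if method == "mean":
        return var_matrix.mean(axis=1)
    if method == "lowest":    # most negative, i.e. most conservative
        return var_matrix.min(axis=1)
    if method == "highest":   # least conservative forecast
        return var_matrix.max(axis=1)
    raise ValueError(f"unknown method: {method}")
```

The appeal of these schemes is that they require no estimation of combination weights, which is consistent with the finding that simple methods are hard to beat.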
  3. By: Philipp Otto; Wolfgang Schmid
    Abstract: In time-series analyses, particularly in finance, generalized autoregressive conditional heteroscedasticity (GARCH) models are widely applied statistical tools for modelling volatility clusters (i.e., periods of increased or decreased risk). By contrast, modelling spatial dependence in the conditional second moments has until now received little attention, and only a few models have been proposed for capturing local clusters of increased risk. In this paper, we introduce a novel spatial GARCH process in a unified spatial and spatiotemporal GARCH framework, which also covers all previously proposed spatial ARCH models, exponential spatial GARCH, and time-series GARCH models. In contrast to previous spatiotemporal and time-series models, this spatial GARCH allows for instantaneous spill-overs across all spatial units. For this common modelling framework, estimators are derived based on a non-linear least-squares approach. Finally, the use of the model is demonstrated through a Monte Carlo simulation study and an empirical example focusing on real estate prices from 1995 to 2014 across the ZIP-code areas of Berlin. A spatial autoregressive model is applied to the data to illustrate how locally varying model uncertainties (e.g., due to latent regressors) can be captured by spatial GARCH-type models.
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2106.10477&r=
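The spatial ARCH core that this framework nests can be written compactly. The notation below is a common presentation of such models, not necessarily the paper's exact parameterization:

```latex
Y = \operatorname{diag}(h)^{1/2}\,\varepsilon, \qquad
h = \alpha + \rho\, W Y^{(2)}, \qquad
Y^{(2)} = (Y_1^2, \dots, Y_n^2)',
```

where W is a spatial weight matrix and \varepsilon an i.i.d. error vector. Because Y appears on both sides, the volatility vector h is determined by an implicit system, which is how instantaneous spill-overs across all spatial units arise; GARCH-type extensions additionally feed lagged values of h into the right-hand side.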
  4. By: Søren Johansen (University of Copenhagen and CREATES); Anders Ryghn Swensen (University of Oslo)
    Abstract: In cointegrated vector autoregressive models exact linear rational expectation relations can imply restrictions on the adjustment parameters. We show how such restrictions can be tested, in particular when the restrictions imply weak exogeneity of some variables.
    Keywords: Exact rational expectations, Cointegrated VAR model, Reduced rank regression, Adjustment coefficients
    JEL: C32
    Date: 2021–07–01
    URL: http://d.repec.org/n?u=RePEc:aah:create:2021-10&r=
  5. By: Alexander Chudik; M. Hashem Pesaran; Ron P. Smith
    Abstract: This paper, using the Bewley (1979) transformation of the autoregressive distributed lag model, proposes a pooled Bewley (PB) estimator of long-run coefficients for dynamic panels with heterogeneous short-run dynamics, in the same setting as the widely used Pooled Mean Group (PMG) estimator. The Bewley transform enables us to obtain an analytical closed-form expression for the PB, which is not available under the maximum likelihood approach. This lets us establish asymptotic normality of PB as n,T→∞ jointly, allowing for applications with n and T large and of the same order of magnitude, but excluding panels where T is short relative to n. In contrast, the asymptotic distribution of the PMG estimator was obtained for n fixed and T→∞. Allowing both n and T to be large seems the more relevant empirical setting, as revealed by numerous applications of the PMG estimator in the literature. Dynamic panel estimators are biased when T is not sufficiently large. Three bias corrections (simulation based, split-panel jackknife, and a combined procedure) are investigated using Monte Carlo experiments, of which the combined procedure works best in reducing bias. In contrast to PMG, PB does not weight by estimated variances, which can make it more robust in small samples, though less efficient asymptotically. The PB estimator is illustrated with an application to the aggregate consumption function estimated in the original PMG paper.
    Keywords: Heterogeneous dynamic panels; I(1) regressors; pooled mean group estimator (PMG); Autoregressive-Distributed Lag model (ARDL); Bewley transform; bias correction; split-panel jackknife
    JEL: C12 C13 C23 C33
    Date: 2021–05–27
    URL: http://d.repec.org/n?u=RePEc:fip:feddgw:92809&r=
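The closed-form nature of the PB estimator comes from the Bewley (1979) transform. For the simplest ARDL(1,1) case (a sketch only; the paper covers general lag orders and the panel setting), start from

```latex
y_t = \phi y_{t-1} + \beta_0 x_t + \beta_1 x_{t-1} + u_t .
```

Substituting y_{t-1} = y_t - \Delta y_t and x_{t-1} = x_t - \Delta x_t and dividing through by 1 - \phi gives

```latex
y_t = \theta x_t
      - \frac{\phi}{1-\phi}\,\Delta y_t
      - \frac{\beta_1}{1-\phi}\,\Delta x_t
      + \tilde u_t ,
\qquad
\theta = \frac{\beta_0 + \beta_1}{1-\phi} ,
```

so the long-run coefficient \theta appears directly as a regression coefficient, which is what permits an analytical pooled estimator (with \Delta y_t endogenous, typically handled by instrumental variables).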
  6. By: Francesca Lilla
    Abstract: I propose an affine discrete-time model, called the Vector Autoregressive Gamma with volatility Bursts (VARG-B) model, in which volatility experiences, in addition to frequent small changes, periods of sudden and extreme movement generated by a latent factor that evolves according to the Autoregressive Gamma Zero process. A key advantage of the discrete-time specification is that it makes it possible to estimate the model via the Extended Kalman Filter. Moreover, the VARG-B model leads to a fully analytic conditional Laplace transform, resulting in a closed-form option pricing formula. When estimated on S&P 500 index options and returns, the new model provides more accurate option pricing and modelling of the implied volatility (IV) surface than several alternative models.
    Keywords: volatility bursts, ARG-zero, option pricing, Kalman filter, realized volatility
    JEL: C13 G12 G13
    Date: 2021–06
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_1336_21&r=
  7. By: Anastasiou, Andreas; Fryzlewicz, Piotr
    Abstract: We introduce a new approach, called Isolate-Detect (ID), for the consistent estimation of the number and location of multiple generalized change-points in noisy data sequences. Examples of signal changes that ID can deal with are changes in the mean of a piecewise-constant signal and changes, continuous or not, in the linear trend. The number of change-points can increase with the sample size. Our method is based on an isolation technique, which prevents the consideration of intervals that contain more than one change-point. This isolation enhances ID’s accuracy, as it allows for detection in the presence of frequent changes of possibly small magnitudes. In ID, model selection is carried out via thresholding, an information criterion, SDLL, or a hybrid involving the first two. The hybrid model selection leads to a general method with very good practical performance and minimal parameter choice. In the scenarios tested, ID is at least as accurate as the state-of-the-art methods, and most of the time it outperforms them. ID is implemented in the R packages IDetect and breakfast, available from CRAN.
    Keywords: segmentation; symmetric interval expansion; threshold criterion; Schwarz information criterion; SDLL
    JEL: C1
    Date: 2021–05–24
    URL: http://d.repec.org/n?u=RePEc:ehl:lserod:110258&r=
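A minimal ingredient of procedures like this is a single change-point detector applied to an interval that is assumed to contain at most one change; the isolation step is what makes that assumption safe. The CUSUM sketch below detects one change in mean and is only a toy analogue of one ID component, not the ID algorithm itself (which is available in the R packages IDetect and breakfast).

```python
import numpy as np

def cusum_change_point(x):
    """Locate a single change in mean via the standard CUSUM statistic.
    Returns (estimated change-point k, value of the CUSUM statistic),
    where the change is between x[:k] and x[k:]."""
    x = np.asarray(x, dtype=float)
    n = x.size
    cs = np.cumsum(x)
    k = np.arange(1, n)                      # candidate split points
    # scaled difference of left and right sample means at each split
    stat = np.sqrt(k * (n - k) / n) * np.abs(cs[:-1] / k - (cs[-1] - cs[:-1]) / (n - k))
    best = int(np.argmax(stat))
    return int(k[best]), float(stat[best])
```

In an isolation-based scheme, a detector of this kind is run only on intervals certified to hold at most one change-point, which is what allows accurate detection even when changes are frequent and small.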
  8. By: Li, Y-N.; Chen, J.; Linton, O.
    Abstract: We develop the Double Principal Component Analysis (DPCA) based on a dual factor structure for high-frequency intraday returns data contaminated with microstructure noise. The dual factor structure allows a factor structure for the microstructure noise in addition to the factor structure for efficient log-prices. We construct estimators of factors for both efficient log-prices and microstructure noise as well as their common components, and provide uniform consistency of these estimators when the number of assets and the sampling frequency go to infinity. In a Monte Carlo exercise, we compare our DPCA method to a PCA-VECM method. Finally, an empirical analysis of intraday returns of S&P 500 Index constituents provides evidence of co-movement in the microstructure noise that is distinct from latent systematic risk factors.
    Keywords: Cointegration, Factor model, High-frequency data, Microstructure noise, Non-stationarity
    JEL: C10 C13 C14 C33 C38
    Date: 2021–06–30
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:2150&r=
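The building block of such methods is principal-components extraction of common factors from a returns panel. The sketch below is a single-layer PCA for illustration only; the paper's DPCA applies a second PCA layer to separate noise factors from efficient-price factors, and the function name and interface here are assumptions.

```python
import numpy as np

def pca_factors(returns, r):
    """Extract r common factors from a (T, N) panel of returns by PCA.
    Returns (factor series, loadings, fitted common component)."""
    X = returns - returns.mean(axis=0)        # center each asset's returns
    # eigen-decomposition of the N x N sample covariance
    vals, vecs = np.linalg.eigh(X.T @ X / len(X))
    load = vecs[:, ::-1][:, :r]               # loadings: top-r eigenvectors
    factors = X @ load                        # estimated factor series (T, r)
    common = factors @ load.T                 # fitted common component (T, N)
    return factors, load, common
```

Applied once to noisy high-frequency returns, a decomposition like this mixes price and noise factors, which is precisely the identification problem the dual factor structure is designed to resolve.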

This nep-ets issue is ©2021 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.