nep-ets New Economics Papers
on Econometric Time Series
Issue of 2020‒09‒07
ten papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. How to estimate a VAR after March 2020 By Lenza, Michele; Primiceri, Giorgio E.
  2. Stationarity and ergodicity of Markov switching positive conditional mean models By Aknouche, Abdelhakim; Francq, Christian
  3. S-ARMA model and Wold decomposition for covariance stationary interval-valued time series processes By Jules Sadefo Kamdem; Babel Raïssa Guemdjo Kamdem; Carlos Ougouyandjou
  4. Long vs Short Time Scales: the Rough Dilemma and Beyond By Matthieu Garcin; Martino Grasselli
  5. Distributed ARIMA Models for Ultra-long Time Series By Xiaoqian Wang; Yanfei Kang; Rob J Hyndman; Feng Li
  6. Time Series Analysis of COVID-19 Infection Curve: A Change-Point Perspective By Feiyu Jiang; Zifeng Zhao; Xiaofeng Shao
  7. A Novel Approach to Predictive Accuracy Testing in Nested Environments By Jean-Yves Pitarakis
  8. Volatility model calibration with neural networks a comparison between direct and indirect methods By Dirk Roeder; Georgi Dimitroff
  9. Neural Network-based Automatic Factor Construction By Jie Fang; Jianwu Lin; Shutao Xia; Yong Jiang; Zhikang Xia; Xiang Liu
  10. Spectral Analysis of Multivariate Time Series By von Sachs, Rainer

  1. By: Lenza, Michele; Primiceri, Giorgio E.
    Abstract: This paper illustrates how to handle a sequence of extreme observations—such as those recorded during the COVID-19 pandemic—when estimating a Vector Autoregression, which is the most popular time-series model in macroeconomics. Our results show that the ad-hoc strategy of dropping these observations may be acceptable for the purpose of parameter estimation. However, disregarding these recent data is inappropriate for forecasting the future evolution of the economy, because it vastly underestimates uncertainty. JEL Classification: C32, E32, E37, C11
    Keywords: COVID-19, density forecasts, outliers, volatility
    Date: 2020–08
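For intuition on the ad-hoc strategy the abstract discusses, here is a minimal sketch of estimating a VAR(1) by OLS on simulated data, with and without a few extreme tail observations. The data, the dimension, and the 4-observation "pandemic" window are illustrative assumptions, not the authors' dataset or procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1): y_t = A y_{t-1} + e_t
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
T = 200
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(size=2)

# Contaminate the last few observations with extreme shocks,
# mimicking a short run of pandemic-era outliers
y[-4:] += rng.normal(scale=20.0, size=(4, 2))

def ols_var1(y):
    """OLS estimate of the VAR(1) coefficient matrix from a T x k sample."""
    X, Y = y[:-1], y[1:]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return B.T  # rows of B.T map y_{t-1} to y_t

A_full = ols_var1(y)       # keep the extreme observations
A_trim = ols_var1(y[:-4])  # drop them, the ad-hoc strategy
```

Dropping the tail recovers the pre-crisis dynamics, which matches the paper's point that trimming can be acceptable for parameter estimation; the paper's further point is that forecasts built this way understate uncertainty.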
  2. By: Aknouche, Abdelhakim; Francq, Christian
    Abstract: A general Markov-Switching autoregressive conditional mean model, valued in the set of nonnegative numbers, is considered. The conditional distribution of this model is a finite mixture of nonnegative distributions whose conditional mean follows a GARCH-like dynamics with parameters depending on the state of a Markov chain. Three different variants of the model are examined depending on how the lagged-values of the mixing variable are integrated into the conditional mean equation. The model includes, in particular, Markov mixture versions of various well-known nonnegative time series models such as the autoregressive conditional duration (ACD) model, the integer-valued GARCH (INGARCH) model, and the Beta observation driven model. Under contraction in mean conditions, it is shown that the three variants of the model are stationary and ergodic when the stochastic order and the mean order of the mixing distributions are equal. The proposed conditions match those already known for Markov-switching GARCH models. We also give conditions for finite marginal moments. Applications to various mixture and Markov mixture count, duration and proportion models are provided.
    Keywords: Autoregressive Conditional Duration, Count time series models, finite mixture models, Ergodicity, Integer-valued GARCH, Markov mixture models.
    JEL: C10 C18 C22 C25
    Date: 2020–08–18
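The model class in the abstract can be illustrated by simulating a two-regime Markov mixture INGARCH-type process, where a Poisson count has a GARCH-like conditional mean whose parameters switch with a Markov chain. The regime parameters below are made up for the sketch and satisfy the contraction-in-mean condition (alpha + beta < 1 in each regime); this is not the authors' estimation procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

P = np.array([[0.95, 0.05],   # Markov-chain transition matrix
              [0.10, 0.90]])
omega = np.array([1.0, 3.0])  # regime-dependent intercepts
alpha = np.array([0.3, 0.5])  # loading on the lagged count
beta  = np.array([0.2, 0.3])  # loading on the lagged conditional mean

T = 500
s = np.zeros(T, dtype=int)    # regime path
lam = np.zeros(T)             # conditional mean
y = np.zeros(T, dtype=int)    # observed counts
lam[0] = omega[0]
y[0] = rng.poisson(lam[0])
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
    # GARCH-like conditional mean with state-dependent parameters
    lam[t] = omega[s[t]] + alpha[s[t]] * y[t - 1] + beta[s[t]] * lam[t - 1]
    y[t] = rng.poisson(lam[t])
```

With alpha + beta equal to 0.5 and 0.8 in the two regimes, the contraction condition holds and the simulated counts stay stable, consistent with the stationarity results the paper establishes.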
  3. By: Jules Sadefo Kamdem (MRE - Montpellier Recherche en Economie - UM - Université de Montpellier); Babel Raïssa Guemdjo Kamdem (IMSP - Institut de Mathématiques et de Sciences Physiques [Bénin] (Université d’Abomey-Calavi (UAC))); Carlos Ougouyandjou
    Abstract: The main purpose of this work is to contribute to the study of set-valued random variables by providing a kind of Wold decomposition theorem for interval-valued processes. As the set of set-valued random variables is not a vector space, the Wold decomposition theorem as established in 1938 by Herman Wold is not applicable to them. So, a notion of pseudo-vector space is introduced and used to establish a generalization of the Wold decomposition theorem that works for interval-valued covariance stationary time series processes. Before this, a set-valued autoregressive and moving average (S-ARMA) time series process is defined by taking into account an arithmetical difference between random sets and random real variables.
    Keywords: Wold decomposition, stationary time series, interval-valued time series processes, ARMA model
    Date: 2020
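For reference, the classical result the paper generalizes is Wold's 1938 decomposition: any zero-mean covariance-stationary scalar process splits into a linearly regular moving-average part and a deterministic part,

```latex
X_t \;=\; \sum_{j=0}^{\infty} \psi_j \,\varepsilon_{t-j} \;+\; \eta_t,
\qquad \psi_0 = 1,\quad \sum_{j=0}^{\infty} \psi_j^2 < \infty,
```

where $\varepsilon_t$ is white noise and $\eta_t$ is deterministic (perfectly predictable from its own past). The paper's contribution is extending this statement to interval-valued processes, which do not form a vector space.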
  4. By: Matthieu Garcin; Martino Grasselli
    Abstract: Using a large dataset on major stock indexes and FX rates, we test the robustness of the rough fractional volatility model over different time scales. We include the estimation error as well as the microstructure noise into the analysis. Our findings lead to new stylized facts regarding the volatility that are not described by models introduced so far: in the fractal analysis using the absolute moment approach, log-log plots are nonlinear and reveal very low perceived Hurst exponents at small scales, consistent with the rough framework, and higher perceived Hurst exponents for larger scales, along with stationarity of the volatility. These results, obtained for time series of realized volatilities, are confirmed by another measure of volatility, namely Parkinson's volatility, taking into account its specificities regarding measurement errors.
    Date: 2020–08
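The "perceived Hurst exponent" at different scales is central to the abstract. A minimal sketch of the absolute-moment approach it mentions: the mean absolute increment of a process scales as tau^H, so the slope of a log-log regression estimates H. The example runs on simulated Brownian motion (true H = 0.5), not on the realized-volatility series the authors study.

```python
import numpy as np

rng = np.random.default_rng(2)

# Brownian motion has Hurst exponent H = 0.5
x = np.cumsum(rng.normal(size=100_000))

# Absolute-moment method: E|x_{t+tau} - x_t| ~ c * tau^H,
# so log(mean |increment|) is linear in log(tau) with slope H
taus = np.array([1, 2, 4, 8, 16, 32, 64])
m = np.array([np.mean(np.abs(x[tau:] - x[:-tau])) for tau in taus])
H, _ = np.polyfit(np.log(taus), np.log(m), 1)
```

The paper's finding is that for real volatility data this log-log plot is not linear: the fitted slope is low at small scales (roughness) and higher at large scales, which a single-H model cannot reproduce.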
  5. By: Xiaoqian Wang; Yanfei Kang; Rob J Hyndman; Feng Li
    Abstract: Providing forecasts for ultra-long time series plays a vital role in various activities, such as investment decisions, industrial production arrangements, and farm management. This paper develops a novel distributed forecasting framework to tackle challenges associated with forecasting ultra-long time series by utilizing the industry-standard MapReduce framework. The proposed model combination approach facilitates distributed time series forecasting by combining the local estimators of ARIMA (AutoRegressive Integrated Moving Average) models delivered from worker nodes and minimizing a global loss function. In this way, instead of unrealistically assuming the data generating process (DGP) of an ultra-long time series stays invariant, we make assumptions only on the DGP of subseries spanning shorter time periods. We investigate the performance of the proposed distributed ARIMA models on an electricity demand dataset. Compared to ARIMA models, our approach results in significantly improved forecasting accuracy and computational efficiency both in point forecasts and prediction intervals, especially for longer forecast horizons. Moreover, we explore some potential factors that may affect the forecasting performance of our approach.
    Keywords: ultra-long time series, distributed forecasting, ARIMA models, least squares approximation, MapReduce
    Date: 2020
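The split/estimate/combine idea behind the framework can be sketched crudely with an AR(1) fitted by least squares on each chunk of a long series, then combined. The simple average used here is only a stand-in for the paper's combination step, which minimizes a global loss over full ARIMA estimators; series, chunk count, and model order are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# One "ultra-long" AR(1) series: y_t = 0.7 y_{t-1} + e_t
phi_true = 0.7
T = 40_000
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + e[t]

def fit_ar1(seg):
    """Least-squares AR(1) coefficient on one subseries (one 'worker')."""
    return float(np.dot(seg[:-1], seg[1:]) / np.dot(seg[:-1], seg[:-1]))

# Map step: estimate locally on each chunk; reduce step: combine
chunks = np.array_split(y, 8)
local = [fit_ar1(c) for c in chunks]
phi_hat = float(np.mean(local))  # stand-in for the global-loss combination
```

Each worker only needs its own subseries, so no single node ever holds the full history, which is the practical point of the MapReduce formulation.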
  6. By: Feiyu Jiang; Zifeng Zhao; Xiaofeng Shao
    Abstract: In this paper, we model the trajectory of the cumulative confirmed cases and deaths of COVID-19 (in log scale) via a piecewise linear trend model. The model naturally captures the phase transitions of the epidemic growth rate via change-points and further enjoys great interpretability due to its semiparametric nature. On the methodological front, we advance the nascent self-normalization (SN) technique (Shao, 2010) to testing and estimation of a single change-point in the linear trend of a nonstationary time series. We further combine the SN-based change-point test with the NOT algorithm (Baranowski et al., 2019) to achieve multiple change-point estimation. Using the proposed method, we analyze the trajectory of the cumulative COVID-19 cases and deaths for 30 major countries and discover interesting patterns with potentially relevant implications for effectiveness of the pandemic responses by different countries. Furthermore, based on the change-point detection algorithm and a flexible extrapolation function, we design a simple two-stage forecasting scheme for COVID-19 and demonstrate its promising performance in predicting cumulative deaths in the U.S.
    Date: 2020–07
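A toy version of the piecewise linear trend model in the abstract: fit a continuous linear trend with one kink by least squares at every candidate break and pick the break minimizing the residual sum of squares. This grid-search sketch is neither the paper's self-normalized test nor the NOT algorithm it combines with; the simulated series and break location are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Piecewise linear trend with a slope change at t = 120, plus noise
T = 240
t = np.arange(T, dtype=float)
true_cp = 120
trend = 0.30 * t - 0.25 * np.maximum(t - true_cp, 0.0)
y = trend + rng.normal(scale=0.5, size=T)

def sse_with_break(y, cp):
    """RSS of a continuous piecewise linear fit with a kink at cp."""
    t = np.arange(len(y), dtype=float)
    X = np.column_stack([np.ones(len(y)), t, np.maximum(t - cp, 0.0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

# Grid search over candidate change-points, trimming the edges
candidates = range(20, T - 20)
cp_hat = min(candidates, key=lambda cp: sse_with_break(y, cp))
```

In the COVID-19 application, such kinks in the log cumulative count correspond to shifts in the epidemic growth rate, which is what makes the change-points interpretable.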
  7. By: Jean-Yves Pitarakis
    Abstract: We introduce a new approach for comparing the predictive accuracy of two nested models that bypasses the difficulties caused by the degeneracy of the asymptotic variance of forecast error loss differentials used in the construction of commonly used predictive comparison statistics. Our approach continues to rely on the out of sample MSE loss differentials between the two competing models, leads to nuisance parameter free Gaussian asymptotics and is shown to remain valid under flexible assumptions that can accommodate heteroskedasticity and the presence of mixed predictors (e.g. stationary and local to unit root). A local power analysis also establishes its ability to detect departures from the null in both stationary and persistent settings. Simulations calibrated to common economic and financial applications indicate that our methods have strong power with good size control across commonly encountered sample sizes.
    Date: 2020–08
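As background for the abstract, the commonly used statistic it improves upon is a t-ratio on out-of-sample loss differentials in the style of Diebold and Mariano; under nested models its asymptotic variance degenerates, which is the difficulty the paper bypasses. The sketch below is the classic statistic on simulated squared-error losses, not the paper's proposed test.

```python
import numpy as np

rng = np.random.default_rng(5)

# Out-of-sample forecast errors of two competing models
n = 300
e1 = rng.normal(scale=1.0, size=n)  # model 1 (more accurate)
e2 = rng.normal(scale=1.2, size=n)  # model 2 (less accurate)

# Squared-error loss differential series
d = e1**2 - e2**2

# Diebold-Mariano-type t-statistic on the mean loss differential
dm = float(np.sqrt(n) * d.mean() / d.std(ddof=1))
```

A clearly negative statistic favors model 1. When the models are nested, d collapses toward zero under the null and this ratio loses its standard Gaussian limit, motivating the nuisance-parameter-free construction in the paper.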
  8. By: Dirk Roeder; Georgi Dimitroff
    Abstract: In a recent paper, "Deep Learning Volatility", a fast two-step deep calibration algorithm for rough volatility models was proposed: in the first step the time-consuming mapping from the model parameters to the implied volatilities is learned by a neural network, and in the second step standard solver techniques are used to find the best model parameters. In our paper we compare these results with an alternative direct approach, where the mapping from market implied volatilities to model parameters is approximated by the neural network, without the need for an extra solver step. Using a whitening procedure and a projection of the target parameters to [0,1], in order to be able to use a sigmoid-type output function, we found that the direct approach outperforms the two-step one for the data sets and methods published in "Deep Learning Volatility". For our implementation we use the open-source TensorFlow 2 library. The paper should be understood as a technical comparison of neural network techniques and not as a methodologically new ansatz.
    Date: 2020–07
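The two preprocessing steps the abstract names, whitening the inputs and projecting targets to [0,1] for a sigmoid output layer, can be sketched with NumPy alone. The toy parameter matrix below is an assumption for illustration; the authors apply these steps to implied-volatility surfaces and model parameters inside a TensorFlow pipeline.

```python
import numpy as np

rng = np.random.default_rng(6)

# Toy "model parameters": 500 samples of 3 correlated quantities
# with very different scales
mix = np.array([[2.0, 0.5, 0.0],
                [0.0, 1.0, 0.3],
                [0.0, 0.0, 0.1]])
theta = rng.normal(size=(500, 3)) @ mix

# Min-max projection of the targets to [0, 1], matching a
# sigmoid-type output activation
lo, hi = theta.min(axis=0), theta.max(axis=0)
theta01 = (theta - lo) / (hi - lo)

# Whitening of the inputs: zero mean, identity sample covariance
x_c = theta - theta.mean(axis=0)
cov = np.cov(x_c, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
white = (x_c @ evecs) / np.sqrt(evals)
```

Both transforms are invertible, so predictions in the [0,1] box can be mapped back to the original parameter scale after training.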
  9. By: Jie Fang; Jianwu Lin; Shutao Xia; Yong Jiang; Zhikang Xia; Xiang Liu
    Abstract: Instead of conducting manual factor construction based on traditional and behavioural finance analysis, academic researchers and quantitative investment managers have leveraged Genetic Programming (GP) as an automatic feature construction tool in recent years, which builds reverse Polish mathematical expressions from trading data into new factors. However, with the development of deep learning, more powerful feature extraction tools are available. This paper proposes Neural Network-based Automatic Factor Construction (NNAFC), a tailored neural network framework that can automatically construct diversified financial factors based on financial domain knowledge and a variety of neural network structures. The experiment results show that NNAFC can construct more informative and diversified factors than GP, effectively enriching the current factor pool. For the current market, both fully connected and recurrent neural network structures are better at extracting information from financial time series than convolutional neural network structures. Moreover, new factors constructed by NNAFC consistently improve the return, Sharpe ratio, and maximum drawdown of a multi-factor quantitative investment strategy, because they introduce more information and diversification into the existing factor pool.
    Date: 2020–08
  10. By: von Sachs, Rainer
    Date: 2019–01–01

This nep-ets issue is ©2020 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.