
on Econometric Time Series 
By:  Rasmus Pedersen (Department of Economics, University of Copenhagen); Olivier Wintenberger (LSTA, Laboratoire de Statistique Théorique et Appliquée, UPMC, Université Pierre et Marie Curie, Paris 6, CNRS; Department of Mathematical Sciences, University of Copenhagen) 
Abstract:  Conditions for geometric ergodicity of multivariate autoregressive conditional heteroskedasticity (ARCH) processes, with the so-called BEKK (Baba, Engle, Kraft, and Kroner) parametrization, are considered. We show for a class of BEKK-ARCH processes that the invariant distribution is regularly varying. In order to account for the possibility of different tail indices of the marginals, we consider the notion of vector scaling regular variation, in the spirit of Perfekt (1997, Advances in Applied Probability, 29, pp. 138-164). The characterization of the tail behavior of the processes is used for deriving the asymptotic properties of the sample covariance matrices. 
Keywords:  geometric ergodicity, asymptotic properties, stochastic recurrence equations, Markov processes, regular variation, multivariate ARCH 
Date:  2017 
URL:  http://d.repec.org/n?u=RePEc:hal:journl:hal01436267&r=ets 
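A minimal simulation sketch of the BEKK-ARCH recursion studied in the abstract above; the bivariate parameter values are purely illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters for a bivariate BEKK-ARCH(1) process:
# H_t = C C' + A x_{t-1} x_{t-1}' A',  x_t = H_t^{1/2} z_t,  z_t ~ N(0, I)
C = np.array([[0.5, 0.0], [0.2, 0.4]])   # lower-triangular intercept factor
A = np.array([[0.4, 0.1], [0.0, 0.3]])   # ARCH loading matrix

def simulate_bekk_arch(n, C, A, rng):
    d = C.shape[0]
    x = np.zeros((n, d))
    for t in range(1, n):
        # conditional covariance driven by the previous observation
        H = C @ C.T + A @ np.outer(x[t - 1], x[t - 1]) @ A.T
        L = np.linalg.cholesky(H)         # H_t^{1/2} via Cholesky factor
        x[t] = L @ rng.standard_normal(d)
    return x

x = simulate_bekk_arch(5000, C, A, rng)
print(x.shape)   # (5000, 2)
```

Simulated paths like this are what the regular-variation results characterize: the marginals of the invariant distribution have heavy (power-law) tails even though the innovations are Gaussian.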
By:  Boris Buchmann; Kevin W. Lu; Dilip B. Madan 
Abstract:  The weak variance-alpha-gamma process is a multivariate Lévy process constructed by weakly subordinating Brownian motion, possibly with correlated components, with an alpha-gamma subordinator. It generalises the variance-alpha-gamma process of Semeraro constructed by traditional subordination. We compare three parameter estimation methods for the weak variance-alpha-gamma process: the method of moments, maximum likelihood estimation (MLE), and digital moment estimation (DME). We derive a condition for Fourier invertibility needed to apply MLE and show in our simulations that MLE produces a better fit when this condition holds, while DME produces a better fit when it is violated. We also find that the weak variance-alpha-gamma process exhibits a wider range of dependence and produces a significantly better fit than the variance-alpha-gamma process on an S&P 500 and FTSE 100 data set, and that DME produces the best fit in this situation. 
Date:  2018–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1801.08852&r=ets 
By:  J. Eduardo Vera-Valdés 
Abstract:  The fractional difference operator remains the most popular mechanism for generating long memory, owing to the existence of efficient algorithms for its simulation and forecasting. Nonetheless, there is no theoretical argument linking the fractional difference operator with the presence of long memory in real data. In this regard, one of the most predominant theoretical explanations for the presence of long memory is cross-sectional aggregation of persistent micro units. Yet the type of process obtained by cross-sectional aggregation differs from the one due to fractional differencing. Thus, this paper develops fast algorithms to generate and forecast long memory by cross-sectional aggregation. Moreover, it is shown that the antipersistent phenomenon that arises for negative degrees of memory in the fractional difference literature is not present in cross-sectionally aggregated processes. Pointedly, while the autocorrelations of the fractional difference operator are negative for negative degrees of memory by construction, this restriction does not apply to the cross-sectional aggregation scheme. We show that this has implications for long memory tests in the frequency domain, which will be misspecified for cross-sectionally aggregated processes with negative degrees of memory. Finally, we assess the forecast performance of high-order $AR$ and $ARFIMA$ models when the long memory series are generated by cross-sectional aggregation. Our results are of interest to practitioners developing forecasts of long memory variables such as inflation, volatility, and climate data, where aggregation may be the source of long memory. 
Date:  2018–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1801.06677&r=ets 
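A minimal sketch of the Granger-style cross-sectional aggregation mechanism the abstract above builds on: average many AR(1) micro units whose squared coefficients are beta-distributed draws. The parameter choices are illustrative, and the mapping from the beta parameters to a memory degree is the standard aggregation result, not the paper's refinement:

```python
import numpy as np

rng = np.random.default_rng(2)

# N persistent AR(1) micro units with squared coefficients ~ Beta(alpha, beta);
# their cross-sectional average exhibits long-memory-type persistence.
N, T = 500, 2000
alpha, beta = 1.0, 1.4
phi = np.sqrt(rng.beta(alpha, beta, size=N))   # per-unit persistence in (0, 1)

eps = rng.standard_normal((T, N))
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = phi * x[t - 1] + eps[t]             # simulate all units at once

agg = x.mean(axis=1)                           # cross-sectional aggregate
print(agg.shape)                               # (2000,)
```

Because no fractional differencing is involved, the aggregate's autocorrelation structure is free of the built-in negativity that the fractional difference operator imposes for negative memory degrees, which is the distinction the paper exploits.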
By:  Douglas Patterson; Melvin Hinich; Denisa Roberts 
Abstract:  This article develops a statistical test for the null hypothesis of strict stationarity of a discrete-time stochastic process. When the null hypothesis is true, the second-order cumulant spectrum is zero at all the discrete Fourier frequency pairs present in the principal domain of the cumulant spectrum. The test uses a frame (window) averaged sample estimate of the second-order cumulant spectrum to build a test statistic that has an asymptotic complex standard normal distribution. We derive the test statistic, study the size and power properties of the test, and demonstrate its implementation with intraday stock market return data. The test has conservative size properties and good power to detect a varying variance and a unit root in the presence of varying variance. 
Date:  2018–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1801.06727&r=ets 
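A hedged illustration of the frame-averaging principle behind the test above: under strict stationarity, DFT coefficients of a frame at Fourier frequency indices j and k with j + k not equal to 0 (mod L) are asymptotically uncorrelated, so their frame-averaged product should be near zero, while variance that varies within the frame shows up at the matching bifrequency. This is a sketch of the principle only, not the authors' standardized test statistic:

```python
import numpy as np

rng = np.random.default_rng(3)

def frame_avg_product(x, L, j, k):
    # split x into frames of length L, take per-frame DFTs,
    # and average the product J(w_j) * J(w_k) across frames
    m = len(x) // L
    J = np.fft.fft(x[:m * L].reshape(m, L), axis=1) / np.sqrt(L)
    return (J[:, j] * J[:, k]).mean()

L, n_frames = 256, 512
n = L * n_frames
t = np.arange(n)
x_stat = rng.standard_normal(n)                       # strictly stationary
sigma2 = 1 + 0.8 * np.cos(2 * np.pi * 8 * t / L)      # variance cycling within frames
x_nons = np.sqrt(sigma2) * rng.standard_normal(n)     # time-varying variance

print(abs(frame_avg_product(x_stat, L, 3, 5)))  # near zero under stationarity
print(abs(frame_avg_product(x_nons, L, 3, 5)))  # picks up variance at j + k = 8
```

The paper's statistic standardizes such estimates so they are asymptotically complex standard normal under the null, giving a formal test rather than this informal comparison.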
By:  Hiroyuki Kasahara; Katsumi Shimotsu 
Abstract:  Markov regime switching models have been used in numerous empirical studies in economics and finance. However, the asymptotic distribution of the likelihood ratio test statistic for testing the number of regimes in Markov regime switching models has been an unresolved problem. This paper derives the asymptotic distribution of the likelihood ratio test statistic for testing the null hypothesis of $M_0$ regimes against the alternative hypothesis of $M_0 + 1$ regimes for any $M_0 \geq 1$ both under the null hypothesis and under local alternatives. We show that the contiguous alternatives converge to the null hypothesis at a rate of $n^{1/8}$ in regime switching models with normal density. The asymptotic validity of the parametric bootstrap is also established. 
Date:  2018–01 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1801.06862&r=ets 
By:  Emanuele Bacchiocchi; Andrea Bastianin; Alessandro Missale; Eduardo Rossi 
Abstract:  We develop a new VAR model for structural analysis with mixed-frequency data. The MIDAS-SVAR model makes it possible to identify structural dynamic links by exploiting the information contained in variables sampled at different frequencies. It also provides a general framework to test homogeneous-frequency representations against mixed-frequency data models. A set of Monte Carlo experiments suggests that the test performs well in terms of both size and power. The MIDAS-SVAR is then used to study how monetary policy and financial market volatility impact the dynamics of gross capital inflows to the US. While no relation is found when using standard quarterly data, exploiting the variability present in the series within the quarter shows that the effect of an interest rate shock is greater the longer the time lag between the month of the shock and the end of the quarter. 
Date:  2018–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1802.00793&r=ets 
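The core data-handling step in mixed-frequency VARs of this kind can be sketched as stacking the within-quarter monthly observations alongside the quarterly variable, so the system runs at the low frequency while keeping the high-frequency variation. The variable names and dimensions are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical data: a monthly series (e.g. an interest rate) and a
# quarterly series (e.g. gross capital inflows), over 40 quarters.
n_quarters = 40
monthly = rng.standard_normal(3 * n_quarters)
quarterly = rng.standard_normal(n_quarters)

# Stack each quarter's three monthly observations with the quarterly
# observation into one vector: [month 1, month 2, month 3, quarterly var].
stacked = np.column_stack([monthly.reshape(n_quarters, 3), quarterly])
print(stacked.shape)   # (40, 4)
```

Because each month of the quarter occupies its own slot in the stacked vector, a shock dated to month 1 versus month 3 can have different within-quarter effects, which is exactly the variation the paper's empirical finding relies on.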
By:  Fischer, Thomas; Krauss, Christopher; Treichel, Alex 
Abstract:  We present a comprehensive simulation study to assess and compare the performance of popular machine learning algorithms for time series prediction tasks. Specifically, we consider the following algorithms: multilayer perceptron (MLP), logistic regression, naïve Bayes, k-nearest neighbors, decision trees, random forests, and gradient-boosting trees. These models are applied to time series from eight data generating processes (DGPs), reflecting different linear and nonlinear dependencies (base case). Additional complexity is introduced by adding discontinuities and varying degrees of noise. Our findings reveal that advanced machine learning models are capable of approximating the optimal forecast very closely in the base case, with nonlinear models in the lead across all DGPs, particularly the MLP. By contrast, logistic regression is remarkably robust in the presence of noise, thus yielding the most favorable accuracy metrics on raw data, prior to preprocessing. When introducing adequate preprocessing techniques, such as first differencing and the local outlier factor, the picture is reversed, and the MLP as well as other nonlinear techniques once again become the modeling techniques of choice. 
Date:  2018 
URL:  http://d.repec.org/n?u=RePEc:zbw:iwqwdp:022018&r=ets 
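A minimal sketch in the spirit of the study above: generate a nonlinear autoregressive DGP, frame prediction as classifying the sign of the next observation from lagged values, and compare logistic regression against an MLP. The DGP, lag structure, and split are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)

# Simple nonlinear AR(1)-type DGP: x_t = 0.6 * sin(x_{t-1}) + 0.5 * eps_t
n, lags = 3000, 3
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * np.sin(x[t - 1]) + 0.5 * rng.standard_normal()

# Features: the previous `lags` values; target: sign of the next value.
X = np.column_stack([x[lags - i - 1:n - i - 1] for i in range(lags)])
y = (x[lags:] > 0).astype(int)
split = len(y) * 2 // 3                        # chronological train/test split
Xtr, Xte, ytr, yte = X[:split], X[split:], y[:split], y[split:]

logit = LogisticRegression().fit(Xtr, ytr)
mlp = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                    random_state=0).fit(Xtr, ytr)
print(logit.score(Xte, yte), mlp.score(Xte, yte))
```

The paper's finding is that which model wins depends heavily on noise and preprocessing; a single DGP like this one only illustrates the comparison pipeline, not the ranking.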
By:  Baumeister, Christiane; Hamilton, James 
Abstract:  Traditional approaches to structural vector autoregressions can be viewed as special cases of Bayesian inference arising from very strong prior beliefs. These methods can be generalized with a less restrictive formulation that incorporates uncertainty about the identifying assumptions themselves. We use this approach to revisit the importance of shocks to oil supply and demand. Supply disruptions turn out to be a bigger factor in historical oil price movements and inventory accumulation a smaller factor than implied by earlier estimates. Supply shocks lead to a reduction in global economic activity after a significant lag, whereas shocks to oil demand do not. 
Keywords:  Bayesian inference; Measurement error; oil prices; sign restrictions; vector autoregressions 
JEL:  C32 E32 Q43 
Date:  2017–12 
URL:  http://d.repec.org/n?u=RePEc:cpr:ceprdp:12532&r=ets 
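A hedged sketch of the standard sign-restriction step that the Bayesian approach above generalizes: given a reduced-form covariance matrix, draw random orthogonal rotations of its Cholesky factor and keep the impact matrices whose columns satisfy the postulated signs. The bivariate covariance and the sign pattern (a shock moving the two variables in opposite directions) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(6)

Sigma = np.array([[1.0, 0.4], [0.4, 1.5]])   # illustrative reduced-form covariance
P = np.linalg.cholesky(Sigma)

accepted = []
for _ in range(1000):
    # uniform draw from the orthogonal group via QR with sign normalization
    Q, R = np.linalg.qr(rng.standard_normal((2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))
    B = P @ Q                                # candidate impact matrix, B B' = Sigma
    if B[0, 0] > 0 and B[1, 0] < 0:          # sign restriction on the first shock
        accepted.append(B)

print(len(accepted) > 0)   # True: some rotations satisfy the restrictions
```

In the paper's framework, dogmatic accept/reject restrictions like these are replaced by explicit priors over the structural parameters, so uncertainty about the identifying assumptions themselves propagates into the posterior.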