NEP: New Economics Papers on Econometric Time Series
Issue of 2019‒12‒16
fifteen papers chosen by Jaqueson K. Galimberti (Auckland University of Technology)
By: | Schnaubelt, Matthias |
Abstract: | Machine learning is increasingly applied to time series data, as it constitutes an attractive alternative to forecasts based on traditional time series models. For independent and identically distributed observations, cross-validation is the prevalent scheme for estimating out-of-sample performance in both model selection and assessment. For time series data, however, it is unclear whether forward-validation schemes, i.e., schemes that keep the temporal order of observations, should be preferred. In this paper, we perform a comprehensive empirical study of eight common validation schemes. We introduce a study design that perturbs global stationarity by introducing a slow evolution of the underlying data-generating process. Our results demonstrate that, even for relatively small perturbations, commonly used cross-validation schemes often yield estimates with the largest bias and variance, and forward-validation schemes yield better estimates of the out-of-sample error. We provide an interpretation of these results in terms of an additional evolution-induced bias and the sample-size-dependent estimation error. Using a large-scale financial data set, we demonstrate the practical significance in a replication study of a statistical arbitrage problem. We conclude with some general guidelines on the selection of suitable validation schemes for time series data. |
Keywords: | machine learning,model selection,model validation,time series,cross-validation |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:zbw:iwqwdp:112019&r=all |
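To make the contrast above concrete, here is a minimal, self-contained sketch (not the paper's study design) comparing a shuffled K-fold cross-validation estimate with a forward-validation (expanding-window) estimate on a simulated AR(1) process whose coefficient drifts slowly, mimicking a perturbed-stationarity setting; the data-generating process, model and evaluation choices are all illustrative assumptions.

```python
# Illustrative comparison of cross-validation vs. forward-validation error
# estimates on a slowly drifting AR(1) process (a simplified sketch, not the
# study design of Schnaubelt, 2019).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold, TimeSeriesSplit
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
T = 600
phi = np.linspace(0.3, 0.8, T)        # slowly evolving AR coefficient (the "perturbation")
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi[t] * y[t - 1] + rng.normal()

X, target = y[:-1].reshape(-1, 1), y[1:]   # one-lag design matrix

def mean_validation_error(splitter):
    errors = []
    for train, test in splitter.split(X):
        model = LinearRegression().fit(X[train], target[train])
        errors.append(mean_squared_error(target[test], model.predict(X[test])))
    return np.mean(errors)

print("5-fold CV estimate      :", mean_validation_error(KFold(n_splits=5, shuffle=True, random_state=0)))
print("forward-validation est. :", mean_validation_error(TimeSeriesSplit(n_splits=5)))

# Rough "true" out-of-sample error: fit on the full sample, evaluate on freshly
# simulated future data generated with the end-of-sample coefficient.
model = LinearRegression().fit(X, target)
y_new = np.zeros(200); y_new[0] = y[-1]
for t in range(1, 200):
    y_new[t] = phi[-1] * y_new[t - 1] + rng.normal()
print("out-of-sample MSE       :",
      mean_squared_error(y_new[1:], model.predict(y_new[:-1].reshape(-1, 1))))
```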
By: | Zhishui Hu (University of Science and Technology of China); Peter C.B. Phillips (Cowles Foundation, Yale University); Qiying Wang (School of Mathematics and Statistics, The University of Sydney) |
Abstract: | This paper develops an asymptotic theory for nonlinear cointegrating power function regression. The framework extends earlier work on the deterministic trend case and allows for both endogeneity and heteroskedasticity, which makes the models and inferential methods relevant to many empirical economic and financial applications, including predictive regression. Accompanying the asymptotic theory of nonlinear regression, the paper establishes some new results on weak convergence to stochastic integrals that go beyond the usual semi-martingale structure and considerably extend existing limit theory, complementing other recent findings on stochastic integral asymptotics. The paper also provides a general framework for extremum estimation limit theory that encompasses stochastically nonstationary time series and should be of wide applicability. |
Keywords: | Nonlinear power regression, Least squares estimation, Nonstationarity, Endogeneity, Heteroscedasticity |
JEL: | C13 C22 |
Date: | 2019–12 |
URL: | http://d.repec.org/n?u=RePEc:cwl:cwldpp:2211&r=all |
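Only the regression setup, not the asymptotic theory, lends itself to a short sketch: below, a power-function regression with an I(1) regressor is estimated by nonlinear least squares via scipy's curve_fit. The functional form, exogenous Gaussian errors and starting values are illustrative assumptions and do not reflect the endogeneity/heteroskedasticity settings the paper covers.

```python
# Sketch of a power-function regression y_t = a + b * |x_t|^g + u_t with a
# nonstationary (random-walk) regressor, estimated by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(8)
T = 500
x = rng.standard_normal(T).cumsum()                 # I(1) regressor
a, b, g = 1.0, 0.5, 1.5
y = a + b * np.abs(x) ** g + rng.standard_normal(T)

def power_reg(x, a, b, g):
    return a + b * np.abs(x) ** g

params, _ = curve_fit(power_reg, x, y, p0=[0.0, 1.0, 1.0], maxfev=5000)
print("NLS estimates (a, b, g):", params.round(3))
```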
By: | Giuseppe Cavaliere; Iliyan Georgiev |
Abstract: | Asymptotic bootstrap validity is usually understood as consistency of the distribution of a bootstrap statistic, conditional on the data, for the unconditional limit distribution of a statistic of interest. From this perspective, randomness of the limit bootstrap measure is regarded as a failure of the bootstrap. We show that such limiting randomness does not necessarily invalidate bootstrap inference if validity is understood as control over the frequency of correct inferences in large samples. We first establish sufficient conditions for asymptotic bootstrap validity in cases where the unconditional limit distribution of a statistic can be obtained by averaging a (random) limiting bootstrap distribution. Further, we provide results ensuring the asymptotic validity of the bootstrap as a tool for conditional inference, the leading case being that where a bootstrap distribution consistently estimates a conditional (and thus random) limit distribution of a statistic. We apply our framework to several inference problems in econometrics, including linear models with possibly non-stationary regressors, functional CUSUM statistics, conditional Kolmogorov-Smirnov specification tests, the 'parameter on the boundary' problem and tests for constancy of parameters in dynamic econometric models. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.12779&r=all |
By: | Schnücker, A.M. |
Abstract: | This paper proposes LASSO estimation specific to panel vector autoregressive (PVAR) models. The penalty term allows for shrinkage for different lags, for shrinkage towards homogeneous coefficients across panel units, for penalization of lags of variables belonging to another cross-sectional unit, and for varying penalization across equations. The penalty parameters therefore build on time series and cross-sectional properties that are commonly found in PVAR models. Simulation results point towards advantages of using the proposed LASSO for PVAR models over ordinary least squares in terms of forecast accuracy. An empirical forecasting application with five countries supports these findings. |
Keywords: | Model selection, multi-country model, shrinkage estimation |
JEL: | C13 C32 C33 |
Date: | 2019–11–01 |
URL: | http://d.repec.org/n?u=RePEc:ems:eureir:122072&r=all |
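As a rough illustration of the idea, the sketch below estimates each equation of a small simulated panel VAR by a plain LASSO. The structured, lag-, unit- and equation-specific penalty proposed in the paper is not implemented; scikit-learn's LassoCV is used as a generic stand-in, and all dimensions are made up for the example.

```python
# Equation-by-equation LASSO estimation of a stacked panel VAR(2) on simulated data.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(1)
N_units, K_vars, T, p = 5, 2, 200, 2          # 5 countries, 2 variables per country, 2 lags
K = N_units * K_vars                          # dimension of the stacked system
A1 = 0.4 * np.eye(K) + 0.05 * rng.standard_normal((K, K))
Y = np.zeros((T, K))
for t in range(1, T):
    Y[t] = Y[t - 1] @ A1.T + rng.standard_normal(K)

# Build the lagged regressor matrix [Y_{t-1}, ..., Y_{t-p}]
Xlags = np.hstack([Y[p - l:T - l] for l in range(1, p + 1)])
Ydep = Y[p:]

coefs = np.vstack([
    LassoCV(cv=5).fit(Xlags, Ydep[:, k]).coef_     # one LASSO per equation
    for k in range(K)
])
print("share of coefficients shrunk exactly to zero:",
      np.mean(coefs == 0).round(2))
```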
By: | Annalisa Cadonna; Sylvia Frühwirth-Schnatter; Peter Knaus
Abstract: | Time-varying parameter (TVP) models are very flexible in capturing gradual changes in the effect of a predictor on the outcome variable. However, in particular when the number of predictors is large, there is a known risk of overfitting and poor predictive performance, since the effect of some predictors is constant over time. We propose a prior for variance shrinkage in TVP models, called the triple gamma. The triple gamma prior encompasses a number of priors that have been suggested previously, such as the Bayesian lasso, the double gamma prior and the horseshoe prior. We present the desirable properties of such a prior and its relationship to Bayesian model averaging for variance selection. The features of the triple gamma prior are then illustrated in the context of time-varying parameter vector autoregressive models, both for simulated datasets and for a series of macroeconomic variables in the Euro Area. |
Date: | 2019–12 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1912.03100&r=all |
By: | Stephan Smeekes; Etienne Wijler |
Abstract: | We investigate how the possible presence of unit roots and cointegration affects forecasting with Big Data. As most macroeconomic time series are very persistent and may contain unit roots, a proper handling of unit roots and cointegration is of paramount importance for macroeconomic forecasting. The high-dimensional nature of Big Data complicates the analysis of unit roots and cointegration in two ways. First, transformations to stationarity require performing many unit root tests, increasing room for errors in the classification. Second, modelling unit roots and cointegration directly is more difficult, as standard high-dimensional techniques such as factor models and penalized regression are not directly applicable to (co)integrated data and need to be adapted. We provide an overview of both issues and review the methods proposed to address them. These methods are also illustrated with two empirical applications. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.10552&r=all |
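The "transformation to stationarity" step mentioned above can be illustrated with a naive sketch: run an ADF test on each series of a simulated panel and difference the ones for which the unit-root null is not rejected. The 5% cut-off and statsmodels' adfuller defaults are illustrative choices, not the multiple-testing-aware procedures the paper surveys.

```python
# Naive series-by-series unit-root screening of a high-dimensional panel.
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(2)
T, n = 300, 20
# Half the series are random walks (unit roots), half are stationary AR(1).
walks = rng.standard_normal((T, n // 2)).cumsum(axis=0)
stat = np.zeros((T, n // 2))
for t in range(1, T):
    stat[t] = 0.5 * stat[t - 1] + rng.standard_normal(n // 2)
panel = pd.DataFrame(np.hstack([walks, stat]),
                     columns=[f"x{i}" for i in range(n)])

transformed = {}
for col in panel:
    pval = adfuller(panel[col], autolag="AIC")[1]
    # difference when the unit-root null is not rejected at the 5% level
    transformed[col] = panel[col].diff().dropna() if pval > 0.05 else panel[col]

n_diffed = sum(len(s) == T - 1 for s in transformed.values())
print(f"{n_diffed} of {n} series were differenced")
```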
By: | Alain Hecq; Elisa Voisin |
Abstract: | This paper investigates oil price series using mixed causal-noncausal autoregressive (MAR) models, namely dynamic processes that depend not only on their lags but also on their leads. MAR models have been successfully applied to commodity prices as they can generate nonlinear features such as speculative bubbles. We estimate the probabilities that bubbles in oil price series burst once the series enter an explosive phase. To do so we first evaluate how to adequately detrend nonstationary oil price series while preserving the bubble patterns observed in the raw data. The impact of different filters on the identification of MAR models as well as on forecasting bubble events is investigated using Monte Carlo simulations. We illustrate our findings on monthly WTI and Brent series. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.10916&r=all |
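For intuition on the noncausal ingredient of MAR models, the sketch below simulates a purely noncausal AR(1) with heavy-tailed errors by backward recursion; such paths display the locally explosive, bubble-like episodes referred to above. The coefficient, error distribution and crude "explosive phase" flag are illustrative assumptions, not the paper's estimation or forecasting procedure.

```python
# Purely noncausal AR(1) ("MAR(0,1)") path, x_t = psi * x_{t+1} + eps_t,
# simulated by backward recursion with heavy-tailed (Cauchy) errors.
import numpy as np

rng = np.random.default_rng(3)
T, psi = 400, 0.8
eps = rng.standard_cauchy(T)          # heavy tails drive locally explosive episodes
x = np.zeros(T)
for t in range(T - 2, -1, -1):        # backward recursion: x_t depends on its lead
    x[t] = psi * x[t + 1] + eps[t]

# A crude "explosive phase" flag: the last observation exceeds a high quantile.
threshold = np.quantile(np.abs(x), 0.95)
print("currently in an explosive phase:", abs(x[-1]) > threshold)
```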
By: | Fabio Franco (University of Rome "Tor Vergata") |
Abstract: | Particle filtering is a useful statistical tool for making inference on the latent variables and the structural parameters of state space models by embedding it inside MCMC algorithms (Flury and Shephard, 2011). It relies on only two assumptions (Gordon et al., 1993): (a) the ability to simulate from the dynamics of the model; (b) the predictive measurement density can be computed. In practice the second assumption may not be obvious, and implementations of the particle filter can become difficult. Gallant, Giacomini and Ragusa (2016) have recently developed a particle filter which does not rely on the structural form of the measurement equation. This method uses a set of moment conditions to induce the likelihood function of a structural model under a GMM criterion. The semiparametric structure allows particle filtering to be used where the standard techniques are not applicable or are difficult to implement. On the other hand, the GMM representation is less efficient than the standard technique, and in some cases this can affect the proper functioning of the particle filter and in turn deliver poor estimates. The contribution of this paper is to compare the standard techniques, such as the Kalman filter and the standard bootstrap particle filter, with the method proposed by Gallant et al. (2016) in order to measure the performance of the particle filter with the GMM representation. |
Keywords: | Bootstrap particle filter, GMM likelihood representation, Metropolis-Hastings algorithm, Kalman filter, nonlinear/non-Gaussian state space models. |
JEL: | C4 C8 |
Date: | 2019–12–04 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:477&r=all |
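As a baseline for the comparison described above, the sketch below runs the two standard techniques, a Kalman filter and a bootstrap particle filter, on a toy linear Gaussian state space model where the Kalman filter is exact. The GMM-based likelihood representation of Gallant et al. (2016) is not implemented, and all model parameters are illustrative.

```python
# Kalman filter vs. bootstrap particle filter on a linear Gaussian state space model:
# x_t = phi * x_{t-1} + w_t,  y_t = x_t + v_t.
import numpy as np

rng = np.random.default_rng(4)
T, phi, sw, sv = 200, 0.9, 0.5, 1.0
x = np.zeros(T); y = np.zeros(T)
for t in range(T):
    x[t] = phi * (x[t - 1] if t else 0.0) + sw * rng.normal()
    y[t] = x[t] + sv * rng.normal()

# --- Kalman filter (exact filtered means) ---
m, P, kf_mean = 0.0, 1.0, np.zeros(T)
for t in range(T):
    m_pred, P_pred = phi * m, phi**2 * P + sw**2
    K = P_pred / (P_pred + sv**2)
    m = m_pred + K * (y[t] - m_pred)
    P = (1 - K) * P_pred
    kf_mean[t] = m

# --- Bootstrap particle filter ---
N = 2000
particles = np.zeros(N)
pf_mean = np.zeros(T)
for t in range(T):
    particles = phi * particles + sw * rng.normal(size=N)        # propagate
    logw = -0.5 * ((y[t] - particles) / sv) ** 2                 # Gaussian measurement weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    pf_mean[t] = w @ particles
    particles = rng.choice(particles, size=N, p=w)               # multinomial resampling

print("max |KF - PF| filtered-mean discrepancy:",
      np.max(np.abs(kf_mean - pf_mean)).round(3))
```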
By: | Massimo Franchi ("Sapienza" University of Rome); Paolo Paruolo (European Commission, Joint Research Centre) |
Abstract: | This paper discusses the concept of cointegrating space for systems integrated of order higher than 1. It is first observed that the notions of (polynomial) cointegrating vectors and of root functions coincide. Second, the cointegrating space is defined as a subspace of the space of rational vectors. Third, it is shown that canonical sets of root functions can be used to generate a basis of the cointegrating space. Fourth, results on how to reduce bases of rational vector spaces to polynomial bases with minimal order (i.e. minimal bases) are shown to imply the separation of cointegrating vectors that potentially do not involve differences of the process from the ones that require them. Finally, it is argued that minimality of polynomial bases and economic identification of cointegrating vectors can be properly combined. |
Keywords: | VAR, Cointegration, I(d), Vector spaces. |
JEL: | C12 C33 C55 |
Date: | 2019–12 |
URL: | http://d.repec.org/n?u=RePEc:sas:wpaper:20192&r=all |
By: | Martínez-Martin, Jaime; Morris, Richard; Onorante, Luca; Piersanti, Fabio M. |
Abstract: | The post-crisis environment has posed important challenges to standard forecasting models. In this paper, we exploit several combinations of a large-scale DSGE structural model with standard reduced-form methods such as (B)VAR (i.e. DSGE-VAR and Augmented-(B)VAR-DSGE methods) and assess their use for forecasting the Spanish economy. Our empirical findings suggest that: (i) the DSGE model underestimates growth of real variables due to its mean-reverting properties in the context of a sample that is difficult to deal with; (ii) in spite of this, reduced-form VARs benefit from the imposition of an economic prior from the structural model; and (iii) pooling information in the form of variables extracted from the structural model with (B)VAR methods does not give rise to any relevant gain in terms of forecasting accuracy. |
Keywords: | Bayesian VAR, DSGE models, forecast comparison, real time data
JEL: | C54 E37 F3 F41
Date: | 2019–12 |
URL: | http://d.repec.org/n?u=RePEc:ecb:ecbwps:20192335&r=all |
By: | Alexander Jurisch |
Abstract: | We develop a method that relates the truncated cumulant function of the fourth order to the Lévy cumulant function. This gives us explicit formulas for the Lévy parameters, which allow a real-time analysis of the state of a random motion. Cumbersome procedures like maximum-likelihood or least-squares methods are unnecessary. Furthermore, we treat the Lévy system in terms of statistical mechanics and work out its thermodynamic properties. This also includes a discussion of the fractal nature of relativistic corrections. As examples of time-series analysis, we apply our results to the time series of the German DAX and the American S&P 500. |
Date: | 2019–02 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1902.09425&r=all |
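The empirical ingredient of the approach, sample cumulants up to fourth order, can be sketched as follows on simulated placeholder returns; the closed-form mapping from these cumulants to the Lévy parameters is specific to the paper and is not reproduced here.

```python
# First four sample cumulants of a (simulated placeholder) return series.
import numpy as np

rng = np.random.default_rng(5)
returns = rng.standard_t(df=5, size=2500) * 0.01   # placeholder for DAX / S&P 500 log-returns

def cumulants_up_to_four(x):
    """Return the first four cumulants: mean, variance, third and fourth cumulant."""
    mu = x.mean()
    c = x - mu
    k2 = np.mean(c**2)
    k3 = np.mean(c**3)
    k4 = np.mean(c**4) - 3 * k2**2
    return mu, k2, k3, k4

k1, k2, k3, k4 = cumulants_up_to_four(returns)
print(f"k1={k1:.2e}  k2={k2:.2e}  skewness={k3 / k2**1.5:.3f}  excess kurtosis={k4 / k2**2:.3f}")
```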
By: | Ulrich Horst; Wei Xu |
Abstract: | We provide a general probabilistic framework within which we establish scaling limits for a class of continuous-time stochastic volatility models with self-exciting jump dynamics. In the scaling limit, the joint dynamics of asset returns and volatility is driven by independent Gaussian white noises and two independent Poisson random measures that capture the arrival of exogenous shocks and the arrival of self-excited shocks, respectively. Various well-studied stochastic volatility models with and without self-exciting price/volatility co-jumps are obtained as special cases under different scaling regimes. We analyze the impact of external shocks on the market dynamics, especially their impact on jump cascades, and show in a mathematically rigorous manner that many small external shocks may trigger endogenous jump cascades in asset returns and stock price volatility. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.12969&r=all |
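To illustrate the self-exciting jump dynamics referred to above, the sketch below simulates a univariate Hawkes process by Ogata-style thinning, showing how a single exogenous arrival can set off a cascade of endogenous jumps. The parameters and the univariate setting are illustrative and do not reproduce the paper's scaling-limit framework.

```python
# Univariate Hawkes process simulated by thinning (stable since alpha/beta < 1).
import numpy as np

rng = np.random.default_rng(10)
mu, alpha, beta, horizon = 0.1, 0.8, 1.0, 200.0   # baseline rate, excitation size, decay, time span

def intensity(t, events):
    """Conditional jump intensity of the Hawkes process at time t."""
    if not events:
        return mu
    return mu + alpha * np.sum(np.exp(-beta * (t - np.asarray(events))))

events, t = [], 0.0
while t < horizon:
    lam_bar = intensity(t, events) + alpha        # conservative upper bound on the intensity
    t += rng.exponential(1.0 / lam_bar)           # propose the next arrival time
    if t < horizon and rng.uniform() < intensity(t, events) / lam_bar:
        events.append(t)                          # accepted jump (exogenous or self-excited)

print(f"{len(events)} jumps over [0, {horizon:.0f}]; "
      f"expected without self-excitation: {mu * horizon:.0f}")
```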
By: | Duván Humberto Cataño (University of Antioquia); Carlos Vladimir Rodríguez-Caballero (ITAM and CREATES); Daniel Peña (Universidad Carlos III de Madrid) |
Abstract: | We introduce a non-stationary high-dimensional factor model with time-varying loadings. We propose a two-stage estimation procedure. First, we estimate the common factors by principal components. In the second stage, treating the factor estimates as observed, the time-varying loadings are estimated by an iterative generalized least squares procedure using wavelet functions. We investigate the finite-sample features of the proposed methodology through Monte Carlo simulations. Finally, we use this methodology to study the electricity prices and loads of the Nord Pool power market. |
Keywords: | Factor models, wavelet functions, generalized least squares, electricity prices and loads |
JEL: | C13 C32 Q43 |
Date: | 2019–12–09 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2019-23&r=all |
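The first estimation stage described above can be sketched as a plain principal-components extraction on a simulated panel with smoothly time-varying loadings; the second stage (wavelet-based GLS estimation of the loadings) is specific to the paper and omitted, and the data-generating choices below are assumptions for illustration only.

```python
# First-stage factor extraction by principal components on a simulated panel
# with time-varying loadings.
import numpy as np

rng = np.random.default_rng(6)
T, n, r = 300, 40, 2
factors = rng.standard_normal((T, r)).cumsum(axis=0)            # non-stationary factors
time_grid = np.linspace(0, 1, T)
loadings0 = rng.standard_normal((n, r))
X = np.zeros((T, n))
for i in range(n):
    lam_t = loadings0[i] * (1 + 0.5 * np.sin(2 * np.pi * time_grid))[:, None]  # smooth time variation
    X[:, i] = (factors * lam_t).sum(axis=1) + rng.standard_normal(T)

# Principal components of the standardized panel
Xs = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(Xs.T @ Xs / T)
F_hat = Xs @ eigvec[:, -r:]                                      # estimated factor space

# Check how well the estimated factor space spans the true factors (per-factor R^2)
beta, *_ = np.linalg.lstsq(F_hat, factors, rcond=None)
resid = factors - F_hat @ beta
print("R^2 of true factors on estimated factors:",
      (1 - resid.var(axis=0) / factors.var(axis=0)).round(3))
```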
By: | Peiwan Wang; Lu Zong; Ye Ma |
Abstract: | This study constructs an integrated early warning system (EWS) that identifies and predicts stock market turbulence. Based on switching ARCH (SWARCH) filtering probabilities of the high-volatility regime, the proposed EWS first classifies stock market crises according to an indicator function with thresholds dynamically selected by the two-peak method. A hybrid algorithm is then developed in the framework of a long short-term memory (LSTM) network to make daily predictions that give early warning of turmoil. In an empirical evaluation based on ten years of Chinese stock data, the proposed EWS yields satisfactory results, with a test-set accuracy of 96.6% and an average forewarning period of 2.4 days. The model's stability and practical value in real-time decision-making are also demonstrated by cross-validation and back-testing. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.12596&r=all |
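The crisis-labelling step can be sketched as follows: given (here simulated) regime probabilities, place the threshold at the valley between the two modes of their histogram, which is one simple reading of the "two-peak method", and flag days above it. The simulated probabilities, the bin count and the omission of the SWARCH and LSTM stages are all simplifications.

```python
# Threshold selection at the histogram valley between two probability modes,
# then crisis-day flagging.
import numpy as np

rng = np.random.default_rng(9)
# Placeholder filtered probabilities: mostly calm days near 0, turbulent spells near 1.
probs = np.concatenate([rng.beta(1, 8, 1800), rng.beta(8, 1, 200)])
rng.shuffle(probs)

counts, edges = np.histogram(probs, bins=20)
interior = slice(1, -1)                       # search for the valley away from the two boundary modes
valley_bin = 1 + np.argmin(counts[interior])
threshold = 0.5 * (edges[valley_bin] + edges[valley_bin + 1])

crisis = probs > threshold
print(f"threshold={threshold:.2f}, crisis days flagged: {crisis.sum()} of {len(probs)}")
```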
By: | Dimos Kambouroudis (Department of Accounting and Finance, University of Stirling); David McMillan (Department of Accounting and Finance, University of Stirling); Katerina Tsakou (School of Management, Swansea University) |
Abstract: | We examine the role of implied volatility, the leverage effect, overnight returns and the volatility of realized volatility in forecasting realized volatility by extending the heterogeneous autoregressive (HAR) model to include these additional variables. We find that implied volatility is important in forecasting future realized volatility. In most cases a model that accounts for implied volatility provides a significantly better forecast than more sophisticated models that account for other features of volatility but exclude the information backed out from option prices. This result is consistent over time. We also assess whether the leverage effect, overnight returns and the volatility of realized volatility carry any incremental information beyond that captured by implied volatility and past realized volatility. We find that while overnight returns and the leverage effect are important for some markets, the volatility of realized volatility is of limited value for most stock markets. |
Keywords: | HAR model, realized volatility, implied volatility, implied volatility effects, leverage effect, overnight returns, GARCH |
Date: | 2019–12–12 |
URL: | http://d.repec.org/n?u=RePEc:swn:wpaper:2019-03&r=all |
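The extended HAR regression discussed above can be sketched as a plain OLS of next-day realized volatility on its daily, weekly and monthly averages plus an implied-volatility regressor. The series below are simulated placeholders, and the overnight-return, leverage and volatility-of-volatility terms of the full model are omitted.

```python
# HAR-RV regression extended with an implied-volatility term, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
T = 1500
log_rv = np.zeros(T)
for t in range(1, T):                                  # persistent volatility proxy
    log_rv[t] = 0.97 * log_rv[t - 1] + 0.2 * rng.normal()
rv = pd.Series(np.exp(log_rv), name="rv")
iv = rv.rolling(22).mean().shift(1) * np.exp(0.1 * rng.normal(size=T))  # noisy implied-vol placeholder

df = pd.DataFrame({
    "rv_next": rv.shift(-1),                           # next-day realized volatility
    "rv_d": rv,                                        # daily term
    "rv_w": rv.rolling(5).mean(),                      # weekly (5-day) average
    "rv_m": rv.rolling(22).mean(),                     # monthly (22-day) average
    "iv": iv,                                          # implied volatility
}).dropna()

model = sm.OLS(df["rv_next"], sm.add_constant(df[["rv_d", "rv_w", "rv_m", "iv"]])).fit()
print(model.params.round(3))
```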