on Econometric Time Series |
By: | Giuseppe Cavaliere (University of Bologna); Morten Ørregaard Nielsen (Queen's University and CREATES); A.M. Robert Taylor (University of Essex) |
Abstract: | We consider estimation and inference in fractionally integrated time series models driven by shocks which can display conditional and unconditional heteroskedasticity of unknown form. Although the standard conditional sum-of-squares (CSS) estimator remains consistent and asymptotically normal in such cases, unconditional heteroskedasticity inflates its variance matrix by a scalar quantity, lambda > 1, thereby inducing a loss in efficiency relative to the unconditionally homoskedastic case, lambda = 1. We propose an adaptive version of the CSS estimator, based on non-parametric kernel-based estimation of the unconditional variance process. This eliminates the factor lambda from the variance matrix, thereby delivering the same asymptotic efficiency as that attained by the standard CSS estimator in the unconditionally homoskedastic case and, hence, asymptotic efficiency under Gaussianity. The asymptotic variance matrices of both the standard and adaptive CSS estimators depend on any conditional heteroskedasticity and/or weak parametric autocorrelation present in the shocks. Consequently, asymptotically pivotal inference can be achieved by constructing confidence regions or hypothesis tests using heteroskedasticity-robust standard errors and/or a wild bootstrap. Monte Carlo simulations and empirical applications illustrate the practical usefulness of the proposed methods. |
Keywords: | adaptive estimation, conditional sum-of-squares, fractional integration, heteroskedasticity, quasi-maximum likelihood estimation, wild bootstrap |
Date: | 2017–10 |
URL: | http://d.repec.org/n?u=RePEc:qed:wpaper:1390&r=ets |
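The core idea of the adaptive step above can be sketched in a few lines: estimate the unconditional variance path non-parametrically by kernel-smoothing squared residuals over rescaled time, then rescale the shocks before forming the CSS objective. This is a minimal illustration, not the authors' exact estimator; the Gaussian kernel, the bandwidth, and the toy variance path are all assumptions of this sketch.

```python
import numpy as np

def kernel_variance(residuals, bandwidth=0.1):
    """Kernel estimate of a smoothly time-varying unconditional variance.

    Illustrative sketch: smooth squared residuals over rescaled time
    u_t = t/T with a Gaussian kernel (the paper's estimator may differ).
    """
    T = len(residuals)
    u = np.arange(1, T + 1) / T
    e2 = residuals ** 2
    var_hat = np.empty(T)
    for t in range(T):
        w = np.exp(-0.5 * ((u - u[t]) / bandwidth) ** 2)
        var_hat[t] = np.sum(w * e2) / np.sum(w)
    return var_hat

rng = np.random.default_rng(0)
T = 2000
sigma = 1.0 + 1.5 * (np.arange(T) / T)   # hypothetical smooth variance shift
eps = sigma * rng.standard_normal(T)
v = kernel_variance(eps, bandwidth=0.1)
rescaled = eps / np.sqrt(v)              # approximately homoskedastic shocks
```

Feeding the rescaled shocks into the CSS objective is what removes the efficiency-loss factor lambda in the paper's asymptotics.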
By: | Chambers, MJ; McCrorie, JR; Thornton, MA |
Abstract: | This chapter provides a survey of methods of continuous time modelling based on an exact discrete time representation. It begins by highlighting the techniques involved with the derivation of an exact discrete time representation of an underlying continuous time model, providing specific details for a second-order linear system of stochastic differential equations. Issues of parameter identification, Granger causality, nonstationarity, and mixed frequency data are addressed, all being important considerations in applications in economics and other disciplines. Although the focus is on Gaussian estimation of the exact discrete time model, alternative time domain (state space) and frequency domain approaches are also discussed. Computational issues are explored and two new empirical applications are included along with a discussion of applications in the field of macroeconometric modelling. |
Keywords: | Continuous time; exact discrete time representation; stochastic differential equation; Gaussian estimation; identification; Granger causality; nonstationarity; mixed frequency data; computation; macroeconometric modelling. |
Date: | 2017–10 |
URL: | http://d.repec.org/n?u=RePEc:esx:essedp:20497&r=ets |
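The simplest instance of an exact discrete time representation is the scalar first-order case, which conveys the idea behind the survey's higher-order results: an Ornstein-Uhlenbeck process dX = a X dt + sigma dW (a < 0) observed at interval h satisfies an exact AR(1) with phi = exp(a h) and innovation variance sigma^2 (exp(2 a h) - 1) / (2 a). The function and parameter values below are a sketch for this scalar case only, not the chapter's second-order system.

```python
import numpy as np

def exact_ar1(a, sigma, h):
    """Exact discrete-time AR(1) parameters for the scalar OU process
    dX = a X dt + sigma dW sampled at interval h (requires a < 0)."""
    phi = np.exp(a * h)
    var_eta = sigma**2 * (np.exp(2.0 * a * h) - 1.0) / (2.0 * a)
    return phi, var_eta

phi, var_eta = exact_ar1(a=-0.5, sigma=1.0, h=1.0)

# simulate the exact discrete model; its stationary variance equals
# the continuous-time stationary variance sigma^2 / (-2 a) = 1 here
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = phi * x[t - 1] + np.sqrt(var_eta) * rng.standard_normal()
```

Because the mapping holds exactly at the observation times, Gaussian estimation of (phi, var_eta) on the discrete data recovers the continuous-time parameters without discretisation bias, which is the point of the exact-representation approach.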
By: | Kasun Bandara; Christoph Bergmeir; Slawek Smyl |
Abstract: | With the advent of Big Data, databases containing large quantities of similar time series are nowadays available in many applications. Forecasting time series in these domains with traditional univariate forecasting procedures leaves great potential for producing accurate forecasts untapped. Recurrent neural networks, and in particular Long Short-Term Memory (LSTM) networks, have recently proven able to outperform state-of-the-art univariate time series forecasting methods in this context, when trained across all available time series. However, if the time series database is heterogeneous, accuracy may degrade, so that, on the way towards fully automatic forecasting methods in this space, a notion of similarity between the time series needs to be built into the methods. To this end, we present a prediction model using LSTMs on subgroups of similar time series, which are identified by time series clustering techniques. The proposed methodology consistently outperforms the baseline LSTM model and achieves competitive results on benchmarking datasets, in particular outperforming all other methods on the CIF2016 dataset. |
Date: | 2017–10 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1710.03222&r=ets |
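The grouping step that precedes the per-cluster LSTM training can be sketched as follows: summarise each series by a small feature vector and cluster the features. The features (standard deviation and lag-1 autocorrelation) and the two-cluster k-means below are assumptions of this sketch; the paper's actual clustering features and algorithm are not reproduced here, and the LSTM training step is only indicated in a comment.

```python
import numpy as np

def features(series):
    """Per-series summary features: (std, lag-1 autocorrelation).
    Illustrative choice only."""
    s = np.asarray(series, dtype=float)
    s0 = s - s.mean()
    ac1 = (s0[1:] @ s0[:-1]) / (s0 @ s0)
    return np.array([s.std(), ac1])

def two_means(X, iters=50):
    """Minimal two-cluster k-means with deterministic farthest-point init."""
    centers = np.vstack([X[0], X[np.argmax(((X - X[0]) ** 2).sum(axis=1))]])
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for j in range(2):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# toy "database": 10 trending (random walk) and 10 noisy stationary series
rng = np.random.default_rng(2)
db = [np.cumsum(rng.standard_normal(200)) for _ in range(10)] + \
     [rng.standard_normal(200) for _ in range(10)]
X = np.vstack([features(s) for s in db])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardise features
labels = two_means(X)
# next step (not shown): train one LSTM forecaster per cluster,
# across all member series of that cluster
```

Training one model per subgroup is what lets the approach cope with a heterogeneous database while still pooling information across similar series.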
By: | Pedersen, Rasmus Søndergaard |
Abstract: | We consider robust inference for an autoregressive parameter in a stationary autoregressive model with GARCH innovations when estimation is based on least squares. As the innovations exhibit GARCH effects, they are by construction heavy-tailed with some tail index $\kappa$. The rate of consistency as well as the limiting distribution of the least squares estimator depend on $\kappa$. In the spirit of Ibragimov and Müller (“t-statistic based correlation and heterogeneity robust inference”, Journal of Business & Economic Statistics, 2010, vol. 28, pp. 453-468), we consider testing a hypothesis about a parameter based on a Student’s t-statistic computed over a fixed number of subsamples of the original sample. The merit of this approach is that no knowledge of the value of $\kappa$, nor of the rate of consistency and the limiting distribution of the least squares estimator, is required. We verify that the one-sided t-test is asymptotically a level $\alpha$ test whenever $\alpha \le $ 5%, uniformly over $\kappa \ge 2$, which includes cases where the innovations have infinite variance. A simulation experiment suggests that the finite-sample properties of the test are quite good. |
Keywords: | t-test, AR-GARCH, regular variation, least squares estimation |
JEL: | C12 C22 C46 C51 |
Date: | 2017–10–04 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:81979&r=ets |
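The subsampling idea above is mechanically simple and can be sketched directly: split the sample into q contiguous blocks, estimate the autoregressive parameter by least squares on each block, and apply an ordinary one-sample Student t-test to the q estimates. The zero-mean AR(1) with i.i.d. innovations, the no-intercept OLS, and q = 8 below are simplifying assumptions of this sketch, not the paper's exact setup (which features GARCH innovations).

```python
import numpy as np

def ols_ar1(x):
    """OLS slope in x_t = rho x_{t-1} + e_t (no intercept; illustrative)."""
    y, z = x[1:], x[:-1]
    return (z @ y) / (z @ z)

def subsample_t_test(x, rho0, q=8):
    """Ibragimov-Mueller style statistic: estimate rho on q equal blocks,
    then form a one-sample t-statistic from the q estimates."""
    blocks = np.array_split(x, q)
    est = np.array([ols_ar1(b) for b in blocks])
    t_stat = np.sqrt(q) * (est.mean() - rho0) / est.std(ddof=1)
    return t_stat, est

# simulate a stationary AR(1) and test the true null rho = 0.5
rng = np.random.default_rng(3)
T, rho = 4000, 0.5
x = np.zeros(T)
for t in range(1, T):
    x[t] = rho * x[t - 1] + rng.standard_normal()
t_stat, est = subsample_t_test(x, rho0=0.5, q=8)
# compare |t_stat| with a Student t(q-1) critical value,
# e.g. about 2.365 at the 5% level for q = 8
```

The appeal, as the abstract notes, is that the t(q-1) critical value is valid without knowing the tail index, the rate of consistency, or the limiting distribution of the full-sample estimator.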