nep-ets New Economics Papers
on Econometric Time Series
Issue of 2015‒05‒30
thirteen papers chosen by
Yong Yin
SUNY at Buffalo

  1. Inference and Testing Breaks in Large Dynamic Panels with Strong Cross Sectional Dependence By Javier Hidalgo; Marcia M Schafgans
  2. Nonparametric likelihood for volatility under high frequency data By Lorenzo Camponovo; Yukitoshi Matsushita; Taisuke Otsu
  3. Regularization for Spatial Panel Time Series Using the Adaptive LASSO By Clifford Lam; Pedro Souza
  4. A Cusum Test of Common Trends in Large Heterogeneous Panels By Javier Hidalgo; Jungyoon Lee
  5. Small-time asymptotics for Gaussian self-similar stochastic volatility models By Archil Gulisashvili; Frederi Viens; Xin Zhang
  6. Bayesian Linear Regression with Conditional Heteroskedasticity By Yanyun Zhao
  7. Volatility forecasting using global stochastic financial trends extracted from non-synchronous data By Grigoryeva, Lyudmila; Ortega, Juan-Pablo; Peresetsky, Anatoly
  8. Spectral Approach to Parameter-Free Unit Root Testing By Natalia Bailey; Liudas Giraitis
  9. Sigma point filters for dynamic nonlinear regime switching models By Andrew Binning; Junior Maih
  10. "Volatility and Quantile Forecasts by Realized Stochastic Volatility Models with Generalized Hyperbolic Distribution" By Makoto Takahashi; Toshiaki Watanabe; Yasuhiro Omori
  11. GARCH Models, Tail Indexes and Error Distributions: An Empirical Investigation By Roman Horváth; Boril Sopov
  12. Autocorrelation in an unobservable global trend: Does it help to forecast market returns? By Peresetsky, Anatoly; Yakubov, Ruslan
  13. Structural GARCH: The Volatility-Leverage Connection By Robert Engle; Emil Siriwardane

  1. By: Javier Hidalgo; Marcia M Schafgans
    Abstract: This paper is concerned with various issues related to inference in large dynamic panel data models (where both n and T increase without bound) in the presence of possibly strong cross-sectional dependence. Our first aim is to provide a Central Limit Theorem for estimators of the slope parameters of the model under mild conditions. To that end, we extend and modify existing results available in the literature. Our second aim is to study two similar tests for breaks/homogeneity in the time dimension. The first test is based on the CUSUM principle, whereas the second is based on a Hausman-Durbin-Wu approach. A key feature of the tests is that they have nontrivial power even when the number of individuals for which the slope parameters may differ is a "negligible" fraction, or when the break happens to be towards the end of the sample. Because the asymptotic distribution of the tests may not provide a good approximation to their finite sample distribution, we describe a simple bootstrap algorithm to obtain asymptotically valid critical values for our statistics. An important and surprising feature of the bootstrap is that there is no need to know the underlying model of the cross-sectional dependence; hence the bootstrap does not require the selection of any bandwidth parameter for its implementation, as is the case with moving block bootstrap methods, which may not be valid under cross-sectional dependence and may depend on the particular ordering of the individuals. Finally, we present a Monte-Carlo simulation analysis to shed some light on the small sample behaviour of the tests and their bootstrap analogues.
    Keywords: Large panel data, dynamic models, cross-sectional strong-dependence, central limit theorems, homogeneity, bootstrap algorithms
    JEL: C12 C13 C23
    Date: 2015–04
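As a rough illustration of the CUSUM principle the first test builds on, the sketch below computes a textbook single-series partial-sum break statistic. It is not the authors' panel statistic; the data, function name and standardization are illustrative assumptions only.

```python
import numpy as np

def cusum_stat(x):
    """Maximum absolute standardized partial sum of demeaned data.

    A large value suggests a break in the mean. Textbook single-series
    CUSUM, not the panel statistic of the paper.
    """
    x = np.asarray(x, dtype=float)
    T = x.size
    partial_sums = np.cumsum(x - x.mean())
    sigma = x.std(ddof=1)
    return np.max(np.abs(partial_sums)) / (sigma * np.sqrt(T))

rng = np.random.default_rng(0)
stable = rng.normal(0.0, 1.0, 200)                    # no break
broken = np.concatenate([rng.normal(0.0, 1.0, 100),
                         rng.normal(2.0, 1.0, 100)])  # mean shifts at t = 100
```

On the series with a mid-sample mean shift, the statistic is far larger than on the stable series, which is the behaviour a CUSUM test rejects on.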
  2. By: Lorenzo Camponovo; Yukitoshi Matsushita; Taisuke Otsu
    Abstract: We propose a nonparametric likelihood inference method for the integrated volatility under high-frequency financial data. The nonparametric likelihood statistic, which contains conventional statistics such as empirical likelihood and Pearson's chi-square as special cases, is not asymptotically pivotal under the so-called infill asymptotics, where the number of high-frequency observations in a fixed time interval increases to infinity. We show that multiplying by a correction term recovers the chi-square limiting distribution. Furthermore, we establish the Bartlett correction for our modified nonparametric likelihood statistic under both the constant and general non-constant volatility cases. In contrast to the existing literature, the empirical likelihood statistic is not Bartlett correctable under the infill asymptotics. However, by choosing adequate tuning constants for the power divergence family, we show that the second-order refinement to the order n^2 can be achieved.
    Keywords: Nonparametric likelihood, Volatility, High frequency data
    JEL: C14
    Date: 2015–01
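For readers unfamiliar with the object being estimated here: the simplest estimator of integrated volatility under infill asymptotics is the realized variance, the sum of squared high-frequency log returns. A minimal sketch on toy data (not the paper's likelihood machinery):

```python
import numpy as np

def realized_variance(prices):
    """Sum of squared log returns over a fixed interval; converges to
    the integrated variance as the sampling frequency grows."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))
    return np.sum(r ** 2)

# toy high-frequency sample: geometric random walk, per-tick volatility 0.01
rng = np.random.default_rng(1)
n = 10_000
prices = np.exp(np.cumsum(rng.normal(0.0, 0.01, n)))
rv = realized_variance(prices)   # close to (n - 1) * 0.01**2
```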
  3. By: Clifford Lam; Pedro Souza
    Abstract: This paper proposes a model for estimating the underlying cross-sectional dependence structure of a large panel of time series. Because of technical difficulties, such a structure is usually assumed known before further analysis. We propose to estimate it by penalizing the elements of the spatial weight matrices using the adaptive LASSO proposed by Zou (2006). Non-asymptotic oracle inequalities and the asymptotic sign consistency of the estimators are proved when the dimension of the time series can be larger than the sample size and both tend to infinity jointly. Asymptotic normality of the LASSO/adaptive LASSO estimator for the model regression parameter is also presented. All the proofs involve non-standard analysis of LASSO/adaptive LASSO estimators, since our model, although it resembles a standard regression, always has the response vector as one of the covariates. A block coordinate descent algorithm is introduced, with simulations and a real data analysis carried out to demonstrate the performance of our estimators.
    Keywords: spatial econometrics, adaptive LASSO, sign consistency, asymptotic normality, non-asymptotic oracle inequalities, spatial weight matrices
    JEL: C33 C4 C52
    Date: 2014–11
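The core idea of Zou's (2006) adaptive LASSO is to reweight the L1 penalty by inverse first-stage OLS estimates, so that truly zero coefficients are shrunk exactly to zero. The generic coordinate-descent sketch below illustrates this on an ordinary regression; it is not the paper's spatial-weight-matrix estimator, and all names and tuning values are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso(X, y, lam, gamma=1.0, n_iter=200):
    """Adaptive LASSO via cyclic coordinate descent.

    Penalty weights are 1/|beta_OLS|**gamma as in Zou (2006); columns
    of X are assumed standardized.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    p = X.shape[1]
    beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)
    w = 1.0 / (np.abs(beta_ols) ** gamma + 1e-12)   # adaptive weights
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ beta + X[:, j] * beta[j]  # partial residual
            beta[j] = soft_threshold(X[:, j] @ r_j, lam * w[j]) / col_sq[j]
    return beta

# toy check: sparse truth [2, 0, 0, 1, 0] is recovered with exact zeros
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
X = (X - X.mean(axis=0)) / X.std(axis=0)
beta_true = np.array([2.0, 0.0, 0.0, 1.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=200)
beta_hat = adaptive_lasso(X, y, lam=1.0)
```

The sign consistency proved in the paper corresponds to the zero entries of beta_hat being exactly zero here, not merely small.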
  4. By: Javier Hidalgo; Jungyoon Lee
    Abstract: This paper examines a nonparametric CUSUM-type test for common trends in large panel data sets with individual fixed effects. We consider, as in Zhang, Su and Phillips (2012), a partial linear regression model with unknown functional form for the trend component, although our test does not involve local smoothing. This conveniently forgoes the need to choose a bandwidth parameter, which, given the lack of a clear and sensible information criterion, is difficult to select for testing purposes. We are able to do so by exploiting the fact that the number of individuals increases without bound. After removing the parametric component of the model, when the errors are homoscedastic, our test statistic converges to a Gaussian process whose critical values are easily tabulated. We also examine the consequences of heteroscedasticity and discuss the problem of how to compute valid critical values, given the very complicated covariance structure of the limiting process. Finally, we present a small Monte-Carlo experiment to shed some light on the finite sample performance of the test.
    Keywords: Common Trends, large data set, Partial linear models, Bootstrap algorithms
    JEL: C12 C13 C23
    Date: 2014–08
  5. By: Archil Gulisashvili; Frederi Viens; Xin Zhang
    Abstract: We consider the class of self-similar Gaussian stochastic volatility models, and compute the small-time (near-maturity) asymptotics for the corresponding asset price density, the call and put pricing functions, and the implied volatilities. Unlike the well-known model-free behavior for extreme-strike asymptotics, small-time behaviors of the above depend heavily on the model, and require a control of the asset price density which is uniform with respect to the asset price variable, in order to translate into results for call prices and implied volatilities. Away from the money, we express the asymptotics explicitly using the volatility process' self-similarity parameter H, its first Karhunen-Loève eigenvalue at time 1, and the latter's multiplicity. Several model-free estimators for H result. At the money, a separate study is required: the asymptotics for small time depend instead on the integrated variance's moments of orders 1/2 and 3/2, and the estimator for H sees an affine adjustment, while remaining model-free.
    Date: 2015–05
  6. By: Yanyun Zhao
    Abstract: In this paper we consider adaptive Bayesian semiparametric analysis of the linear regression model in the presence of conditional heteroskedasticity. The conditional distribution of the error term given the predictors is modelled by a normal distribution with covariate-dependent variance. We show that the procedure is rate-adaptive over all smoothness levels of this standard deviation function if the prior is properly chosen. More specifically, we derive an adaptive posterior contraction rate, up to a logarithmic factor, for the conditional standard deviation based on a transformation of a hierarchical Gaussian spline prior and a log-spline prior, respectively.
    Keywords: Bayesian linear regression, Conditional heteroskedasticity, Rate of convergence, Posterior distribution, Adaptation, Hierarchical Gaussian spline prior, Log-spline prior
    Date: 2015–04
  7. By: Grigoryeva, Lyudmila; Ortega, Juan-Pablo; Peresetsky, Anatoly
    Abstract: This paper introduces a method, based on various linear and nonlinear state space models, that uses non-synchronous data to extract global stochastic financial trends (GST). These models are specifically constructed to take advantage of the intraday arrival of closing information coming from different international markets in order to improve the quality of volatility description and forecasting performance. A set of three major asynchronous international stock market indices is used to show empirically that this forecasting scheme achieves significant performance improvements compared with standard models such as the dynamic conditional correlation (DCC) family.
    Keywords: multivariate volatility modeling and forecasting, global stochastic trend, extended Kalman filter, CAPM, dynamic conditional correlations (DCC), non-synchronous data
    JEL: C32 C5
    Date: 2015
  8. By: Natalia Bailey (Queen Mary University of London); Liudas Giraitis (Queen Mary University of London)
    Abstract: A relatively simple frequency-type testing procedure for a unit root potentially contaminated by additive stationary noise is introduced, which encompasses general settings and allows for linear trends. The proposed test of a unit root versus stationarity is based on a finite number of periodograms computed at low Fourier frequencies. It is not sensitive to the selection of tuning parameters defining the range of frequencies, so long as they are in the vicinity of zero. The test does not require augmentation, has a parameter-free non-standard asymptotic distribution and is correctly sized. The consistency rate under the alternative of stationarity reveals the relation between the power of the test and the long-run variance of the process. The finite sample performance of the test is explored in a Monte Carlo simulation study, and its empirical application suggests rejection of the unit root hypothesis for some of the Nelson-Plosser time series.
    Keywords: Unit root test, Additive noise, Parameter-free distribution
    JEL: C21 C23
    Date: 2015–05
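The building block of such a test, periodogram ordinates at the lowest Fourier frequencies, is easy to compute; the sketch below shows the generic quantity (not the authors' test statistic or its critical values), with all names and the choice m=5 as illustrative assumptions.

```python
import numpy as np

def low_frequency_periodogram(x, m=5):
    """Periodogram ordinates I(w_k) = |DFT_k|^2 / (2*pi*n) at the m
    lowest nonzero Fourier frequencies w_k = 2*pi*k/n.

    Under a unit root these ordinates diverge with n; under
    stationarity they stay bounded, which is the contrast a
    frequency-domain unit root test exploits.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    dft = np.fft.fft(x)
    return np.abs(dft[1:m + 1]) ** 2 / (2 * np.pi * n)

rng = np.random.default_rng(3)
noise = rng.normal(size=2000)   # stationary
walk = np.cumsum(noise)         # unit root
I_noise = low_frequency_periodogram(noise)
I_walk = low_frequency_periodogram(walk)
# the unit-root series produces far larger low-frequency ordinates
```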
  9. By: Andrew Binning (Norges Bank (Central Bank of Norway)); Junior Maih (Norges Bank (Central Bank of Norway) and BI Norwegian Business School)
    Abstract: In this paper we take three well-known Sigma Point Filters, namely the Unscented Kalman Filter, the Divided Difference Filter, and the Cubature Kalman Filter, and extend them to allow for a very general class of dynamic nonlinear regime switching models. Using both a Monte Carlo study and real data, we investigate the properties of our proposed filters by using a regime switching DSGE model solved using nonlinear methods. We find that the proposed filters perform well. They are both fast and reasonably accurate, and as a result they will provide practitioners with a convenient alternative to Sequential Monte Carlo methods. We also investigate the concept of observability and its implications in the context of the nonlinear filters developed, and propose some heuristics. Finally, we provide, in the RISE toolbox, the code implementing these three novel filters.
    Keywords: Regime Switching, Higher-order Perturbation, Sigma Point Filters, Nonlinear DSGE estimation, Observability
    Date: 2015–05–18
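The "sigma points" these filters share are deterministic samples that reproduce a distribution's mean and covariance exactly. A generic unscented-transform construction (standard Julier-Uhlmann scaling; parameter names and defaults are textbook conventions, not taken from the paper):

```python
import numpy as np

def sigma_points(mean, cov, alpha=1e-3, beta=2.0, kappa=0.0):
    """Standard unscented-transform sigma points and weights.

    Returns 2n+1 points plus mean weights wm and covariance weights wc.
    """
    mean = np.asarray(mean, dtype=float)
    n = mean.size
    lam = alpha ** 2 * (n + kappa) - n
    sqrt_c = np.linalg.cholesky((n + lam) * np.asarray(cov, dtype=float))
    pts = np.vstack([mean,
                     mean + sqrt_c.T,    # mean plus each column of the root
                     mean - sqrt_c.T])   # mean minus each column
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1 - alpha ** 2 + beta)
    return pts, wm, wc

mean = np.array([1.0, -2.0])
cov = np.array([[2.0, 0.3], [0.3, 1.0]])
pts, wm, wc = sigma_points(mean, cov)
# the weighted points reproduce the input mean and covariance exactly
```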
  10. By: Makoto Takahashi (Graduate School of Economics, Osaka University); Toshiaki Watanabe (Institute of Economic Research, Hitotsubashi University); Yasuhiro Omori (Faculty of Economics, The University of Tokyo)
    Abstract: The predictive performance of the realized stochastic volatility model of Takahashi, Omori, and Watanabe (2009), which incorporates the asymmetric stochastic volatility model with the realized volatility, is investigated. Considering well-known characteristics of financial returns, heavy tails and negative skewness, the model is extended by employing a wider class of distributions, the generalized hyperbolic skew Student's t-distribution, for financial returns. With the Bayesian estimation scheme via the Markov chain Monte Carlo method, the model enables us to estimate the parameters of the return distribution and of the model jointly. It also makes it possible to forecast volatility and return quantiles by sampling from their posterior distributions jointly. The model is applied to quantile forecasts of financial returns, such as value-at-risk and expected shortfall, as well as volatility forecasts, and those forecasts are evaluated by various tests and performance measures. Empirical results for the US and Japanese stock indices, the Dow Jones Industrial Average and the Nikkei 225, show that the extended model improves the volatility and quantile forecasts, especially in some volatile periods.
    Date: 2015–05
  11. By: Roman Horváth (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nábreží 6, 111 01 Prague 1, Czech Republic; Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Pod Vodarenskou Vezi 4, 182 00, Prague, Czech Republic); Boril Sopov (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nábreží 6, 111 01 Prague 1, Czech Republic)
    Abstract: We perform a large simulation study to examine the extent to which various generalized autoregressive conditional heteroskedasticity (GARCH) models capture extreme events in stock market returns. We estimate Hill's tail indexes for individual S&P 500 stock market returns from 1995 to 2014 and compare these to the tail indexes produced by simulating GARCH models. Our results suggest that actual and simulated values differ greatly for GARCH models with normal conditional distributions, which underestimate the tail risk. By contrast, GARCH models with Student's t conditional distributions capture the tail shape more accurately, with GARCH and GJR-GARCH being the top performers.
    Keywords: GARCH, extreme events, S&P 500 study, tail index
    JEL: C15 C58 G17
    Date: 2015–05
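Hill's estimator, the tail-index measure compared in this study, has a compact closed form based on the k largest order statistics. A textbook sketch (the sample, seed and choice of k are illustrative, not the paper's simulation design):

```python
import numpy as np

def hill_index(x, k):
    """Hill estimator of the tail index alpha from the k largest
    observations: alpha_hat = k / sum log(X_(n-i+1) / X_(n-k))."""
    x = np.sort(np.asarray(x, dtype=float))
    return k / np.sum(np.log(x[-k:] / x[-k - 1]))

# toy check on exact Pareto data with tail index 3
rng = np.random.default_rng(4)
sample = rng.pareto(3.0, size=100_000) + 1.0   # classical Pareto, alpha = 3
alpha_hat = hill_index(sample, k=1000)         # close to 3
```

A lower estimated alpha means a heavier tail, which is why normal-innovation GARCH simulations that produce too-high indexes are said to underestimate tail risk.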
  12. By: Peresetsky, Anatoly; Yakubov, Ruslan
    Abstract: In this paper a Kalman-filter type model is used to extract a global stochastic trend from discrete non-synchronous data on daily stock market index returns from different markets. The model allows for autocorrelation in the global stochastic trend, which means that its increments are predictable. This does not necessarily imply predictability of market returns, since the global trend is unobservable. The performance of the model for forecasting market returns is explored for three markets: Japan, the UK and the US.
    Keywords: financial market integration; stock market returns; state space model; Kalman filter; non-synchronous data; market returns forecast
    JEL: C49 C58 F36 G10 G15
    Date: 2015
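Extracting an unobservable trend with a Kalman filter can be illustrated in one dimension with the local-level model; the sketch below is a minimal univariate version (the paper's model is multivariate, handles non-synchronous data and allows autocorrelated trend increments):

```python
import numpy as np

def kalman_local_level(y, q, r, m0=0.0, p0=1e6):
    """Scalar Kalman filter for a local-level (random-walk trend) model:

        trend_t = trend_{t-1} + eta_t,   eta ~ N(0, q)
        y_t     = trend_t + eps_t,       eps ~ N(0, r)

    Returns the filtered trend estimates.
    """
    m, p = m0, p0
    out = []
    for obs in np.asarray(y, dtype=float):
        p = p + q                  # predict
        k = p / (p + r)            # Kalman gain
        m = m + k * (obs - m)      # update with the new observation
        p = (1 - k) * p
        out.append(m)
    return np.array(out)

# with near-noiseless observations the filtered trend tracks the data
y = np.sin(np.linspace(0.0, 3.0, 300))
trend_hat = kalman_local_level(y, q=1.0, r=1e-4)
```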
  13. By: Robert Engle (New York University Stern School of Business); Emil Siriwardane (Office of Financial Research)
    Abstract: We propose a new model of volatility where financial leverage amplifies equity volatility by what we call the "leverage multiplier." The exact specification is motivated by standard structural models of credit; however, our parametrization departs from the classic Merton (1974) model and can accommodate environments where the firm's asset volatility is stochastic, asset returns can jump, and asset shocks are nonnormal. In addition, our specification nests both a standard GARCH and the Merton model, which allows for a statistical test of how leverage interacts with equity volatility. Empirically, the Structural GARCH model outperforms a standard asymmetric GARCH model for approximately 74 percent of the financial firms we analyze. We then apply the Structural GARCH model to two empirical applications: the leverage effect and systemic risk measurement. As a part of our systemic risk analysis, we define a new measure called "precautionary capital" that uses our model to quantify the advantages of regulation aimed at reducing financial firm leverage.
    Keywords: Structural GARCH, Volatility, Leverage
    Date: 2014–10–23
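The asymmetric GARCH benchmark the Structural GARCH model is compared against can be illustrated by the GJR-GARCH(1,1) variance recursion, where negative shocks raise volatility more than positive ones. This is the textbook benchmark, not the paper's Structural GARCH specification; parameter values below are illustrative.

```python
import numpy as np

def gjr_garch_variance(returns, omega, alpha, beta, gamma):
    """Conditional variance recursion of a GJR-GARCH(1,1):

        h_t = omega + (alpha + gamma * 1{r_{t-1} < 0}) * r_{t-1}^2
                    + beta * h_{t-1}

    Negative shocks raise volatility more when gamma > 0, the
    asymmetric ("leverage") effect.
    """
    r = np.asarray(returns, dtype=float)
    h = np.empty(r.size)
    h[0] = np.var(r)   # crude initialization
    for t in range(1, r.size):
        neg = 1.0 if r[t - 1] < 0 else 0.0
        h[t] = omega + (alpha + gamma * neg) * r[t - 1] ** 2 + beta * h[t - 1]
    return h

h = gjr_garch_variance(np.array([1.0, -1.0, 1.0]),
                       omega=0.1, alpha=0.05, beta=0.9, gamma=0.1)
# the negative shock at t = 1 pushes h[2] above h[1]
```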

This nep-ets issue is ©2015 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.