nep-ets New Economics Papers
on Econometric Time Series
Issue of 2015‒05‒16
five papers chosen by
Yong Yin
SUNY at Buffalo

  1. Validity of Edgeworth expansions for realized volatility estimators By Ulrich Hounyo; Bezirgen Veliyev
  2. Effects of polynomial trends on detrending moving average analysis By Ying-Hui Shao; Gao-Feng Gu; Zhi-Qiang Jiang; Wei-Xing Zhou
  3. Model uncertainty and the forecast accuracy of ARMA models: A survey By João Henrique Gonçalves Mazzeu; Esther Ruiz; Helena Veiga
  4. Frontiers in Time Series and Financial Econometrics: An Overview By Shiqing Ling; Michael McAleer; Howell Tong
  5. Interval-valued Time Series Models: Estimation based on Order Statistics. Exploring the Agriculture Marketing Service Data By Gloria Gonzalez-Rivera; Wei Lin

  1. By: Ulrich Hounyo (Oxford-Man Institute, University of Oxford, and Aarhus University and CREATES); Bezirgen Veliyev (Aarhus University and CREATES)
    Abstract: The main contribution of this paper is to establish the formal validity of Edgeworth expansions for realized volatility estimators. First, in the absence of microstructure effects, our results rigorously justify the Edgeworth expansions for realized volatility derived in Gonçalves and Meddahi (2009). Second, we show that the validity of the Edgeworth expansions for realized volatility may not cover the optimal two-point distribution wild bootstrap proposed by Gonçalves and Meddahi (2009). Then, we propose a new optimal nonlattice distribution which ensures the second-order correctness of the bootstrap. Third, in the presence of microstructure noise, based on our Edgeworth expansions, we show that the new optimal choice proposed in the absence of noise remains valid with noisy data for the pre-averaged realized volatility estimator proposed by Podolskij and Vetter (2009). Finally, we show how confidence intervals for integrated volatility can be constructed using these Edgeworth expansions for noisy data. Our Monte Carlo simulations show that the intervals based on the Edgeworth corrections have improved finite-sample properties relative to the conventional intervals based on the normal approximation.
    Keywords: Realized volatility, pre-averaging, bootstrap, Edgeworth expansions, confidence intervals.
    JEL: C15 C22 C58
    Date: 2015–05–03
    URL: http://d.repec.org/n?u=RePEc:aah:create:2015-21&r=ets
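    Illustration: the wild bootstrap for realized volatility resamples each intraday return as r*_i = r_i * eta_i, with eta drawn i.i.d. from an external distribution satisfying E[eta^2] = 1, and reads a confidence interval for integrated volatility off the bootstrap distribution. A minimal Python sketch follows; the two-point choice eta^2 in {0.5, 1.5} is a generic illustrative distribution, not the optimal nonlattice distribution derived in the paper, and the basic percentile interval shown does not include the Edgeworth corrections.

      import numpy as np

      def realized_volatility(r):
          # RV = sum of squared intraday returns
          return np.sum(r ** 2)

      def wild_bootstrap_rv_interval(r, level=0.95, n_boot=999, seed=0):
          # Basic (reverse-percentile) wild-bootstrap interval for
          # integrated volatility.  Bootstrap returns are r*_i = r_i * eta_i
          # with eta^2 in {0.5, 1.5} equiprobable, so E[eta^2] = 1.
          rng = np.random.default_rng(seed)
          rv_hat = realized_volatility(r)
          eta2 = rng.choice([0.5, 1.5], size=(n_boot, len(r)))
          rv_star = (r ** 2 * eta2).sum(axis=1)
          lo, hi = np.quantile(rv_star - rv_hat,
                               [(1 - level) / 2, (1 + level) / 2])
          return rv_hat - hi, rv_hat - lo

      # Toy day: 78 five-minute returns with constant spot volatility
      rng = np.random.default_rng(1)
      r = rng.normal(scale=np.sqrt(0.04 / 78), size=78)
      print(wild_bootstrap_rv_interval(r))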
  2. By: Ying-Hui Shao (ECUST); Gao-Feng Gu (ECUST); Zhi-Qiang Jiang (ECUST); Wei-Xing Zhou (ECUST)
    Abstract: The detrending moving average (DMA) algorithm is one of the best performing methods to quantify the long-term correlations in nonstationary time series. Many long-term correlated time series in real systems contain various trends. We investigate the effects of polynomial trends on the scaling behaviors and the performances of three widely used DMA variants: the backward (BDMA), centered (CDMA) and forward (FDMA) algorithms. We derive a general framework for polynomial trends and obtain analytical results for constant shifts and linear trends. We find that the behavior of the CDMA method is not influenced by constant shifts. In contrast, linear trends cause a crossover in the CDMA fluctuation functions. We also find that constant shifts and linear trends cause crossovers in the fluctuation functions obtained from the BDMA and FDMA methods. When a crossover exists, the scaling behavior at small scales comes from the intrinsic time series while that at large scales is dominated by the constant shifts or linear trends. We also derive analytically the expressions of crossover scales and show that the crossover scale depends on the strength of the polynomial trend, the Hurst index, and in some cases (linear trends for BDMA and FDMA) the length of the time series. In all cases, the BDMA and the FDMA behave almost identically under the influence of constant shifts or linear trends. Extensive numerical experiments are in excellent agreement with the analytical derivations. We conclude that the CDMA method outperforms the BDMA and FDMA methods in the presence of polynomial trends.
    Date: 2015–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1505.02750&r=ets
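    Illustration: the DMA fluctuation function detrends the profile (cumulative sum) of the series with a moving average of window n whose alignment is set by a position parameter theta (theta = 0 backward, 0.5 centered, 1 forward); the Hurst index H is the slope of log F(n) against log n. A minimal numpy sketch, with the window-alignment convention an assumption of this illustration:

      import numpy as np

      def dma_fluctuation(x, n, theta=0.5):
          # F(n) for one window size; theta = 0 (BDMA), 0.5 (CDMA), 1 (FDMA)
          y = np.cumsum(x - x.mean())                     # profile
          ma = np.convolve(y, np.ones(n) / n, mode="valid")
          offset = int(round((n - 1) * (1 - theta)))      # window alignment
          resid = y[offset:offset + ma.size] - ma
          return np.sqrt(np.mean(resid ** 2))

      def hurst_dma(x, sizes, theta=0.5):
          # Estimate H from the scaling F(n) ~ n**H
          F = [dma_fluctuation(x, n, theta) for n in sizes]
          slope, _ = np.polyfit(np.log(sizes), np.log(F), 1)
          return slope

      # Uncorrelated noise should give H close to 0.5; adding a linear
      # trend induces the crossover in F(n) discussed in the abstract.
      rng = np.random.default_rng(0)
      x = rng.normal(size=2 ** 14)
      sizes = np.unique(np.logspace(1, 3, 20).astype(int))
      print(hurst_dma(x, sizes, theta=0.5))
      print(hurst_dma(x + 0.01 * np.arange(x.size), sizes, theta=0.5))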
  3. By: João Henrique Gonçalves Mazzeu; Esther Ruiz; Helena Veiga
    Abstract: The objective of this paper is to survey the literature on the effects of model uncertainty on the forecast accuracy of linear univariate ARMA models. We consider three specific uncertainties: parameter estimation, error distribution and lag order. We also survey the procedures proposed to deal with each of these sources of uncertainty. The results are illustrated with simulated data.
    Keywords: Bayesian forecast, Bootstrap, Model misspecification, Parameter uncertainty
    Date: 2015–05
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws1508&r=ets
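    Illustration: one family of procedures in this literature handles parameter-estimation and error-distribution uncertainty with a residual bootstrap: each replicate rebuilds the series from resampled residuals, re-estimates the parameters, and forecasts with resampled future errors. A minimal sketch for an AR(1) fitted by least squares (an illustrative scheme, not any specific procedure from the survey; lag-order uncertainty would additionally require re-selecting the order inside the loop):

      import numpy as np

      def ar1_fit(y):
          # OLS estimates (intercept, slope) and residuals for an AR(1)
          X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
          beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
          return beta, y[1:] - X @ beta

      def bootstrap_forecast_interval(y, h=1, level=0.95, n_boot=999, seed=0):
          rng = np.random.default_rng(seed)
          (c, phi), resid = ar1_fit(y)
          fcasts = np.empty(n_boot)
          for b in range(n_boot):
              # (i) rebuild a series under the fitted model; parameter
              #     uncertainty enters through re-estimation on yb
              e = rng.choice(resid, size=len(y))
              yb = np.empty(len(y))
              yb[0] = y[0]
              for t in range(1, len(y)):
                  yb[t] = c + phi * yb[t - 1] + e[t]
              (cb, phib), residb = ar1_fit(yb)
              # (ii) forecast h steps ahead with resampled future errors
              f = y[-1]
              for _ in range(h):
                  f = cb + phib * f + rng.choice(residb)
              fcasts[b] = f
          return np.quantile(fcasts, [(1 - level) / 2, (1 + level) / 2])

      # Example on a simulated AR(1) with phi = 0.7
      rng = np.random.default_rng(1)
      y = np.zeros(300)
      for t in range(1, 300):
          y[t] = 0.7 * y[t - 1] + rng.normal()
      print(bootstrap_forecast_interval(y, h=1))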
  4. By: Shiqing Ling (Department of Mathematics, Hong Kong University of Science and Technology, Hong Kong, China); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid; and Institute of Economic Research, Kyoto University); Howell Tong (Emeritus Professor, Department of Statistics, London School of Economics)
    Abstract: Two of the fastest growing frontiers in econometrics and quantitative finance are time series and financial econometrics. Significant theoretical contributions to financial econometrics have been made by experts in statistics, econometrics, mathematics, and time series analysis. The purpose of this special issue of the journal on “Frontiers in Time Series and Financial Econometrics” is to highlight several areas of research by leading academics in which novel methods have contributed significantly to time series and financial econometrics, including forecasting co-volatilities via factor models with asymmetry and long memory in realized covariance, prediction of Lévy-driven CARMA processes, functional index coefficient models with variable selection, LASSO estimation of threshold autoregressive models, high dimensional stochastic regression with latent factors, endogeneity and nonlinearity, sign-based portmanteau test for ARCH-type models with heavy-tailed innovations, toward optimal model averaging in regression models with time series errors, high dimensional dynamic stochastic copula models, a misspecification test for multiplicative error models of non-negative time series processes, sample quantile analysis for long-memory stochastic volatility models, testing for independence between functional time series, statistical inference for panel dynamic simultaneous equations models, specification tests of calibrated option pricing models, asymptotic inference in multiple-threshold double autoregressive models, a new hyperbolic GARCH model, intraday value-at-risk: an asymmetric autoregressive conditional duration approach, refinements in maximum likelihood inference on spatial autocorrelation in panel data, statistical inference of conditional quantiles in nonlinear time series models, quasi-likelihood estimation of a threshold diffusion process, threshold models in time series analysis - some reflections, and generalized ARMA models with martingale difference errors.
    Keywords: Time series, Financial econometrics, Threshold models, Conditional volatility, Stochastic volatility, Copulas, Conditional duration
    JEL: C22 C52 C58 G32
    Date: 2015–02
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1504&r=ets
  5. By: Gloria Gonzalez-Rivera (Department of Economics, University of California Riverside); Wei Lin (Capital University of Economics and Business)
    Abstract: The current regression models for interval-valued data ignore the extreme nature of the lower and upper bounds of intervals. We propose a new estimation approach that considers the bounds of the interval as realizations of the max/min order statistics coming from a sample of n_t random draws from the conditional density of an underlying stochastic process {Y_t}. This approach is important for data sets for which the relevant information is only available in interval format, e.g., low/high prices. We are interested in the characterization of the latent process as well as in the modeling of the bounds themselves. We estimate a dynamic model for the conditional mean and conditional variance of the latent process, which is assumed to be normally distributed, and for the conditional intensity of the discrete process {n_t}, which follows a negative binomial density function. Under these assumptions, together with the densities of order statistics, we obtain maximum likelihood estimates of the parameters of the model, which are needed to estimate the expected value of the bounds of the interval. We implement this approach with a time series of livestock prices, for which only low/high prices are recorded, so that the price process itself is latent. We find that the proposed model provides an excellent fit to the intervals of low/high returns, with an average coverage rate of 83%. We also offer a comparison with current models for interval-valued data.
    Date: 2015–05
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:201505&r=ets
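    Illustration: the building block of the order-statistics likelihood is the joint density of the sample minimum and maximum of n i.i.d. draws, f(l, u) = n (n - 1) f(l) f(u) [F(u) - F(l)]^(n-2) for l < u. A sketch of this term under normality; the dynamic models for the conditional mean and variance and the negative binomial model for n_t are omitted here:

      import numpy as np
      from scipy.stats import norm

      def loglik_minmax(lo, hi, mu, sigma, n):
          # Log joint density of the min and max of n i.i.d. N(mu, sigma^2)
          # draws, evaluated at an observed interval (lo, hi), lo < hi.
          l = (lo - mu) / sigma
          u = (hi - mu) / sigma
          return (np.log(n) + np.log(n - 1)
                  + norm.logpdf(l) + norm.logpdf(u) - 2 * np.log(sigma)
                  + (n - 2) * np.log(norm.cdf(u) - norm.cdf(l)))

      # Contribution of one observed low/high pair under mu=0, sigma=1, n=20
      print(loglik_minmax(-1.8, 2.1, mu=0.0, sigma=1.0, n=20))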

This nep-ets issue is ©2015 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.