nep-ets New Economics Papers
on Econometric Time Series
Issue of 2012‒10‒06
seven papers chosen by
Yong Yin
SUNY at Buffalo

  1. The macroeconomic forecasting performance of autoregressive models with alternative specifications of time-varying volatility By Todd E. Clark; Francesco Ravazzolo
  2. Nonparametric Predictive Regression By Ioannis Kasparis; Elena Andreou; Peter C.B. Phillips
  3. On Confidence Intervals for Autoregressive Roots and Predictive Regression By Peter C.B. Phillips
  4. Efficient estimation of conditional risk measures in a semiparametric GARCH model By Yang Yan; Dajing Shang; Oliver Linton
  5. A flexible semiparametric model for time series By Degui Li; Oliver Linton; Zudi Lu
  6. The Reactive Volatility Model By Sebastien Valeyre; Denis Grebenkov; Sofiane Aboura; Qian Liu
  7. Optimal Predictions of Powers of Conditionally Heteroskedastic Processes By Christian Francq; Jean-Michel Zakoian

  1. By: Todd E. Clark; Francesco Ravazzolo
    Abstract: This paper compares alternative models of time-varying macroeconomic volatility on the basis of the accuracy of point and density forecasts of macroeconomic variables. In this analysis, we consider both Bayesian autoregressive and Bayesian vector autoregressive models that incorporate some form of time-varying volatility, namely stochastic volatility (with both constant and time-varying autoregressive coefficients), stochastic volatility following a stationary AR process, stochastic volatility coupled with fat tails, GARCH, and mixture-of-innovation models. The comparison is based on the accuracy of forecasts of key macroeconomic time series for real-time post-World War II data for both the United States and the United Kingdom. The results show that the AR and VAR specifications with widely used stochastic volatility dominate models with alternative volatility specifications, to some degree in terms of point forecasting and to a greater degree in terms of density forecasting.
    Keywords: Simulation modeling ; Economic forecasting ; Bayesian statistical decision theory
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwp:1218&r=ets
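The point-versus-density comparison in item 1 can be illustrated with a stylized simulation. Below is a minimal numpy-only sketch, not the authors' Bayesian estimation: it uses made-up data and a crude rolling-variance proxy in place of the stochastic-volatility specifications, and simply scores one-step density forecasts from a constant-variance AR(1) against the same model with a time-varying forecast variance.

```python
# Sketch: why time-varying volatility can improve density forecasts.
# Not the paper's Bayesian machinery; just a constant-variance AR(1)
# versus an AR(1) whose forecast variance is a rolling estimate.
import numpy as np

rng = np.random.default_rng(0)
n = 600
# Simulate an AR(1) with stochastic (log-AR) volatility.
log_vol = np.zeros(n)
for t in range(1, n):
    log_vol[t] = 0.95 * log_vol[t - 1] + 0.2 * rng.standard_normal()
sigma = 0.5 * np.exp(log_vol)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + sigma[t] * rng.standard_normal()

def log_score(y, use_rolling_var, window=60, burn=100):
    """Average one-step-ahead log predictive density from a recursively fitted AR(1)."""
    scores = []
    for t in range(burn, n - 1):
        x, z = y[:t], y[1:t + 1]
        phi = np.dot(x, z) / np.dot(x, x)          # OLS AR(1) coefficient
        resid = z - phi * x
        var = np.var(resid[-window:]) if use_rolling_var else np.var(resid)
        mean = phi * y[t]
        scores.append(-0.5 * (np.log(2 * np.pi * var) + (y[t + 1] - mean) ** 2 / var))
    return np.mean(scores)

print("constant variance :", round(log_score(y, False), 3))
print("rolling variance  :", round(log_score(y, True), 3))
```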
  2. By: Ioannis Kasparis (Dept. of Economics, University of Cyprus); Elena Andreou (Dept. of Economics, University of Cyprus); Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors including stationary as well as nonstationary fractional and near unit root processes. In this sense the proposed tests provide a unifying framework for predictive inference, allowing for possibly nonlinear relationships of unknown form, and offering robustness to integration order and functional form. Under the null of no predictability the limit distributions of the tests involve functionals of independent chi^2 variates. The tests are consistent and divergence rates are faster when the predictor is stationary. Asymptotic theory and simulations show that the proposed tests are more powerful than existing parametric predictability tests when deviations from unity are large or the predictive regression is nonlinear. Some empirical illustrations using monthly S&P 500 stock returns data are provided.
    Keywords: Functional regression, Nonparametric predictability test, Nonparametric regression, Stock returns, Predictive regression
    JEL: C22 C32
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1878&r=ets
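As a rough illustration of the kernel-regression building block behind the tests in item 2, the following numpy sketch fits a Nadaraya-Watson regression of returns on a lagged, highly persistent predictor. The data, bandwidth, and nonlinear link are illustrative assumptions; the paper's F-statistics and their limit theory are not reproduced.

```python
# Sketch: Nadaraya-Watson kernel regression of returns on a lagged predictor,
# the nonparametric building block behind the proposed predictability tests.
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Illustrative data: a persistent (near unit root) predictor and returns
# that depend on it through an unknown nonlinear function.
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.98 * x[t - 1] + rng.standard_normal()
ret = 0.05 * np.tanh(x[:-1]) + rng.standard_normal(n - 1)  # returns at t+1

def nw_fit(x0, xs, ys, h):
    """Gaussian-kernel Nadaraya-Watson estimate of E[y | x = x0]."""
    w = np.exp(-0.5 * ((xs - x0) / h) ** 2)
    return np.sum(w * ys) / np.sum(w)

h = 1.06 * np.std(x) * n ** (-1 / 5)          # rule-of-thumb bandwidth
grid = np.linspace(np.percentile(x, 5), np.percentile(x, 95), 9)
for x0 in grid:
    print(f"x = {x0:6.2f}   E[return | x] ~ {nw_fit(x0, x[:-1], ret, h):6.3f}")
```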
  3. By: Peter C.B. Phillips (Cowles Foundation, Yale University)
    Abstract: A prominent use of local to unity limit theory in applied work is the construction of confidence intervals for autoregressive roots through inversion of the ADF t statistic associated with a unit root test, as suggested in Stock (1991). Such confidence intervals are valid when the true model has an autoregressive root that is local to unity (rho = 1 + (c/n)) but are invalid at the limits of the domain of definition of the localizing coefficient c because of a failure in tightness and the escape of probability mass. Consideration of the boundary case shows that these confidence intervals are invalid for stationary autoregression, where they manifest locational bias and width distortion. In particular, the coverage probability of these intervals tends to zero as c approaches -infinity, and the width of the intervals exceeds the width of intervals constructed in the usual way under stationarity. Some implications of these results for predictive regression tests are explored. It is shown that when the regressor has autoregressive coefficient |rho| < 1 and the sample size n approaches infinity, the Campbell and Yogo (2006) confidence intervals for the regression coefficient have zero coverage probability asymptotically, and their predictive test statistic Q erroneously indicates predictability with probability approaching unity when the null of no predictability holds. These results have obvious implications for empirical practice.
    Keywords: Autoregressive root, Confidence belt, Confidence interval, Coverage probability, Local to unity, Localizing coefficient, Predictive regression, Tightness
    JEL: C22
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1879&r=ets
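A small Monte Carlo helps make the local-to-unity setting of item 3 concrete. The sketch below is illustrative only: it does not implement the Stock (1991) interval or the paper's results, but shows how the coverage of the textbook OLS interval for an AR(1) root behaves as the root moves from clearly stationary values toward unity, the boundary region that local-to-unity intervals are designed for; the paper's point is the reverse failure, that local-to-unity intervals break down under stationarity.

```python
# Sketch: Monte Carlo coverage of the textbook OLS confidence interval for an
# AR(1) root. Coverage is close to nominal for a clearly stationary root but
# deteriorates near unity, which is what motivates local-to-unity intervals.
import numpy as np

rng = np.random.default_rng(2)

def coverage(rho, n=200, reps=2000, level=1.96):
    hits = 0
    for _ in range(reps):
        y = np.zeros(n)
        for t in range(1, n):
            y[t] = rho * y[t - 1] + rng.standard_normal()
        x, z = y[:-1], y[1:]
        rho_hat = np.dot(x, z) / np.dot(x, x)      # OLS estimate of the root
        resid = z - rho_hat * x
        se = np.sqrt(np.sum(resid ** 2) / (len(x) - 1) / np.dot(x, x))
        hits += abs(rho - rho_hat) <= level * se
    return hits / reps

for rho in (0.5, 0.9, 0.99):
    print(f"rho = {rho:4.2f}   coverage of nominal 95% interval ~ {coverage(rho):.3f}")
```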
  4. By: Yang Yan; Dajing Shang; Oliver Linton (Institute for Fiscal Studies and Cambridge University)
    Abstract: This paper proposes efficient estimators of risk measures in a semiparametric GARCH model defined through moment constraints. Moment constraints are often used to identify and estimate the mean and variance parameters but are typically discarded when estimating error quantiles. In order to prevent this efficiency loss in quantile estimation, we propose a quantile estimator based on inverting an empirical likelihood weighted distribution estimator. It is found that the new quantile estimator is uniformly more efficient than the simple empirical quantile and a quantile estimator based on normalized residuals. At the same time, the efficiency gain in error quantile estimation hinges on the efficiency of estimators of the variance parameters. We show that the same conclusion applies to the estimation of conditional Expected Shortfall. Our comparison also leads to interesting implications of residual bootstrap for dynamic models. We find that the proposed estimators for conditional Value-at-Risk and Expected Shortfall are asymptotically mixed normal. This asymptotic theory can be used to construct confidence bands for these estimators by taking account of parameter uncertainty. Simulation evidence as well as empirical results are provided.
    Keywords: Empirical Likelihood; Empirical process; GARCH; Quantile; Value-at-Risk; Expected Shortfall.
    JEL: C14 C22 G22
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:25/12&r=ets
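The conditional risk measures in item 4 combine a volatility filter with an error quantile. The following sketch illustrates that combination under assumed GARCH(1,1) parameters; the simple empirical quantile of standardized residuals stands in for the paper's more efficient empirical-likelihood-weighted estimator, which is not reproduced here.

```python
# Sketch: conditional Value-at-Risk from a GARCH-type model as
# VaR_{t+1} = sigma_{t+1} * (error quantile). GARCH parameters are assumed,
# not estimated, to keep the sketch short.
import numpy as np

rng = np.random.default_rng(3)
omega, alpha, beta = 0.05, 0.08, 0.90        # assumed GARCH(1,1) parameters
n = 2000

# Simulate returns from the GARCH(1,1) with unit-variance Student-t errors.
sig2 = np.empty(n)
r = np.empty(n)
sig2[0] = omega / (1 - alpha - beta)
for t in range(n):
    r[t] = np.sqrt(sig2[t]) * rng.standard_t(df=7) / np.sqrt(7 / 5)
    if t + 1 < n:
        sig2[t + 1] = omega + alpha * r[t] ** 2 + beta * sig2[t]

# Standardized residuals and their empirical 5% quantile.
eps = r / np.sqrt(sig2)
q05 = np.quantile(eps, 0.05)

# One-step-ahead volatility, 95% VaR and matching Expected Shortfall (as losses).
sig2_next = omega + alpha * r[-1] ** 2 + beta * sig2[-1]
var_95 = -np.sqrt(sig2_next) * q05
es_95 = -np.sqrt(sig2_next) * eps[eps <= q05].mean()
print(f"one-step 95% VaR ~ {var_95:.3f},  expected shortfall ~ {es_95:.3f}")
```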
  5. By: Degui Li; Oliver Linton (Institute for Fiscal Studies and Cambridge University); Zudi Lu
    Abstract: We consider approximating a multivariate regression function by an affine combination of one-dimensional conditional component regression functions. The weight parameters involved in the approximation are estimated by least squares on the first-stage nonparametric kernel estimates. We establish asymptotic normality for the estimated weights and the regression function in two cases: when the number of covariates is finite and when the number of covariates diverges. As the observations are assumed to be stationary and near epoch dependent, the approach in this paper is applicable to estimation and forecasting issues in time series analysis. Furthermore, the methods and results are augmented by a simulation study and illustrated by an application to the Australian annual mean temperature anomaly series. We also apply our methods to high frequency volatility forecasting, where we obtain superior results to parametric methods.
    Keywords: Asymptotic normality, model averaging, Nadaraya-Watson kernel estimation, near epoch dependence, semiparametric method.
    JEL: C14 C22
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:28/12&r=ets
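The two-stage procedure in item 5, one-dimensional kernel fits followed by least-squares weights on those fits, translates almost directly into code. The numpy sketch below uses illustrative simulated data and rule-of-thumb bandwidths; the paper's asymptotic theory and near-epoch-dependence framework are not reproduced.

```python
# Sketch: approximate a multivariate regression function by an affine combination
# of one-dimensional Nadaraya-Watson component fits, with the combination weights
# estimated by least squares on the first-stage kernel estimates.
import numpy as np

rng = np.random.default_rng(4)
n, d = 800, 3
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + 0.3 * X[:, 2] + 0.3 * rng.standard_normal(n)

def nw_component(xj, y, h):
    """Nadaraya-Watson fit of y on a single covariate, evaluated at the sample points."""
    diff = (xj[:, None] - xj[None, :]) / h
    w = np.exp(-0.5 * diff ** 2)
    return w @ y / w.sum(axis=1)

# First stage: one-dimensional kernel fits, one per covariate.
h = 1.06 * n ** (-1 / 5)
M = np.column_stack([nw_component(X[:, j], y, h * np.std(X[:, j])) for j in range(d)])

# Second stage: least-squares weights (with an intercept) on the component fits.
Z = np.column_stack([np.ones(n), M])
w_hat = np.linalg.lstsq(Z, y, rcond=None)[0]
fitted = Z @ w_hat
print("estimated weights:", np.round(w_hat, 3))
print("in-sample R^2    :", round(1 - np.var(y - fitted) / np.var(y), 3))
```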
  6. By: Sebastien Valeyre; Denis Grebenkov; Sofiane Aboura; Qian Liu
    Abstract: We present a new volatility model, simple to implement, that combines various attractive features such as an exponential moving average of the price and a leverage effect. This model is able to capture the so-called "panic effect", which occurs whenever systematic risk becomes the dominant factor. Consequently, in contrast to other models, this new model is as reactive as the implied volatility indices. We also test the reactivity of our model using extreme events taken from the 470 most liquid European stocks over the last decade. We show that the reactive volatility model is more robust to extreme events, and it allows for the identification of precursors and replicas of extreme events.
    Date: 2012–09
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1209.5190&r=ets
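The abstract of item 6 does not spell out the reactive volatility model, so the following sketch is only a generic exponentially weighted volatility recursion with a simple leverage adjustment, meant to illustrate the kind of reactivity being discussed; the decay and leverage parameters are assumptions, not the authors' specification.

```python
# Sketch: a generic exponentially weighted volatility estimate with a crude
# leverage adjustment (negative returns weighted more heavily). Illustrative
# only; this is not the reactive volatility model itself.
import numpy as np

rng = np.random.default_rng(5)
ret = 0.01 * rng.standard_normal(1000)          # placeholder daily returns
ret[500] = -0.10                                 # an artificial "panic" day

lam, gamma = 0.94, 0.5                           # EWMA decay and leverage weight (assumed)
var = np.empty_like(ret)
var[0] = np.var(ret)
for t in range(1, len(ret)):
    shock = ret[t - 1] ** 2 * (1.0 + gamma * (ret[t - 1] < 0))  # down moves count more
    var[t] = lam * var[t - 1] + (1 - lam) * shock

vol = np.sqrt(252 * var)                         # annualized volatility path
print("annualized vol just before / after the shock:",
      round(vol[499], 3), "/", round(vol[505], 3))
```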
  7. By: Christian Francq (Crest and University Lille 3); Jean-Michel Zakoian (Crest and University Lille 3)
    Abstract: In conditionally heteroskedastic models, the optimal prediction of powers, or logarithms, of the absolute value has a simple expression in terms of the volatility and an expectation involving the independent process. A natural procedure for estimating this prediction is to estimate the volatility in a first step, for instance by Gaussian quasi-maximum likelihood (QML) or by least-absolute deviations, and to use empirical means based on rescaled innovations to estimate the expectation in a second step. This paper proposes an alternative one-step procedure, based on an appropriate non-Gaussian QML estimator, and establishes the asymptotic properties of the two approaches. Asymptotic comparisons and numerical experiments show that the differences in accuracy can be substantial, depending on the prediction problem and the innovations distribution. An application to indexes of major stock exchanges is given.
    Keywords: Efficiency of estimators, GARCH, Least-absolute deviations estimation, Prediction, Quasi maximum likelihood estimation
    Date: 2012–08
    URL: http://d.repec.org/n?u=RePEc:crs:wpaper:2012-17&r=ets
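The two-step predictor described in item 7 has a direct expression: the one-step volatility raised to the power s, times the empirical mean of the absolute rescaled innovations raised to the same power. The sketch below implements that expression with assumed GARCH(1,1) parameters in place of QML estimation; the paper's one-step non-Gaussian QML alternative is not reproduced.

```python
# Sketch: two-step prediction of |X_{t+1}|^s in a GARCH(1,1) model:
# step 1 gives the one-step volatility, step 2 the empirical moment of the
# rescaled innovations. Parameters are assumed rather than estimated by QML.
import numpy as np

rng = np.random.default_rng(6)
omega, alpha, beta = 0.1, 0.1, 0.85              # assumed GARCH(1,1) parameters
n, s = 3000, 1.0                                  # predict E[|X_{t+1}|^1 | past]

# Simulate a GARCH(1,1) series.
sig2 = np.empty(n)
x = np.empty(n)
sig2[0] = omega / (1 - alpha - beta)
for t in range(n):
    x[t] = np.sqrt(sig2[t]) * rng.standard_normal()
    if t + 1 < n:
        sig2[t + 1] = omega + alpha * x[t] ** 2 + beta * sig2[t]

# Step 1: one-step-ahead volatility from the (assumed) filter.
sig2_next = omega + alpha * x[-1] ** 2 + beta * sig2[-1]

# Step 2: empirical mean of |rescaled innovations|^s.
eta = x / np.sqrt(sig2)
moment_s = np.mean(np.abs(eta) ** s)

prediction = sig2_next ** (s / 2) * moment_s
print(f"predicted E[|X_(t+1)|^{s}] ~ {prediction:.4f}")
```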

This nep-ets issue is ©2012 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.