
on Econometric Time Series 
By:  Todd E. Clark; Francesco Ravazzolo 
Abstract:  This paper compares alternative models of time-varying macroeconomic volatility on the basis of the accuracy of point and density forecasts of macroeconomic variables. In this analysis, we consider both Bayesian autoregressive and Bayesian vector autoregressive models that incorporate some form of time-varying volatility, specifically: stochastic volatility (with both constant and time-varying autoregressive coefficients), stochastic volatility following a stationary AR process, stochastic volatility coupled with fat tails, GARCH, and mixture-of-innovation models. The comparison is based on the accuracy of forecasts of key macroeconomic time series for real-time post-World War II data for both the United States and the United Kingdom. The results show that the AR and VAR specifications with widely used stochastic volatility dominate models with alternative volatility specifications, in terms of point forecasting to some degree and density forecasting to a greater degree.
Keywords:  Simulation modeling ; Economic forecasting ; Bayesian statistical decision theory 
Date:  2012 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwp:1218&r=ets 
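The "widely used stochastic volatility" specification the abstract refers to is commonly an AR model whose log-variance evolves as its own latent AR process (a random walk in many macro applications). A minimal simulation sketch, with illustrative parameter values that are not taken from the paper:

```python
import math
import random

def simulate_ar_sv(n, phi=0.5, rho_h=1.0, sigma_eta=0.1, seed=0):
    """Simulate y_t = phi * y_{t-1} + exp(h_t / 2) * eps_t,
    with latent log-variance h_t = rho_h * h_{t-1} + sigma_eta * eta_t.
    rho_h = 1 gives the random-walk log-volatility common in macro work;
    rho_h < 1 gives the stationary-AR stochastic volatility variant."""
    rng = random.Random(seed)
    y, h = [0.0], [0.0]
    for _ in range(1, n):
        h.append(rho_h * h[-1] + sigma_eta * rng.gauss(0, 1))
        y.append(phi * y[-1] + math.exp(h[-1] / 2) * rng.gauss(0, 1))
    return y, h
```

Setting `rho_h` below one switches between the two stochastic volatility variants the paper compares; Bayesian estimation of the latent `h_t` path (not shown) is the hard part in practice.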
By:  Ioannis Kasparis (Dept. of Economics, University of Cyprus); Elena Andreou (Dept. of Economics, University of Cyprus); Peter C.B. Phillips (Cowles Foundation, Yale University) 
Abstract:  A unifying framework for inference is developed in predictive regressions where the predictor has unknown integration properties and may be stationary or nonstationary. Two easily implemented nonparametric F-tests are proposed. The test statistics are related to those of Kasparis and Phillips (2012) and are obtained by kernel regression. The limit distribution of these predictive tests holds for a wide range of predictors, including stationary as well as nonstationary fractional and near-unit-root processes. In this sense the proposed tests provide a unifying framework for predictive inference, allowing for possibly nonlinear relationships of unknown form and offering robustness to integration order and functional form. Under the null of no predictability, the limit distributions of the tests involve functionals of independent chi^2 variates. The tests are consistent, and divergence rates are faster when the predictor is stationary. Asymptotic theory and simulations show that the proposed tests are more powerful than existing parametric predictability tests when deviations from unity are large or the predictive regression is nonlinear. Some empirical illustrations to monthly S&P 500 stock returns data are provided.
Keywords:  Functional regression, Nonparametric predictability test, Nonparametric regression, Stock returns, Predictive regression 
JEL:  C22 C32 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1878&r=ets 
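The building block of the proposed test statistics is kernel regression of the outcome on the predictor. A minimal sketch of the Nadaraya-Watson estimator with a Gaussian kernel (the test statistics themselves involve further functionals not shown here):

```python
import math

def nw_kernel_regression(x_pred, x, y, bandwidth):
    """Nadaraya-Watson estimate of E[y | x = x_pred]:
    a kernel-weighted average of the y_i, with weights
    K((x_i - x_pred) / h) from a Gaussian kernel."""
    weights = [math.exp(-0.5 * ((xi - x_pred) / bandwidth) ** 2) for xi in x]
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * yi for w, yi in zip(weights, y)) / total
```

Because the fit is local, the estimator imposes no functional form on the predictive relationship, which is the source of the robustness to nonlinearity the abstract emphasizes.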
By:  Peter C.B. Phillips (Cowles Foundation, Yale University) 
Abstract:  A prominent use of local to unity limit theory in applied work is the construction of confidence intervals for autoregressive roots through inversion of the ADF t statistic associated with a unit root test, as suggested in Stock (1991). Such confidence intervals are valid when the true model has an autoregressive root that is local to unity (rho = 1 + (c/n)) but are invalid at the limits of the domain of definition of the localizing coefficient c because of a failure in tightness and the escape of probability mass. Consideration of the boundary case shows that these confidence intervals are invalid for stationary autoregression, where they manifest locational bias and width distortion. In particular, the coverage probability of these intervals tends to zero as c approaches negative infinity, and the width of the intervals exceeds the width of intervals constructed in the usual way under stationarity. Some implications of these results for predictive regression tests are explored. It is shown that when the regressor has autoregressive coefficient rho < 1 and the sample size n approaches infinity, the Campbell and Yogo (2006) confidence intervals for the regression coefficient have zero coverage probability asymptotically, and their predictive test statistic Q erroneously indicates predictability with probability approaching unity when the null of no predictability holds. These results have obvious implications for empirical practice.
Keywords:  Autoregressive root, Confidence belt, Confidence interval, Coverage probability, Local to unity, Localizing coefficient, Predictive regression, Tightness 
JEL:  C22 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1879&r=ets 
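The local-to-unity parametrization rho = 1 + c/n places the autoregressive root in a shrinking neighborhood of one; the stationary boundary case the paper studies corresponds to c going off to negative infinity. A minimal simulation sketch, with illustrative values of n and c:

```python
import random

def simulate_local_to_unity(n, c, seed=0):
    """Simulate y_t = rho * y_{t-1} + e_t with rho = 1 + c/n.
    c = 0 is a unit root; c < 0 is the (locally) stationary side."""
    rng = random.Random(seed)
    rho = 1.0 + c / n
    y = [0.0]
    for _ in range(n - 1):
        y.append(rho * y[-1] + rng.gauss(0, 1))
    return y, rho

def ols_ar1(y):
    """OLS estimate of rho in y_t = rho * y_{t-1} + e_t (no intercept)."""
    num = sum(y[t] * y[t - 1] for t in range(1, len(y)))
    den = sum(y[t - 1] ** 2 for t in range(1, len(y)))
    return num / den
```

The paper's point is that confidence intervals built by inverting the ADF t statistic under this parametrization break down precisely when the data are generated from a fixed stationary rho rather than a root drifting toward unity.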
By:  Yang Yan; Dajing Shang; Oliver Linton (Institute for Fiscal Studies and Cambridge University) 
Abstract:  This paper proposes efficient estimators of risk measures in a semiparametric GARCH model defined through moment constraints. Moment constraints are often used to identify and estimate the mean and variance parameters but are discarded when estimating error quantiles. In order to prevent this efficiency loss in quantile estimation, we propose a quantile estimator based on inverting an empirical likelihood weighted distribution estimator. It is found that the new quantile estimator is uniformly more efficient than the simple empirical quantile and a quantile estimator based on normalized residuals. At the same time, the efficiency gain in error quantile estimation hinges on the efficiency of estimators of the variance parameters. We show that the same conclusion applies to the estimation of conditional Expected Shortfall. Our comparison also leads to interesting implications of residual bootstrap for dynamic models. We find that these proposed estimators for conditional Value-at-Risk and Expected Shortfall are asymptotically mixed normal. This asymptotic theory can be used to construct confidence bands for these estimators by taking account of parameter uncertainty. Simulation evidence as well as empirical results are provided.
Keywords:  Empirical Likelihood; Empirical process; GARCH; Quantile; Value-at-Risk; Expected Shortfall.
JEL:  C14 C22 G22 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:25/12&r=ets 
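The benchmark the paper improves on is the plug-in VaR estimator: rescale returns by fitted volatilities, take a simple empirical quantile of the residuals, and scale by the next-period volatility. A minimal sketch of that baseline (the paper's estimator replaces the simple empirical quantile with an empirical-likelihood weighted one, which is not shown here):

```python
import math

def empirical_quantile(xs, alpha):
    """Simple empirical alpha-quantile: order statistic at ceil(alpha * n)."""
    s = sorted(xs)
    k = max(0, min(len(s) - 1, math.ceil(alpha * len(s)) - 1))
    return s[k]

def residual_var_forecast(returns, sigmas, sigma_next, alpha=0.05):
    """Plug-in conditional VaR forecast for a GARCH-type model:
    rescale returns by fitted volatilities sigmas, take the empirical
    alpha-quantile of the residuals, scale by next-period volatility."""
    resid = [r / s for r, s in zip(returns, sigmas)]
    return sigma_next * empirical_quantile(resid, alpha)
```

The efficiency loss the abstract describes comes from the `empirical_quantile` step ignoring the moment constraints that identified the variance parameters in the first place.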
By:  Degui Li; Oliver Linton (Institute for Fiscal Studies and Cambridge University); Zudi Lu 
Abstract:  We consider approximating a multivariate regression function by an affine combination of one-dimensional conditional component regression functions. The weight parameters involved in the approximation are estimated by least squares on the first-stage nonparametric kernel estimates. We establish asymptotic normality for the estimated weights and the regression function in two cases: when the number of covariates is finite, and when it is diverging. As the observations are assumed to be stationary and near epoch dependent, the approach in this paper is applicable to estimation and forecasting issues in time series analysis. Furthermore, the methods and results are augmented by a simulation study and illustrated by an application to the Australian annual mean temperature anomaly series. We also apply our methods to high frequency volatility forecasting, where we obtain superior results to parametric methods.
Keywords:  Asymptotic normality, model averaging, Nadaraya-Watson kernel estimation, near epoch dependence, semiparametric method.
JEL:  C14 C22 
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:ifs:cemmap:28/12&r=ets 
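The second stage of the procedure is an ordinary least-squares problem: given first-stage kernel fits of each one-dimensional component regression, choose the affine-combination weights that best reproduce the outcome. A minimal sketch (the first-stage kernel fits are taken as given inputs here):

```python
def ls_weights(component_fits, y):
    """Least-squares weights w minimizing sum_t (y_t - sum_j w_j m_j(x_t))^2,
    where component_fits[j][t] is the first-stage kernel estimate of the
    j-th one-dimensional component regression at observation t.
    Solves the normal equations (M'M) w = M'y by Gaussian elimination."""
    p, n = len(component_fits), len(y)
    A = [[sum(component_fits[i][t] * component_fits[j][t] for t in range(n))
          for j in range(p)] for i in range(p)]
    b = [sum(component_fits[i][t] * y[t] for t in range(n)) for i in range(p)]
    # forward elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # back substitution
    w = [0.0] * p
    for r in range(p - 1, -1, -1):
        w[r] = (b[r] - sum(A[r][c] * w[c] for c in range(r + 1, p))) / A[r][r]
    return w
```

The paper's asymptotic theory concerns exactly these weights, including the case where the number of components grows with the sample size.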
By:  Sebastien Valeyre; Denis Grebenkov; Sofiane Aboura; Qian Liu 
Abstract:  We present a new volatility model, simple to implement, that combines various attractive features such as an exponential moving average of the price and a leverage effect. This model is able to capture the so-called "panic effect", which occurs whenever systematic risk becomes the dominant factor. Consequently, in contrast to other models, this new model is as reactive as the implied volatility indices. We also test the reactivity of our model using extreme events taken from the 470 most liquid European stocks over the last decade. We show that the reactive volatility model is more robust to extreme events, and it allows for the identification of precursors and replicas of extreme events.
Date:  2012–09 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1209.5190&r=ets 
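To illustrate the two ingredients the abstract names, a toy sketch combining an exponentially weighted moving-average variance with a leverage adjustment that overweights negative returns. This is a generic illustration of those two features, not the paper's actual model, and the parameter values are purely illustrative:

```python
import math

def ewma_leverage_vol(returns, lam=0.94, gamma=0.5):
    """EWMA variance recursion with a simple leverage effect: negative
    returns receive extra weight (1 + gamma), so the volatility estimate
    reacts faster to downside shocks, mimicking the 'panic effect'.
    lam and gamma are illustrative, not calibrated values."""
    var = returns[0] ** 2 if returns else 0.0
    vols = []
    for r in returns:
        boost = (1.0 + gamma) if r < 0 else 1.0
        var = lam * var + (1.0 - lam) * boost * r * r
        vols.append(math.sqrt(var))
    return vols
```

The asymmetry is what lets such a filter track implied-volatility indices more closely than a symmetric moving average during sell-offs.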
By:  Christian Francq (Crest and University Lille 3); Jean-Michel Zakoian (Crest and University Lille 3)
Abstract:  In conditionally heteroskedastic models, the optimal prediction of powers, or logarithms, of the absolute value has a simple expression in terms of the volatility and an expectation involving the independent process. A natural procedure for estimating this prediction is to estimate the volatility in a first step, for instance by Gaussian quasi-maximum likelihood (QML) or by least-absolute deviations, and to use empirical means based on rescaled innovations to estimate the expectation in a second step. This paper proposes an alternative one-step procedure, based on an appropriate non-Gaussian QML estimator, and establishes the asymptotic properties of the two approaches. Asymptotic comparisons and numerical experiments show that the differences in accuracy can be important, depending on the prediction problem and the innovations distribution. An application to indexes of major stock exchanges is given.
Keywords:  Efficiency of estimators, GARCH, Least-absolute deviations estimation, Prediction, Quasi-maximum likelihood estimation
Date:  2012–08 
URL:  http://d.repec.org/n?u=RePEc:crs:wpaper:201217&r=ets 
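The two-step procedure the abstract describes can be sketched directly: the optimal predictor of |y_{t+1}|^r factors as sigma_{t+1}^r times E|eps|^r, so after a first-step volatility fit (assumed already done here), the expectation is estimated by an empirical mean of rescaled innovations:

```python
def two_step_power_forecast(returns, sigmas, sigma_next, power=1.0):
    """Two-step predictor of |y_{t+1}|^r in a conditionally
    heteroskedastic model: E_t |y_{t+1}|^r = sigma_{t+1}^r * E|eps|^r.
    Step 1 (assumed done): sigmas are fitted volatilities, e.g. from
    Gaussian QML. Step 2: estimate E|eps|^r by the empirical mean of
    the rescaled innovations y_t / sigma_t."""
    resid_moment = sum(abs(r / s) ** power
                       for r, s in zip(returns, sigmas)) / len(returns)
    return (sigma_next ** power) * resid_moment
```

The paper's alternative one-step procedure estimates the same quantity through a non-Gaussian QML criterion instead of this second-step empirical mean; the sketch above only shows the two-step benchmark.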