nep-ets New Economics Papers
on Econometric Time Series
Issue of 2014‒11‒07
ten papers chosen by
Yong Yin
SUNY at Buffalo

  1. Bayesian Tail Risk Forecasting using Realised GARCH By Contino, Christian; Gerlach, Richard H.
  2. Bootstrapping integrated covariance matrix estimators in noisy jump-diffusion models with non-synchronous trading By Ulrich Hounyo
  3. Estimating Dynamic Equilibrium Models with Stochastic Volatility By Jesús Fernández-Villaverde; Pablo Guerrón-Quintana; Juan F. Rubio-Ramírez
  4. Evaluating Conditional Forecasts from Vector Autoregressions By Clark, Todd E.; McCracken, Michael W.
  5. Log versus Level in VAR Forecasting: 42 Million Empirical Answers - Expect the Unexpected By Johannes Mayr; Dirk Ulbricht
  6. Model uncertainty in panel vector autoregressive models By Gary Koop; Dimitris Korobilis
  7. Probability density of the wavelet coefficients of a noisy chaos By Matthieu Garcin; Dominique Guegan
  8. Random switching exponential smoothing and inventory forecasting By Giacomo Sbrana; Andrea Silvestrini
  9. Real-Time Factor Model Forecasting and the Effects of Instability By Michael P. Clements
  10. Specification Tests for Nonlinear Dynamic Models By Igor Kheifets

  1. By: Contino, Christian; Gerlach, Richard H.
    Abstract: A Realised Volatility GARCH model is developed within a Bayesian framework for the purpose of forecasting Value at Risk and Conditional Value at Risk. Student-t and Skewed Student-t return distributions are combined with Gaussian and Student-t distributions in the measurement equation in a GARCH framework to forecast tail risk in eight international equity index markets over a four-year period. Three Realised Volatility proxies are considered within this framework. Realised Volatility GARCH models show a marked improvement compared to ordinary GARCH for both Value at Risk and Conditional Value at Risk forecasting. This improvement is consistent across a variety of data, volatility model specifications and distributions, and demonstrates that Realised Volatility is superior for producing volatility forecasts. Realised Volatility models implementing a Skewed Student-t distribution for returns in the GARCH equation are favoured.
    Keywords: Risk Management; Expected Shortfall; High-Frequency Data; CVaR; Value-at-Risk; GARCH; Realised Volatility
    Date: 2014–10–10
    URL: http://d.repec.org/n?u=RePEc:syb:wpbsba:2123/12060&r=ets
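    Illustrative sketch (not from the paper): once a realised-GARCH fit delivers a one-step-ahead conditional volatility, tail-risk forecasts follow from the tail of the assumed return distribution. The sketch below uses a plain Student-t tail; the volatility forecast, degrees of freedom and tail level are all placeholder assumptions.

        import numpy as np
        from scipy import stats

        # Placeholder one-step-ahead volatility forecast (e.g. from a realised-GARCH fit)
        sigma_fcst = 0.012   # assumed daily conditional volatility
        nu = 8.0             # assumed Student-t degrees of freedom
        alpha = 0.01         # 1% tail

        # Quantile of a unit-variance (standardised) Student-t
        t_q = stats.t.ppf(alpha, df=nu)
        scale = np.sqrt((nu - 2.0) / nu)          # rescales t(nu) to unit variance
        var_1pct = sigma_fcst * scale * t_q       # Value at Risk (a negative return)

        # Expected Shortfall / CVaR from the standard closed form for the t tail
        es_std = -stats.t.pdf(t_q, df=nu) * (nu + t_q**2) / ((nu - 1.0) * alpha)
        es_1pct = sigma_fcst * scale * es_std     # CVaR, also a negative return

        print(f"1% VaR:  {var_1pct:.4%}")
        print(f"1% CVaR: {es_1pct:.4%}")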
  2. By: Ulrich Hounyo (Oxford-Man Institute, University of Oxford, and Aarhus University and CREATES)
    Abstract: We propose a bootstrap method for estimating the distribution (and functionals of it such as the variance) of various integrated covariance matrix estimators. In particular, we first adapt the wild blocks of blocks bootstrap method suggested for the pre-averaged realized volatility estimator to a general class of estimators of integrated covolatility. We then show the first-order asymptotic validity of this method in the multivariate context in the potential presence of jumps, dependent microstructure noise, and irregularly spaced, non-synchronous data. Due to our focus on nonstudentized statistics, our results justify using the bootstrap to estimate the covariance matrix of a broad class of covolatility estimators. The bootstrap variance estimator is positive semi-definite by construction, an appealing feature that is not always shared by existing variance estimators of the integrated covariance estimator. As an application of our results, we also consider the bootstrap for regression coefficients. We show that the wild blocks of blocks bootstrap, appropriately centered, is able to mimic both the dependence and heterogeneity of the scores, thus justifying the construction of bootstrap percentile intervals as well as variance estimates in this context. This contrasts with the traditional pairs bootstrap, which is not able to mimic the score heterogeneity even in the simple case where no microstructure noise is present. Our Monte Carlo simulations show that the wild blocks of blocks bootstrap improves the finite sample properties of the existing first-order asymptotic theory. We illustrate its practical use on high-frequency equity data.
    Keywords: High-frequency data, market microstructure noise, non-synchronous data, jumps, realized measures, integrated covariance, wild bootstrap, block bootstrap
    JEL: C15 C22 C58
    Date: 2014–10–07
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-35&r=ets
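    Illustrative sketch (not the paper's exact scheme): a wild, block-based resampling of local contributions to a two-asset realised covariance matrix, yielding bootstrap standard errors that are positive by construction. The synthetic returns, block length and exponential weight distribution are assumptions; the paper's pre-averaging and centring steps are omitted.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic noisy high-frequency returns for two assets (placeholder data)
        n, block = 2000, 20
        chol = np.array([[1.0, 0.0], [0.4, 0.9]])
        r = rng.standard_normal((n, 2)) @ chol.T * 1e-3

        # Block-wise contributions to the realised covariance matrix
        blocks = r[: (n // block) * block].reshape(-1, block, 2)
        contrib = np.array([b.T @ b for b in blocks])          # one 2x2 contribution per block
        rc_hat = contrib.sum(axis=0)                           # realised covariance estimate

        # Wild (external-weight) resampling of the block contributions; the weight
        # distribution here is a simplification of the paper's scheme
        B = 999
        boot = np.empty((B, 2, 2))
        for b in range(B):
            eta = rng.exponential(1.0, size=len(contrib))      # i.i.d. weights with mean 1
            boot[b] = (contrib * eta[:, None, None]).sum(axis=0)

        print("realised covariance:\n", rc_hat)
        print("bootstrap standard errors:\n", boot.std(axis=0))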
  3. By: Jesús Fernández-Villaverde; Pablo Guerrón-Quintana; Juan F. Rubio-Ramírez
    Abstract: This paper develops a particle filtering algorithm to estimate dynamic equilibrium models with stochastic volatility using a likelihood-based approach. The algorithm, which exploits the structure and profusion of shocks in stochastic volatility models, is versatile and computationally tractable even in large-scale models. As an application, we use our algorithm and Bayesian methods to estimate a business cycle model of the U.S. economy with both stochastic volatility and parameter drifting in monetary policy. Our application shows the importance of stochastic volatility in accounting for the dynamics of the data.
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:fda:fdaddt:2014-11&r=ets
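    Illustrative sketch (a toy stand-in for the paper's large-scale algorithm): a bootstrap particle filter that evaluates the likelihood of a basic univariate stochastic-volatility model; all parameter values and the simulated data are assumptions.

        import numpy as np

        def sv_particle_loglik(y, mu=-9.0, phi=0.95, sig=0.2, n_part=2000, seed=0):
            """Bootstrap particle filter log-likelihood for a basic stochastic-volatility
            model: y_t = exp(h_t/2)*eps_t, h_t = mu + phi*(h_{t-1}-mu) + sig*eta_t."""
            rng = np.random.default_rng(seed)
            h = mu + sig / np.sqrt(1 - phi**2) * rng.standard_normal(n_part)  # stationary initial draw
            loglik = 0.0
            for yt in y:
                h = mu + phi * (h - mu) + sig * rng.standard_normal(n_part)   # propagate particles
                logw = -0.5 * (np.log(2 * np.pi) + h + yt**2 * np.exp(-h))    # Gaussian measurement density
                m = logw.max()
                w = np.exp(logw - m)
                loglik += m + np.log(w.mean())                                # likelihood increment
                h = rng.choice(h, size=n_part, p=w / w.sum())                 # multinomial resampling
            return loglik

        # Simulate a short series from the assumed model and evaluate its likelihood
        rng = np.random.default_rng(1)
        T, mu, phi, sig = 500, -9.0, 0.95, 0.2
        h = np.empty(T); h[0] = mu
        for t in range(1, T):
            h[t] = mu + phi * (h[t-1] - mu) + sig * rng.standard_normal()
        y = np.exp(h / 2) * rng.standard_normal(T)
        print("particle-filter log-likelihood:", sv_particle_loglik(y))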
  4. By: Clark, Todd E. (Federal Reserve Bank of Cleveland); McCracken, Michael W. (Federal Reserve Bank of St. Louis)
    Abstract: Many forecasts are conditional in nature. For example, a number of central banks routinely report forecasts conditional on particular paths of policy instruments. Even though conditional forecasting is common, there has been little work on methods for evaluating conditional forecasts. This paper provides analytical, Monte Carlo, and empirical evidence on tests of predictive ability for conditional forecasts from estimated models. In the empirical analysis, we consider forecasts of growth, unemployment, and inflation from a VAR, conditioned on the path of the short-term interest rate. Throughout the analysis, we focus on tests of bias, efficiency, and equal accuracy applied to conditional forecasts from VAR models.
    Keywords: Prediction; forecasting; out-of-sample
    JEL: C12 C32 C52 C53
    Date: 2014–10–02
    URL: http://d.repec.org/n?u=RePEc:fip:fedcwp:1413&r=ets
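    Illustrative sketch (not the paper's tests): a standard bias regression and a Mincer-Zarnowitz efficiency regression with HAC standard errors, applied to a placeholder series of conditional forecasts and realisations. The paper's tests handle the conditioning and estimation error more carefully.

        import numpy as np
        import statsmodels.api as sm

        # Placeholder arrays standing in for realised values and matching conditional forecasts
        rng = np.random.default_rng(0)
        forecasts = rng.standard_normal(120)
        realised = forecasts + 0.3 * rng.standard_normal(120)

        # Bias test: regress the forecast error on a constant (H0: intercept = 0)
        err = realised - forecasts
        bias = sm.OLS(err, np.ones_like(err)).fit(cov_type="HAC", cov_kwds={"maxlags": 4})

        # Efficiency (Mincer-Zarnowitz) test: realised on constant + forecast,
        # H0: intercept = 0 and slope = 1, with HAC standard errors
        X = sm.add_constant(forecasts)
        mz = sm.OLS(realised, X).fit(cov_type="HAC", cov_kwds={"maxlags": 4})
        print("bias t-stat:", bias.tvalues[0])
        print("MZ joint test:", mz.f_test("const = 0, x1 = 1"))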
  5. By: Johannes Mayr; Dirk Ulbricht
    Abstract: The use of log-transformed data has become standard in macroeconomic forecasting with VAR models. However, its appropriateness in the context of out-of-sample forecasts has not yet been subjected to a thorough empirical investigation. With the aim of filling this void, a broad sample of VAR models is employed in a multi-country set-up and approximately 42 million pseudo-out-of-sample forecasts of GDP are evaluated. The results show that, on average, the knee-jerk log transformation of the data is at best harmless.
    Keywords: VAR-forecasting, Logarithmic transformation
    JEL: C52 C53
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1412&r=ets
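    Illustrative sketch of the log-versus-level comparison: one-step-ahead pseudo-out-of-sample forecasts from a small VAR fitted in levels and in logs, with log forecasts mapped back to levels before evaluation. The synthetic series, lag length and evaluation window are assumptions.

        import numpy as np
        from statsmodels.tsa.api import VAR

        rng = np.random.default_rng(0)

        # Synthetic positive macro series standing in for GDP and a second indicator
        T = 160
        gdp = 100 * np.exp(np.cumsum(0.005 + 0.010 * rng.standard_normal(T)))
        ind = 50 * np.exp(np.cumsum(0.004 + 0.012 * rng.standard_normal(T)))
        data = np.column_stack([gdp, ind])

        def pseudo_oos_rmse(series, in_logs, p=2, start=120):
            """One-step-ahead pseudo-out-of-sample RMSE for the first variable,
            fitting the VAR in levels or in logs but always evaluating in levels."""
            y = np.log(series) if in_logs else series
            errs = []
            for t in range(start, len(series) - 1):
                res = VAR(y[:t]).fit(p)
                fc = res.forecast(y[t - p:t], steps=1)[0, 0]
                fc_level = np.exp(fc) if in_logs else fc
                errs.append(series[t, 0] - fc_level)
            return np.sqrt(np.mean(np.square(errs)))

        print("levels RMSE:", pseudo_oos_rmse(data, in_logs=False))
        print("logs   RMSE:", pseudo_oos_rmse(data, in_logs=True))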
  6. By: Gary Koop; Dimitris Korobilis
    Abstract: We develop methods for Bayesian model averaging (BMA) or selection (BMS) in Panel Vector Autoregressions (PVARs). Our approach allows us to select between or average over all possible combinations of restricted PVARs where the restrictions involve interdependencies between and heterogeneities across cross-sectional units. The resulting BMA framework can find a parsimonious PVAR specification, thus dealing with overparameterization concerns. We use these methods in an application involving the euro area sovereign debt crisis and show that our methods perform better than alternatives. Our findings contradict a simple view of the sovereign debt crisis which divides the euro zone into groups of core and peripheral countries and worries about financial contagion within the latter group.
    Keywords: Bayesian model averaging, stochastic search variable selection, financial contagion, sovereign debt crisis
    JEL: C11 C33 C52 G10
    Date: 2014–08
    URL: http://d.repec.org/n?u=RePEc:gla:glaewp:2014_10&r=ets
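    Illustrative sketch (a crude stand-in for the paper's BMA machinery): BIC-based approximate model probabilities over two versions of one panel-VAR equation, with and without a cross-country lag, combined into a probability-weighted forecast. The toy data and the BIC approximation to the marginal likelihood are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy two-country panel VAR(1): country 1 loads on country 2's lag, not vice versa
        T = 200
        y = np.zeros((T, 2))
        for t in range(1, T):
            y[t, 0] = 0.5 * y[t-1, 0] + 0.2 * y[t-1, 1] + rng.standard_normal()
            y[t, 1] = 0.4 * y[t-1, 1] + rng.standard_normal()

        def fit_bic(X, z):
            """OLS fit of z on X and the BIC, used as a rough marginal-likelihood proxy."""
            beta, *_ = np.linalg.lstsq(X, z, rcond=None)
            resid = z - X @ beta
            n, k = X.shape
            return beta, n * np.log(resid @ resid / n) + k * np.log(n)

        # Candidate models for country 1's equation: with and without the cross-country lag
        z = y[1:, 0]
        X_full = np.column_stack([np.ones(T - 1), y[:-1, 0], y[:-1, 1]])   # interdependent equation
        X_rest = X_full[:, :2]                                             # restriction: no spillover

        fits = [fit_bic(X_rest, z), fit_bic(X_full, z)]
        bic = np.array([f[1] for f in fits])
        w = np.exp(-0.5 * (bic - bic.min()))
        w /= w.sum()                                                        # approximate model probabilities

        # Model-averaged one-step-ahead forecast for country 1
        x_last = np.array([1.0, y[-1, 0], y[-1, 1]])
        fc = np.array([fits[0][0] @ x_last[:2], fits[1][0] @ x_last])
        print("model weights:", w.round(3), "  BMA forecast:", round(float(w @ fc), 3))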
  7. By: Matthieu Garcin (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: We are interested in the random wavelet coefficients of a noisy signal when this signal is the unidimensional or multidimensional attractor of a chaotic system. More precisely, we give an expression for the probability density of such coefficients. If the noise is dynamic noise, our expression is exact. In the case of measurement noise, we propose two approximations using a Taylor or an Edgeworth expansion. We give some illustrations of these theoretical results for the logistic map, the tent map and the Hénon map, perturbed by Gaussian or Cauchy noise.
    Keywords: Wavelets; dynamical systems; chaos; noise; alpha-stable
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:hal:journl:hal-00800997&r=ets
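    Illustrative sketch: finest-scale Haar wavelet coefficients of a logistic-map trajectory observed with additive Gaussian measurement noise, together with a histogram approximation to their density. The map, noise level and wavelet choice are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Logistic-map trajectory with additive Gaussian measurement noise (values assumed)
        n = 4096
        x = np.empty(n); x[0] = 0.3
        for t in range(1, n):
            x[t] = 4.0 * x[t-1] * (1.0 - x[t-1])          # chaotic logistic map
        y = x + 0.05 * rng.standard_normal(n)              # measurement noise

        # Haar wavelet detail coefficients at the finest scale: (y_{2k+1} - y_{2k}) / sqrt(2)
        coeffs = (y[1::2] - y[0::2]) / np.sqrt(2.0)

        # Empirical density of the coefficients (histogram approximation)
        hist, edges = np.histogram(coeffs, bins=60, density=True)
        centres = 0.5 * (edges[:-1] + edges[1:])
        print("coefficient std:", coeffs.std())
        print("density peak near:", centres[hist.argmax()])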
  8. By: Giacomo Sbrana (NEOMA Business School); Andrea Silvestrini (Bank of Italy, Economic Research Department)
    Abstract: Exponential smoothing models are an important prediction tool in macroeconomics, finance and business. This paper presents the analytical forecasting properties of the random coefficient exponential smoothing model in the multiple-source-of-error framework. The random coefficient state-space representation allows for switching between simple exponential smoothing and the local linear trend. Therefore, it is possible to control, in a flexible manner, the randomly changing dynamic behaviour of the time series. The paper establishes the algebraic mapping between the state-space parameters and the implied reduced-form ARIMA parameters. In addition, it shows that this parametric mapping surmounts the difficulties that are likely to emerge in a direct estimation of the random coefficient state-space model. Finally, it presents an empirical application comparing the forecast accuracy of the suggested model vis-à-vis other benchmark models, both in the ARIMA and in the Exponential Smoothing class. Using time series on wholesalers’ inventories in the USA, the out-of-sample results show that the reduced form of the random coefficient exponential smoothing model tends to be superior to its competitors.
    Keywords: exponential smoothing, ARIMA, inventory, forecasting.
    Date: 2014–07
    URL: http://d.repec.org/n?u=RePEc:bdi:wptemi:td_971_14&r=ets
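    Illustrative sketch (not the paper's estimator): a series whose local linear trend switches on and off at random, mimicking the random coefficient, forecast with simple exponential smoothing and with Holt's additive-trend model from statsmodels. All simulation settings are assumptions.

        import numpy as np
        from statsmodels.tsa.holtwinters import SimpleExpSmoothing, Holt

        rng = np.random.default_rng(0)

        # Simulate a level whose slope is switched on and off by a Bernoulli draw,
        # mimicking the random coefficient that toggles SES vs. local linear trend
        T, p_trend = 300, 0.3
        level, y = 100.0, np.empty(T)
        for t in range(T):
            slope = 0.5 + 0.1 * rng.standard_normal() if rng.random() < p_trend else 0.0
            level += slope + rng.standard_normal()
            y[t] = level + rng.standard_normal()

        # Compare out-of-sample forecast accuracy of the two smoothing benchmarks
        train, test = y[:-12], y[-12:]
        ses = SimpleExpSmoothing(train).fit()
        holt = Holt(train).fit()
        for name, model in [("SES ", ses), ("Holt", holt)]:
            rmse = np.sqrt(np.mean((test - model.forecast(12)) ** 2))
            print(name, "12-step RMSE:", round(rmse, 2))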
  9. By: Michael P. Clements (ICMA Centre, Henley Business School, University of Reading)
    Abstract: We show that factor forecasting models deliver real-time gains over autoregressive models for US real activity variables during the recent period, but are less successful for nominal variables. The gains are largely due to the Financial Crisis period, and are primarily at the shortest (one quarter ahead) horizon. Excluding the pre-Great Moderation years from the factor forecasting model estimation period (but not from the data used to extract factors) results in a marked fillip in factor model forecast accuracy, but does the same for the AR model forecasts. The relative performance of the factor models compared to the AR models is largely unaffected by whether the exercise is in real time or is pseudo out-of-sample.
    Keywords: Factor Models, Robust Approaches, Financial Crisis
    JEL: C51 C22
    Date: 2014–05
    URL: http://d.repec.org/n?u=RePEc:rdg:icmadp:icma-dp2014-05&r=ets
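    Illustrative sketch: principal-component factors extracted from a synthetic panel and used in a one-step diffusion-index forecast, compared with an AR(1) benchmark. The panel, number of factors and regression design are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic panel of 40 predictors driven by two common factors (all assumed)
        T, N = 200, 40
        f = rng.standard_normal((T, 2)).cumsum(axis=0) * 0.1
        lam = rng.standard_normal((2, N))
        X = f @ lam + rng.standard_normal((T, N))
        y = 0.4 * np.roll(f[:, 0], 1) + 0.2 * rng.standard_normal(T)   # target driven by lagged factor

        # Principal-component factor estimates from the standardised panel
        Z = (X - X.mean(0)) / X.std(0)
        _, _, Vt = np.linalg.svd(Z, full_matrices=False)
        F_hat = Z @ Vt[:2].T                                            # first two estimated factors

        def one_step_forecast(X_reg, target):
            """Regress target[t] on X_reg[t-1] over the estimation sample, then
            forecast the final period from the penultimate regressor row."""
            beta, *_ = np.linalg.lstsq(X_reg[:-2], target[1:-1], rcond=None)
            return X_reg[-2] @ beta

        X_fac = np.column_stack([np.ones(T), y, F_hat])   # AR term plus estimated factors
        X_ar = np.column_stack([np.ones(T), y])           # AR(1) benchmark
        print("actual:", y[-1])
        print("factor forecast:", one_step_forecast(X_fac, y))
        print("AR(1)  forecast:", one_step_forecast(X_ar, y))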
  10. By: Igor Kheifets (New Economic School, Moscow)
    Abstract: We propose a new adequacy test and a graphical evaluation tool for nonlinear dynamic models. The proposed techniques can be applied in any setup where a parametric conditional distribution of the data is specified, in particular to models involving conditional volatility, conditional higher moments, conditional quantiles, asymmetry, Value at Risk models, duration models, diffusion models, etc. Compared to other tests, the new test properly controls for the nonlinear dynamic behavior in the conditional distribution and does not rely on smoothing techniques, which require a choice of several tuning parameters. The test is based on a new kind of multivariate empirical process of contemporaneous and lagged probability integral transforms. We establish weak convergence of the process under parameter uncertainty and local alternatives. We justify a parametric bootstrap approximation that accounts for parameter estimation effects often ignored in practice. Monte Carlo experiments show that the test has good finite-sample size and power properties. Using the new test and graphical tools, we check the adequacy of various popular heteroscedastic models for stock exchange index data.
    Keywords: Conditional distribution, Time series, Goodness-of-fit, Empirical process, Weak convergence, Parameter uncertainty, Probability integral transform
    Date: 2014–10
    URL: http://d.repec.org/n?u=RePEc:cfr:cefirw:w0209&r=ets
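    Illustrative sketch (only the first ingredient of the paper's test): probability integral transforms of GARCH-type data under a candidate Gaussian conditional distribution, with crude checks of uniformity and serial dependence. The full test uses contemporaneous and lagged PITs jointly and a parametric bootstrap for parameter-estimation effects; the simulated data-generating process below is an assumption.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Simulate a GARCH(1,1) with standardised Student-t(5) shocks (assumed DGP)
        T, omega, a, b = 1000, 0.05, 0.1, 0.85
        eps = stats.t.rvs(5, size=T, random_state=rng) * np.sqrt(3.0 / 5.0)   # unit-variance t shocks
        h, y = np.empty(T), np.empty(T)
        h[0] = omega / (1 - a - b)
        y[0] = np.sqrt(h[0]) * eps[0]
        for t in range(1, T):
            h[t] = omega + a * y[t-1]**2 + b * h[t-1]
            y[t] = np.sqrt(h[t]) * eps[t]

        # Candidate specification: same conditional variance, Gaussian conditional distribution.
        # Under a correct conditional distribution the PITs u_t are i.i.d. U(0,1).
        u = stats.norm.cdf(y / np.sqrt(h))

        # Crude uniformity and serial-dependence checks on the PITs
        print(stats.kstest(u, "uniform"))
        print("corr(u_t, u_{t-1}):", round(float(np.corrcoef(u[1:], u[:-1])[0, 1]), 3))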

This nep-ets issue is ©2014 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.