nep-ets New Economics Papers
on Econometric Time Series
Issue of 2014‒12‒08
four papers chosen by
Yong Yin
SUNY at Buffalo

  1. Assessing Point Forecast Accuracy by Stochastic Error Distance By Francis X. Diebold; Minchul Shin
  2. Coupling high-frequency data with nonlinear models in multiple-step-ahead forecasting of energy markets' volatility By Jozef Baruník; Tomáš Křehlík
  3. Marginalized predictive likelihood comparisons of linear Gaussian state-space models with applications to DSGE, DSGE-VAR, and VAR models By Warne, Anders; Coenen, Günter; Christoffel, Kai
  4. Vector Autoregressions with Parsimoniously Time Varying Parameters and an Application to Monetary Policy By Laurent Callot; Johannes Tang Kristensen

  1. By: Francis X. Diebold (Department of Economics, University of Pennsylvania); Minchul Shin (Department of Economics, University of Pennsylvania)
    Abstract: We propose point forecast accuracy measures based directly on the distance of the forecast-error c.d.f. from the unit step function at 0 ("stochastic error distance," or SED). We provide a precise characterization of the relationship between SED and standard predictive loss functions, showing that all such loss functions can be written as weighted SEDs. The leading case is absolute-error loss, in which the SED weights are unity, establishing its primacy. Among other things, this suggests shifting attention away from conditional-mean forecasts and toward conditional-median forecasts.
    Keywords: Forecast accuracy, forecast evaluation, absolute-error loss, quadratic loss, squared-error loss
    JEL: C53
    Date: 2014–11–02
    URL: http://d.repec.org/n?u=RePEc:pen:papers:14-038&r=ets
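    A quick numerical sketch of the SED idea (an editorial illustration, not the authors' code; the simulated errors and the discretization grid are assumptions): taking SED as the L1 distance between the empirical forecast-error c.d.f. and the unit step at 0, it coincides with the sample mean absolute error, which is the unit-weight case singled out in the abstract.

      import numpy as np

      # Hypothetical forecast errors; any sample would do for the identity below.
      rng = np.random.default_rng(0)
      errors = rng.normal(loc=0.3, scale=1.0, size=5000)

      # Empirical c.d.f. of the errors on a fine grid, and the unit step at zero.
      grid = np.linspace(errors.min() - 1.0, errors.max() + 1.0, 100_000)
      ecdf = np.searchsorted(np.sort(errors), grid, side="right") / errors.size
      step = (grid >= 0).astype(float)

      sed = np.trapz(np.abs(ecdf - step), grid)    # L1 distance = stochastic error distance
      mae = np.abs(errors).mean()                  # absolute-error loss
      print(f"SED = {sed:.4f}   MAE = {mae:.4f}")  # equal up to discretization error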
  2. By: Jozef Baruník (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nábřeží 6, 111 01 Prague 1, Czech Republic; Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Pod Vodarenskou Vezi 4, 182 00, Prague, Czech Republic); Tomáš Křehlík (Institute of Economic Studies, Faculty of Social Sciences, Charles University in Prague, Smetanovo nábřeží 6, 111 01 Prague 1, Czech Republic; Institute of Information Theory and Automation, Academy of Sciences of the Czech Republic, Pod Vodarenskou Vezi 4, 182 00, Prague, Czech Republic)
    Abstract: In the past decade, the popularity of realized measures and various linear models for volatility forecasting has attracted attention in the literature on the price variability of energy markets. However, results that would guide practitioners to a specific estimator and model when aiming for the best forecasting accuracy are missing. This paper contributes to the ongoing debate with a comprehensive evaluation of multiple-step-ahead volatility forecasts of energy markets using several popular high-frequency measures and forecasting models. To capture the complex patterns hidden from linear models commonly used to forecast realized volatility, this paper also contributes to the literature by coupling realized measures with artificial neural networks as a forecasting tool. Forecasting performance is compared across models as well as realized measures of crude oil, heating oil, and natural gas volatility during three qualitatively distinct periods covering the pre-crisis period, the recent global turmoil of markets in 2008, and the most recent post-crisis period. We conclude that coupling realized measures with artificial neural networks results in both statistical and economic gains, reducing the tendency to over-predict volatility uniformly during all tested periods. Our analysis favors the median realized volatility, as it delivers the best performance and is a computationally simple alternative for practitioners.
    Keywords: artificial neural networks, realized volatility, multiple-step-ahead forecasts, energy markets
    JEL: C14 C53 G17
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:fau:wpaper:wp2014_30&r=ets
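    A rough sketch of the kind of exercise the abstract describes (illustrative only: the simulated volatility series, the HAR-style lag features, the network size and the horizon are assumptions, not the paper's specification): a small feed-forward network is fed daily, weekly and monthly realized-volatility averages and trained to predict volatility several steps ahead.

      import numpy as np
      from sklearn.neural_network import MLPRegressor

      # Toy persistent, positive "realized volatility" series (not market data).
      rng = np.random.default_rng(1)
      T, h = 2000, 5                                  # sample size, forecast horizon
      rv = np.empty(T)
      rv[0] = 1.0
      for t in range(1, T):
          rv[t] = 0.1 + 0.9 * rv[t - 1] + 0.1 * rng.standard_normal() ** 2

      def har_features(x, t):
          # daily value plus weekly (5-day) and monthly (22-day) averages up to time t
          return [x[t], x[t - 4:t + 1].mean(), x[t - 21:t + 1].mean()]

      X = np.array([har_features(rv, t) for t in range(21, T - h)])
      y = rv[21 + h:T]                                # target: volatility h steps ahead

      split = int(0.8 * len(y))
      nn = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      nn.fit(X[:split], y[:split])
      rmse = np.sqrt(np.mean((nn.predict(X[split:]) - y[split:]) ** 2))
      print(f"out-of-sample RMSE at horizon {h}: {rmse:.4f}")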
  3. By: Warne, Anders; Coenen, Günter; Christoffel, Kai
    Abstract: The predictive likelihood is of particular relevance in a Bayesian setting when the purpose is to rank models in a forecast comparison exercise. This paper discusses how the predictive likelihood can be estimated for any subset of the observable variables in linear Gaussian state-space models with Bayesian methods, and proposes to utilize a missing-observations-consistent Kalman filter in the process of achieving this objective. As an empirical application, we analyze euro area data and compare the density forecast performance of a DSGE model to DSGE-VARs and reduced-form linear Gaussian models.
    Keywords: Bayesian inference, density forecasting, Kalman filter, missing data, Monte Carlo integration, predictive likelihood
    JEL: C11 C32 C52 C53 E37
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:zbw:cfswop:478&r=ets
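    A bare-bones sketch of the filtering ingredient the abstract refers to (the state-space matrices and the data below are made-up placeholders, not the paper's DSGE setup): at each step the filter drops missing entries of the observation vector and accumulates the log predictive density of the entries that are actually observed, which is what allows the predictive likelihood to be computed for any subset of the observables.

      import numpy as np

      def log_predictive_likelihood(y, A, C, Q, R, x0, P0):
          # y has shape (T, n); NaN entries are treated as missing observations.
          x, P, loglik = x0, P0, 0.0
          for yt in y:
              x = A @ x                              # state prediction
              P = A @ P @ A.T + Q
              obs = ~np.isnan(yt)                    # entries observed at this date
              if obs.any():
                  Ct, Rt = C[obs], R[np.ix_(obs, obs)]
                  v = yt[obs] - Ct @ x               # prediction error, observed part only
                  S = Ct @ P @ Ct.T + Rt             # its covariance
                  loglik += -0.5 * (obs.sum() * np.log(2 * np.pi)
                                    + np.linalg.slogdet(S)[1]
                                    + v @ np.linalg.solve(S, v))
                  K = P @ Ct.T @ np.linalg.inv(S)    # Kalman gain
                  x = x + K @ v
                  P = P - K @ S @ K.T
          return loglik

      # Toy example: 2 states, 3 observables, the second observable never observed.
      rng = np.random.default_rng(2)
      A = np.array([[0.9, 0.1], [0.0, 0.8]]); Q = 0.1 * np.eye(2)
      C = rng.standard_normal((3, 2));        R = 0.2 * np.eye(3)
      y = rng.standard_normal((50, 3)); y[:, 1] = np.nan   # marginalize out variable 2
      print(log_predictive_likelihood(y, A, C, Q, R, np.zeros(2), np.eye(2)))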
  4. By: Laurent Callot (VU University Amsterdam, the Tinbergen Institute and CREATES); Johannes Tang Kristensen (University of Southern Denmark and CREATES)
    Abstract: This paper proposes a parsimoniously time varying parameter vector autoregressive model (with exogenous variables, VARX) and studies the properties of the Lasso and adaptive Lasso as estimators of this model. The parameters of the model are assumed to follow parsimonious random walks, where parsimony stems from the assumption that increments to the parameters have a non-zero probability of being exactly equal to zero. By varying the degree of parsimony our model can accommodate constant parameters, an unknown number of structural breaks, or parameters with a high degree of variation. We characterize the finite sample properties of the Lasso by deriving upper bounds on the estimation and prediction errors that are valid with high probability, and asymptotically we show that these bounds tend to zero with probability tending to one if the number of non-zero increments grows slower than the square root of T. By simulation experiments we investigate the properties of the Lasso and the adaptive Lasso in settings where the parameters are stable, experience structural breaks, or follow a parsimonious random walk. We use our model to investigate the monetary policy response to inflation and business cycle fluctuations in the US by estimating a parsimoniously time varying parameter Taylor rule. We document substantial changes in the policy response of the Fed in the 1980s and since 2008.
    Keywords: Parsimony, time varying parameters, VAR, structural break, Lasso
    JEL: C01 C13 C32 E52
    Date: 2014–11–04
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-41&r=ets
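    A stylized sketch of the parsimonious time-variation idea in the simplest scalar case (the data, the penalty level and the univariate setup are assumptions; the paper works with full VARX systems and derives the theory formally): writing the coefficient path as the cumulative sum of its increments turns a single structural break into a sparse increment vector that the Lasso can pick out.

      import numpy as np
      from sklearn.linear_model import Lasso

      # Scalar regression y_t = beta_t * x_t + e_t with one break in beta_t (toy data).
      rng = np.random.default_rng(3)
      T = 200
      x = rng.standard_normal(T)
      beta = np.where(np.arange(T) < 100, 0.5, 1.5)      # break at t = 100
      y = beta * x + 0.2 * rng.standard_normal(T)

      # Column s of Z equals x_t for t >= s and 0 before, so Z @ theta = x * cumsum(theta):
      # the regression coefficients theta are the parameter increments, which are sparse.
      Z = np.tril(np.ones((T, T))) * x[:, None]

      fit = Lasso(alpha=0.02, fit_intercept=False, max_iter=50_000).fit(Z, y)
      beta_hat = np.cumsum(fit.coef_)                    # recovered coefficient path

      print("non-zero increments at t =", np.flatnonzero(fit.coef_))
      print("beta_hat around the break:", beta_hat[95:105].round(2))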

This nep-ets issue is ©2014 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.