Econometric Time Series |
By: | Torben G. Andersen (Kellogg School of Management, Northwestern University, Evanston, IL; NBER, Cambridge, MA; and CREATES, Aarhus, Denmark); Luca Benzoni (Federal Reserve Bank of Chicago, Chicago, Illinois, USA.) |
Abstract: | We give an overview of a broad class of models designed to capture stochastic volatility in financial markets, with illustrations of the scope of application of these models to practical finance problems. In a broad sense, this model class includes GARCH, but we focus on a narrower set of specifications in which volatility follows its own random process and is therefore a latent factor. These stochastic volatility specifications fit naturally in the continuous-time finance paradigm, and therefore serve as a prominent tool for a wide range of pricing and hedging applications. Moreover, the continuous-time paradigm of financial economics is naturally linked with the theory of volatility modeling and forecasting, and in particular with the practice of constructing ex-post volatility measures from high-frequency intraday data (realized volatility). One drawback is that in this setting volatility is not measurable with respect to observable information, and this feature complicates estimation and inference. Further, the presence of an additional state variable, volatility, renders the model less tractable from an analytic perspective. New estimation methods, combined with model restrictions that allow for closed-form solutions, make it possible to address these challenges while keeping the model consistent with the main properties of the data. |
Keywords: | Stochastic Volatility, Realized Volatility, Implied Volatility, Options, Volatility Smirk, Volatility Smile, Dynamic Term Structure Models, Affine Models |
JEL: | E43 G12 |
Date: | 2010–02–25 |
URL: | http://d.repec.org/n?u=RePEc:aah:create:2010-10&r=ets |
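As a hedged illustration of the realized-volatility measure discussed in the abstract above, the following Python sketch simulates a generic square-root stochastic-volatility process by Euler discretization and sums squared intraday returns into a daily realized variance. The specification and all parameter values (kappa, theta, xi, mu, the 5-minute sampling grid) are illustrative assumptions, not the authors' model or calibration.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter values, chosen only for illustration.
kappa, theta, xi, mu = 5.0, 0.04, 0.3, 0.05   # mean reversion, long-run variance, vol-of-vol, drift
n_days, n_intraday = 250, 78                  # e.g. 5-minute returns over a 6.5-hour session
dt = 1.0 / (252 * n_intraday)                 # time step in years

v = theta                                     # latent spot variance (annualised)
rv = np.zeros(n_days)                         # daily realized variance
for d in range(n_days):
    for _ in range(n_intraday):
        dw_s, dw_v = rng.normal(scale=np.sqrt(dt), size=2)
        r = (mu - 0.5 * v) * dt + np.sqrt(v) * dw_s                        # intraday log-return
        v = abs(v + kappa * (theta - v) * dt + xi * np.sqrt(v) * dw_v)     # reflect to keep variance positive
        rv[d] += r ** 2                                                    # accumulate squared returns

print("mean annualised realized volatility: %.3f" % np.sqrt(252 * rv.mean()))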
By: | Francesco Ravazzolo (Norges Bank (Central Bank of Norway)); Shaun P. Vahey |
Abstract: | We propose a methodology for producing forecast densities for economic aggregates based on disaggregate evidence. Our ensemble predictive methodology utilizes a linear mixture of experts framework to combine the forecast densities from potentially many component models. Each component represents the univariate dynamic process followed by a single disaggregate variable. The ensemble produced from these components approximates the many unknown relationships between the disaggregates and the aggregate by using time-varying weights on the component forecast densities. In our application, we use the disaggregate ensemble approach to forecast US Personal Consumption Expenditure inflation from 1997Q2 to 2008Q1. Our ensemble combining the evidence from 11 disaggregate series outperforms an aggregate autoregressive benchmark, and an aggregate time-varying parameter specification in density forecasting. |
Keywords: | Ensemble forecasting, disaggregates |
JEL: | C11 C32 C53 E37 E52 |
Date: | 2010–03–05 |
URL: | http://d.repec.org/n?u=RePEc:bno:worpap:2010_02&r=ets |
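The linear opinion pool below is a minimal, hedged sketch of combining component forecast densities with time-varying weights driven by past log scores, in the spirit of the ensemble described in the entry above. The two Gaussian components, the toy data, and the recursive log-score weighting rule are all assumptions for illustration; they do not reproduce the authors' ensemble or its components.

import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
T = 40
y = 2.0 + rng.normal(size=T)                           # toy target series

# Two hypothetical component predictive densities, (mean, std) per period.
comps = [(np.full(T, 2.0), np.full(T, 1.0)),           # well-centred component
         (np.full(T, 0.0), np.full(T, 2.0))]           # poorly-centred component

log_score = np.zeros(len(comps))                       # cumulative log scores per component
for t in range(T):
    w = np.exp(log_score - log_score.max())
    w /= w.sum()                                       # time-varying combination weights
    # Combined predictive density evaluated at the realisation (used for density scoring).
    mixture_pdf = sum(wi * norm.pdf(y[t], m[t], s[t]) for wi, (m, s) in zip(w, comps))
    for i, (m, s) in enumerate(comps):                 # update scores after observing y[t]
        log_score[i] += norm.logpdf(y[t], m[t], s[t])

print(f"final weights: {np.round(w, 3)}, last combined density value: {mixture_pdf:.3f}")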
By: | Valter Di Giacinto (Bank of Italy) |
Abstract: | Although it provides a potentially useful analytical tool, allowing for the joint modeling of dynamic interdependencies within a group of connected areas, the VAR approach had until recently received little attention in regional science and spatial economic analysis. This paper aims to contribute to this field by dealing with the issues of parameter identification and estimation and of structural impulse response analysis. In particular, the adaptation of the recursive identification scheme (one of the most common approaches in the time-series VAR literature) to a space-time environment is discussed. Parameter estimation is subsequently based on the Full Information Maximum Likelihood (FIML) method, a standard approach in structural VAR analysis. As a convenient tool to summarize the information conveyed by regional dynamic multipliers, with a specific emphasis on the scope of spatial spillover effects, a synthetic space-time impulse response function (STIR) is introduced, portraying average effects as a function of displacement in time and space. Asymptotic confidence bands for the STIR estimates are also derived from bootstrap estimates of the standard errors. Finally, to provide a basic illustration of the methodology, the paper presents an application of a simple bivariate fiscal model fitted to data for Italian NUTS 2 regions. |
Keywords: | structural VAR model, spatial econometrics, identification, space-time impulse response analysis |
JEL: | C32 C33 R10 |
Date: | 2010–02 |
URL: | http://d.repec.org/n?u=RePEc:bdi:wptemi:td_746_10&r=ets |
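As a rough, hedged illustration of the recursive identification and impulse-response machinery that the paper above adapts to a space-time setting, the following sketch fits an ordinary bivariate VAR to simulated data and computes Cholesky-orthogonalised impulse responses with statsmodels. It does not implement the spatial structure, the FIML estimation, or the STIR; the coefficient matrix and sample size are toy assumptions.

import numpy as np
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(6)
T = 300
A = np.array([[0.5, 0.1],
              [0.3, 0.4]])                    # toy VAR(1) coefficient matrix
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A @ y[t - 1] + rng.normal(size=2)  # simulate the two series

res = VAR(y).fit(1)                           # fit a VAR(1) with a constant
irf = res.irf(10)                             # impulse responses up to horizon 10
# Cholesky-orthogonalised response of the second variable to a shock in the first:
print(irf.orth_irfs[:, 1, 0].round(3))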
By: | David I. Harvey; Stephen J. Leybourne; A. M. Robert Taylor |
Abstract: | In this paper we propose tests for the null hypothesis that a time series process displays a constant level against the alternative that it displays (possibly) multiple changes in level. Our proposed tests are based on functions of appropriately standardized sequences of the differences between sub-sample mean estimates from the series under investigation. The tests we propose differ notably from extant tests for level breaks in the literature in that they are designed to be robust as to whether the process admits an autoregressive unit root (the data are I(1)) or stable autoregressive roots (the data are I(0)). We derive the asymptotic null distributions of our proposed tests, along with representations for their asymptotic local power functions against Pitman drift alternatives under both I(0) and I(1) environments. Associated estimators of the level break fractions are also discussed. We initially outline our procedure through the case of non-trending series, but our analysis is subsequently extended to allow for series which display an underlying linear trend, in addition to possible level breaks. Monte Carlo simulation results are presented which suggest that the proposed tests perform well in small samples, showing good size control under the null, regardless of the order of integration of the data, and displaying very decent power when level breaks occur. |
Keywords: | Level breaks; unit root; moving means; long run variance estimation; robust tests; breakpoint estimation |
Date: | 2010–02 |
URL: | http://d.repec.org/n?u=RePEc:not:notgts:10/01&r=ets |
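As a loose illustration of the idea of comparing sub-sample means described in the abstract above, the sketch below scans standardized differences between pre- and post-break means of a toy series to locate a single level shift. The naive i.i.d. standardization and the 15% trimming are assumptions for exposition only; they are not the authors' statistics, which are constructed to be robust to I(0) and I(1) behaviour.

import numpy as np

rng = np.random.default_rng(2)
T = 200
y = np.r_[rng.normal(0.0, 1.0, 100), rng.normal(1.5, 1.0, 100)]   # toy series with one level shift

trim = int(0.15 * T)
stats = np.full(T, np.nan)
for k in range(trim, T - trim):
    d = y[k:].mean() - y[:k].mean()                   # difference of sub-sample means
    se = np.sqrt(y[:k].var(ddof=1) / k + y[k:].var(ddof=1) / (T - k))
    stats[k] = d / se                                 # naive standardisation (i.i.d. case only)

k_hat = np.nanargmax(np.abs(stats))
print("estimated break point:", k_hat, "max |statistic|:", round(abs(stats[k_hat]), 2))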
By: | Stephan Smeekes; A. M. Robert Taylor |
Abstract: | We provide a joint treatment of three major issues that surround testing for a unit root in practice: uncertainty as to whether or not a linear deterministic trend is present in the data, uncertainty as to whether the initial condition of the process is (asymptotically) negligible or not, and the possible presence of nonstationary volatility in the data. Harvey, Leybourne and Taylor (2010, Journal of Econometrics, forthcoming) propose decision rules based on a four-way union of rejections of QD and OLS detrended tests, both with and without allowing for a linear trend, to deal with the first two problems. However, in the presence of nonstationary volatility these test statistics have limit distributions which depend on the form of the volatility process, making tests based on the standard asymptotic critical values invalid. We construct bootstrap versions of the four-way union of rejections test, which, by employing the wild bootstrap, are shown to be asymptotically valid in the presence of nonstationary volatility. These bootstrap union tests therefore allow for a joint treatment of all three of the aforementioned problems. |
Keywords: | Unit root; local trend; initial condition; asymptotic power; union of rejections decision rule; nonstationary volatility; wild bootstrap |
Date: | 2010–02 |
URL: | http://d.repec.org/n?u=RePEc:not:notgts:10/03&r=ets |
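The sketch below illustrates, under strong simplifying assumptions, how a wild bootstrap can restore valid inference for a unit root test under nonstationary volatility: it bootstraps a single demeaned Dickey-Fuller t-ratio by resampling first differences with Rademacher weights. It is not the four-way union-of-rejections procedure studied above, and the lag-free DF regression and the one-time volatility shift are assumptions made for brevity.

import numpy as np

rng = np.random.default_rng(3)
T = 200
vol = np.where(np.arange(T) < T // 2, 1.0, 3.0)       # one-time upward volatility shift
y = np.cumsum(vol * rng.normal(size=T))               # unit-root null with nonstationary volatility

def df_stat(x):
    x = x - x.mean()                                  # OLS demeaning
    dx, lx = np.diff(x), x[:-1]
    rho = (lx @ dx) / (lx @ lx)                       # Dickey-Fuller regression coefficient
    resid = dx - rho * lx
    se = np.sqrt(resid.var(ddof=1) / (lx @ lx))
    return rho / se, dx                               # t-ratio and first differences

stat, dy = df_stat(y)
boot = np.empty(999)
for b in range(boot.size):
    y_b = np.cumsum(dy * rng.choice([-1.0, 1.0], size=dy.size))   # wild bootstrap sample (Rademacher weights)
    boot[b], _ = df_stat(y_b)
p_value = (boot <= stat).mean()                       # left-tailed test
print(f"DF t-ratio = {stat:.2f}, wild bootstrap p-value = {p_value:.3f}")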
By: | David I. Harvey; Stephen J. Leybourne; A. M. Robert Taylor |
Abstract: | In this paper we analyse evidence for level breaks in the price series comprising the NASDAQ-100 index over the period 2001-2007. We make use of a recently developed methodology that allows robust inference regarding the presence of breaks to be drawn irrespective of whether or not a unit root is present in the data, and whether the underlying innovations are normally or non-normally distributed. We find evidence for one or more level breaks in almost half the series considered, suggesting that appropriate allowance for breaks should be made when modelling or forecasting using these data. |
Keywords: | Level breaks; unit root; non-normality; stock prices |
Date: | 2010–02 |
URL: | http://d.repec.org/n?u=RePEc:not:notgts:10/02&r=ets |
By: | Elmar Mertens |
Abstract: | No, not really, since spectral estimators suffer from small-sample and misspecification biases just as VARs do; spectral estimators are no panacea for implementing long-run restrictions. In addition, when combining VAR coefficients with non-parametric estimates of the spectral density, care needs to be taken to account consistently for the information embedded in the non-parametric estimates about serial correlation in the VAR residuals. This paper uses a spectral factorization to ensure a correct representation of the data's variance, but this cannot overcome the fundamental problems of estimating the long-run dynamics of macroeconomic data in samples of typical length. |
Date: | 2010 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedgfe:2010-09&r=ets |
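To make the objects in the abstract above concrete, here is a minimal univariate sketch (an assumption-laden simplification, not the paper's multivariate spectral factorization) comparing an AR(1)-implied long-run variance with a Bartlett-kernel (Newey-West type) nonparametric estimate of the spectrum at frequency zero. The AR(1) data-generating process, sample size, and bandwidth rule are all illustrative choices.

import numpy as np

rng = np.random.default_rng(4)
T, phi = 200, 0.7
e = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + e[t]                      # persistent AR(1) series
y = y - y.mean()

# Parametric: fit AR(1) by OLS; long-run variance = sigma^2 / (1 - phi_hat)^2.
phi_hat = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])
sig2 = np.mean((y[1:] - phi_hat * y[:-1]) ** 2)
lrv_ar = sig2 / (1.0 - phi_hat) ** 2

# Nonparametric: Bartlett-kernel (Newey-West) estimate with a rule-of-thumb bandwidth.
M = int(4 * (T / 100) ** (2 / 9))
gammas = [y[:T - j] @ y[j:] / T for j in range(M + 1)]
lrv_np = gammas[0] + 2 * sum((1 - j / (M + 1)) * gammas[j] for j in range(1, M + 1))

print(f"AR-implied long-run variance: {lrv_ar:.2f}, Bartlett estimate: {lrv_np:.2f}")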
By: | Korobilis, Dimitris |
Abstract: | This paper develops methods for automatic selection of variables in forecasting Bayesian vector autoregressions (VARs) using the Gibbs sampler. In particular, I provide computationally efficient algorithms for stochastic variable selection in generic (linear and nonlinear) VARs. The performance of the proposed variable selection method is assessed in a small Monte Carlo experiment, and in forecasting four UK macroeconomic series using time-varying parameter vector autoregressions (TVP-VARs). Restricted models consistently improve upon their unrestricted counterparts in forecasting, demonstrating the merits of variable selection in identifying parsimonious models. |
Keywords: | Forecasting; variable selection; time-varying parameters; Bayesian |
JEL: | C32 C53 C52 E37 C11 E47 |
Date: | 2009–12 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:21124&r=ets |
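As a hedged illustration of stochastic variable selection via the Gibbs sampler, the sketch below runs a textbook spike-and-slab (SSVS-type) sampler in a single linear regression with a fixed error variance. The prior variances tau0 and tau1, the 0.5 prior inclusion probability, and the toy data are all assumptions; the paper's algorithms for (TVP-)VARs are not reproduced here.

import numpy as np

rng = np.random.default_rng(5)
n, k = 200, 5
X = rng.normal(size=(n, k))
beta_true = np.array([1.5, 0.0, -1.0, 0.0, 0.0])
y = X @ beta_true + rng.normal(size=n)

tau0, tau1, sigma2 = 0.01, 10.0, 1.0                  # spike/slab prior variances, fixed error variance
gamma = np.ones(k, dtype=int)                         # inclusion indicators
draws = np.zeros((2000, k), dtype=int)

for it in range(draws.shape[0]):
    # Draw beta | gamma from its conjugate normal posterior (prior variance tau1 or tau0 per coefficient).
    D_inv = np.diag(np.where(gamma == 1, 1.0 / tau1, 1.0 / tau0))
    V = np.linalg.inv(X.T @ X / sigma2 + D_inv)
    m = V @ (X.T @ y / sigma2)
    beta = rng.multivariate_normal(m, V)
    # Draw each gamma_j | beta_j from its Bernoulli full conditional (prior inclusion probability 0.5).
    for j in range(k):
        l1 = np.exp(-0.5 * beta[j] ** 2 / tau1) / np.sqrt(tau1)
        l0 = np.exp(-0.5 * beta[j] ** 2 / tau0) / np.sqrt(tau0)
        gamma[j] = rng.random() < l1 / (l1 + l0)
    draws[it] = gamma

print("posterior inclusion probabilities:", draws[500:].mean(axis=0).round(2))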
By: | Carey, Alexander |
Abstract: | This paper presents time-series of higher-order volatilities for the S&P 500 and EURUSD. We use a 3-volatility model which accounts for non-normal skewness and kurtosis. The volatilities control the level, slope and curvature of the Black-Scholes implied volatility smile; accordingly we term them "base", "skew" and "smile" volatility. We define instantaneous skewness and kurtosis as simple ratios of the volatilities, and show that when these metrics are held constant, the model is relative sticky-delta. For the S&P 500 in 2008, skew and smile volatility are highly correlated with base volatility. Instantaneous skewness and kurtosis are remarkably stable, including over the market dislocation of the last four months of the year. Daily changes in all three volatilities are correlated with daily returns. For EURUSD in 2006, base and smile volatility are closely correlated, but in contrast to the equity case, skew volatility is independent and changes sign. This change in sign appears to provide advance warning of the two major market moves of the year. However, daily changes in the volatilities are uncorrelated with daily returns. |
Keywords: | higher-order volatility; time series; S&P 500; EURUSD |
JEL: | G13 |
Date: | 2010–01–12 |
URL: | http://d.repec.org/n?u=RePEc:pra:mprapa:21087&r=ets |
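For readers unfamiliar with the implied volatility smile referenced in the abstract above, the sketch below backs out Black-Scholes implied volatilities across strikes by root-finding. The option quotes, strikes, maturity, and rate are hypothetical numbers chosen to trace a smirk-shaped pattern; the code is the standard Black-Scholes inversion only and has no connection to the authors' 3-volatility model.

import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def bs_call(S, K, T, r, sigma):
    # Black-Scholes price of a European call.
    d1 = (np.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

def implied_vol(price, S, K, T, r):
    # Solve for the volatility that reproduces the quoted price.
    return brentq(lambda s: bs_call(S, K, T, r, s) - price, 1e-4, 5.0)

S, T, r = 100.0, 0.25, 0.02
# Hypothetical market call prices at several strikes, chosen to produce a smirk.
quotes = {80: 21.5, 90: 12.6, 100: 5.2, 110: 1.4, 120: 0.35}
for K, price in quotes.items():
    print(f"K={K:3d}  implied vol = {implied_vol(price, S, K, T, r):.3f}")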