nep-ets New Economics Papers
on Econometric Time Series
Issue of 2012‒01‒03
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. Coherent Model-Free Implied Volatility: A Corridor Fix for High-Frequency VIX By Torben G. Andersen; Oleg Bondarenko; Maria T. Gonzalez-Perez
  2. Are Bayesian Fan Charts Useful for Central Banks? Uncertainty, Forecasting, and Financial Stability Stress Tests By Michal Franta; Jozef Barunik; Roman Horvath; Katerina Smidkova
  3. Second-order moments of frequency asymmetric cycles By Miguel Artiach
  4. Marginal Likelihood for Markov-Switching and Change-Point Garch Models By Luc Bauwens; Arnaud Dufays; Jeroen Rombouts
  5. Lag Length Selection for Unit Root Tests in the Presence of Nonstationary Volatility By Giuseppe Cavaliere; Peter C.B. Phillips; Stephan Smeekes; A.M. Robert Taylor
  6. Optimal Forecasts in the Presence of Structural Breaks By M Hashem Pesaran; Andreas Pick; Mikhail Pranovich
  7. An application of the method of moments to volatility estimation using daily high, low, opening and closing prices By Cristin Buescu; Michael Taksar; Fatoumata J. Koné
  8. Spurious trend switching phenomena in financial markets By Vladimir Filimonov; Didier Sornette
  9. Testing for Trend in the Presence of Autoregressive Error: A Comment By Pierre Perron; Tomoyoshi Yabu

  1. By: Torben G. Andersen (Kellogg School of Management; Northwestern University and CREATES); Oleg Bondarenko (Department of Finance (MC 168), University of Illinois at Chicago); Maria T. Gonzalez-Perez (Colegio Universitario de Estudios Financieros (CUNEF))
    Abstract: The VIX index is computed as a weighted average of SPX option prices over a range of strikes according to specific rules regarding market liquidity. It is explicitly designed to provide a model-free option-implied volatility measure. Using tick-by-tick observations on the underlying options, we document a substantial time variation in the coverage which the stipulated strike range affords for the distribution of future S&P 500 index prices. This produces idiosyncratic biases in the measure, distorting the time series properties of VIX. We introduce a novel “Corridor Implied Volatility” index (CX) computed from a strike range covering an “economically invariant” proportion of the future S&P 500 index values. We find the CX measure superior in filtering out noise and eliminating artificial jumps, thus providing a markedly different characterization of the high-frequency volatility dynamics. Moreover, the VIX measure is particularly unreliable during periods of market stress, exactly when a “fear gauge” is most valuable.
    Keywords: VIX, Model-Free Implied Volatility, Corridor Implied Volatility, Time Series Coherence
    JEL: G13 C58
    Date: 2011–11–30
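The corridor idea can be sketched numerically: compute the usual VIX-style discretized model-free variance, but sum only over strikes inside a corridor. The sketch below prices synthetic out-of-the-money quotes from Black-Scholes with a flat 20% volatility and uses a fixed strike corridor for simplicity; the paper's CX instead fixes an economically invariant quantile range of the future index distribution, so the strikes, maturity, and corridor bounds here are all illustrative.

```python
import numpy as np
from math import log, sqrt, erf

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_price(S, K, T, sigma, call=True):
    # Black-Scholes price, zero rates and dividends
    d1 = (log(S / K) + 0.5 * sigma ** 2 * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    if call:
        return S * norm_cdf(d1) - K * norm_cdf(d2)
    return K * norm_cdf(-d2) - S * norm_cdf(-d1)

def corridor_variance(strikes, quotes, F, T, lo, hi):
    # VIX-style discretized model-free variance, summed only over [lo, hi]
    mask = (strikes >= lo) & (strikes <= hi)
    K, Q = strikes[mask], quotes[mask]
    dK = np.gradient(K)                 # strike spacing
    k0 = K[K <= F].max()                # first strike at or below the forward
    return (2.0 / T) * np.sum(dK / K ** 2 * Q) - (1.0 / T) * (F / k0 - 1.0) ** 2

S, T, sigma = 100.0, 30.0 / 365.0, 0.20
strikes = np.arange(60.0, 141.0, 1.0)
# out-of-the-money quotes: puts below the forward, calls above
quotes = np.array([bs_price(S, K, T, sigma, call=(K > S)) for K in strikes])

full = corridor_variance(strikes, quotes, S, T, 60.0, 140.0)
cx = corridor_variance(strikes, quotes, S, T, 80.0, 120.0)   # fixed corridor
print(sqrt(full), sqrt(cx))  # annualized implied vols, near the 20% input
```

With flat Black-Scholes quotes the corridor variance is necessarily no larger than the full-range variance, since it drops nonnegative strike contributions; the paper's point is that in real data the *effective* strike range of the VIX moves around, while the corridor is held economically fixed.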
  2. By: Michal Franta; Jozef Barunik; Roman Horvath; Katerina Smidkova
    Abstract: This paper shows how fan charts generated from Bayesian vector autoregression (BVAR) models can be useful for assessing 1) the forecasting accuracy of central banks’ prediction models and 2) the credibility of stress tests carried out to evaluate financial stability. Using unique data from the Czech National Bank (CNB), we compare our BVAR fan charts for inflation, GDP growth, interest rate and the exchange rate to those of the CNB, which are based on past forecasting errors. Our results suggest that in terms of the Kullback-Leibler Information Criterion, BVAR fan charts typically do not outperform those of the CNB, providing a useful cross-check of their accuracy. However, we show how BVAR fan charts can rigorously deal with the non-negativity constraint on the nominal interest rate and usefully complement the official fan charts. Finally, we put forward how BVAR fan charts can be useful for assessing financial stability and propose a simple method for evaluating whether the assumptions of banks’ stress tests about the macroeconomic outlook are sufficiently adverse.
    Keywords: Bayesian vector autoregression, fan chart, inflation targeting, stress tests, uncertainty.
    JEL: E52 E58
    Date: 2011–11
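The mechanics can be sketched with a univariate stand-in: simulate the predictive distribution of a mean-reverting "interest rate", truncate the draws at zero to impose the non-negativity constraint, and read off percentile bands as the fan chart (all parameters below are illustrative; the paper works with a multivariate BVAR):

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) "interest rate" near the zero lower bound (illustrative parameters)
phi, mu, sigma = 0.9, 0.5, 0.4
r0, horizon, n_draws = 0.3, 12, 20_000

paths = np.empty((n_draws, horizon))
r = np.full(n_draws, r0)
for h in range(horizon):
    r = mu * (1 - phi) + phi * r + sigma * rng.standard_normal(n_draws)
    r = np.maximum(r, 0.0)  # non-negativity constraint: truncate draws at zero
    paths[:, h] = r

# fan chart bands: central 30%/60%/90% predictive intervals at each horizon
bands = np.percentile(paths, [5, 20, 35, 50, 65, 80, 95], axis=0)
print(bands[:, -1])  # band edges at the final horizon, all >= 0
```

Truncating the simulated draws handles the constraint rigorously at the level of the predictive density itself, which is the advantage the abstract highlights over fan charts built from symmetric past-error bands.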
  3. By: Miguel Artiach (Dpto. Fundamentos del Análisis Económico)
    Abstract: Second-order moments, as even functions in time, are conventionally regarded as containing no information about the time irreversible nature of a sequence and therefore about its frequency asymmetry. However, this paper shows that the frequency asymmetry produces a clearly distinct behaviour in second-order moments that can be observed in both the time domain and the frequency domain. In addition, a frequency domain method of estimation of the differing lengths of the recessionary and expansionary stages of a cycle is proposed and its finite sample performance evaluated. Finally, the asymmetric patterns in the waves of the US unemployment rate and in the sunspot index are analysed.
    Keywords: frequency asymmetry, time irreversibility, periodogram, correlogram, business cycle
    JEL: C13 C22 E27
    Date: 2011–12
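The effect on second-order moments can be seen in a toy example: a triangular cycle with equally long linear upswings and downswings has no even harmonics, while one with a 15-period expansion and a 5-period recession leaves clear power at the second harmonic of the periodogram (periods and amplitudes below are illustrative, not the paper's estimator):

```python
import numpy as np

period, n_cycles = 20, 25
t = np.arange(period * n_cycles) % period

# symmetric cycle: 10 periods up, 10 down; asymmetric: 15 up, 5 down
sym = np.where(t <= 10, t, 20.0 - t)
asym = np.where(t < 15, t * (10.0 / 15.0), (20.0 - t) * (10.0 / 5.0))

def periodogram(x):
    x = x - x.mean()
    return np.abs(np.fft.rfft(x)) ** 2 / len(x)

fundamental = len(t) // period      # frequency bin of the base cycle
p_sym, p_asym = periodogram(sym), periodogram(asym)
# second-harmonic power: essentially zero for the symmetric cycle
print(p_sym[2 * fundamental], p_asym[2 * fundamental])
```

The second-harmonic power is a purely second-order statistic, yet it separates the two cycles, which is the observability point the abstract makes.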
  4. By: Luc Bauwens; Arnaud Dufays; Jeroen Rombouts
    Abstract: GARCH volatility models with fixed parameters are too restrictive for long time series due to breaks in the volatility process. Flexible alternatives are Markov-switching GARCH and change-point GARCH models. They require estimation by MCMC methods due to the path dependence problem. An unsolved issue is the computation of their marginal likelihood, which is essential for determining the number of regimes or change-points. We solve the problem by using particle MCMC, a technique proposed by Andrieu, Doucet, and Holenstein (2010). We examine the performance of this new method on simulated data, and we illustrate its use on several return series.
    JEL: C11 C15 C22 C58
    Date: 2011–11–01
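The particle-filter ingredient can be illustrated on a model simple enough to check exactly: for a two-regime Markov-switching Gaussian model (no GARCH path dependence), the bootstrap particle estimate of the log-likelihood can be compared against the exact Hamilton filter. This is a sketch of the building block, not the authors' particle MCMC for GARCH, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# two-regime model: y_t ~ N(0, sig[s_t]^2), s_t a Markov chain
P = np.array([[0.95, 0.05], [0.10, 0.90]])   # transition probabilities
sig = np.array([1.0, 3.0])                   # regime volatilities
pi0 = np.array([2.0 / 3.0, 1.0 / 3.0])       # stationary distribution of P
T = 200

# simulate data from the model
s, y = 0, np.empty(T)
for t in range(T):
    s = rng.choice(2, p=P[s])
    y[t] = sig[s] * rng.standard_normal()

def normpdf(x, scale):
    return np.exp(-0.5 * (x / scale) ** 2) / (scale * np.sqrt(2.0 * np.pi))

def hamilton_loglik(y):
    # exact log-likelihood by discrete-state (Hamilton) filtering
    pi, ll = pi0.copy(), 0.0
    for x in y:
        joint = (pi @ P) * normpdf(x, sig)   # predict states, weight by density
        ll += np.log(joint.sum())
        pi = joint / joint.sum()
    return ll

def particle_loglik(y, n=5000):
    # bootstrap particle filter: unbiased, exact only as n -> infinity
    states = (rng.random(n) < pi0[1]).astype(int)
    ll = 0.0
    for x in y:
        states = (rng.random(n) < P[states, 1]).astype(int)   # propagate
        w = normpdf(x, sig[states])                           # weight
        ll += np.log(w.mean())
        states = states[rng.choice(n, size=n, p=w / w.sum())] # resample
    return ll

exact, approx = hamilton_loglik(y), particle_loglik(y)
print(exact, approx)  # the two log-likelihoods should be close
```

In a switching GARCH model the conditional variance depends on the whole regime path, so no exact finite-state filter exists; the particle estimate of the likelihood is then the quantity that particle MCMC plugs into the posterior and marginal-likelihood computations.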
  5. By: Giuseppe Cavaliere; Peter C.B. Phillips; Stephan Smeekes; A.M. Robert Taylor (METEOR)
    Abstract: A number of recently published papers have focused on the problem of testing for a unit root in the case where the driving shocks may be unconditionally heteroskedastic. These papers have, however, assumed that the lag length in the unit root test regression is a deterministic function of the sample size, rather than data-determined, the latter being standard empirical practice. In this paper we investigate the finite sample impact of unconditional heteroskedasticity on conventional data-dependent methods of lag selection in augmented Dickey-Fuller type unit root test regressions and propose new lag selection criteria which allow for the presence of heteroskedasticity in the shocks. We show that standard lag selection methods show a tendency to over-fit the lag order under heteroskedasticity, which results in significant power losses in the (wild bootstrap implementation of the) augmented Dickey-Fuller tests under the alternative. The new lag selection criteria we propose are shown to avoid this problem yet deliver unit root tests with almost identical finite sample size and power properties as the corresponding tests based on conventional lag selection methods when the shocks are homoskedastic.
    Keywords: econometrics
    Date: 2011
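The conventional practice the paper examines can be sketched as an AIC-chosen lag order in an augmented Dickey-Fuller regression, computed over a common sample so the criterion values are comparable. This is a sketch of the standard rule, not the heteroskedasticity-robust criteria the paper proposes; the mid-sample volatility break in the simulated series is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def adf_aic_lag(y, kmax=8):
    # AIC lag choice for an ADF regression of dy_t on a constant,
    # y_{t-1} and k lagged differences, over a common estimation sample
    dy = np.diff(y)
    n = len(dy)
    best = None
    for k in range(kmax + 1):
        X = np.column_stack(
            [np.ones(n - kmax), y[kmax:n]]
            + [dy[kmax - j:n - j] for j in range(1, k + 1)]
        )
        beta, *_ = np.linalg.lstsq(X, dy[kmax:], rcond=None)
        resid = dy[kmax:] - X @ beta
        nobs = n - kmax
        aic = nobs * np.log(resid @ resid / nobs) + 2.0 * (k + 2)
        if best is None or aic < best[0]:
            best = (aic, k)
    return best[1]

# driftless unit-root series whose shock volatility doubles mid-sample
T = 400
shocks = rng.standard_normal(T) * np.where(np.arange(T) < T // 2, 1.0, 2.0)
y = np.cumsum(shocks)
k_hat = adf_aic_lag(y)
print(k_hat)
```

The true dynamics here need no augmentation lags at all; the paper's finding is that under such unconditional heteroskedasticity rules like this one tend, on average, to pick too many.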
  6. By: M Hashem Pesaran; Andreas Pick; Mikhail Pranovich
    Abstract: This paper considers the problem of forecasting under continuous and discrete structural breaks and proposes weighting observations to obtain optimal forecasts in the MSFE sense. We derive optimal weights for continuous and discrete break processes. Under continuous breaks, our approach recovers exponential smoothing weights. Under discrete breaks, we provide analytical expressions for the weights in models with a single regressor and asymptotically for larger models. It is shown that in these cases the value of the optimal weight is the same across observations within a given regime and differs only across regimes. In practice, where information on structural breaks is uncertain, a forecasting procedure based on robust weights is proposed. Monte Carlo experiments and an empirical application to the predictive power of the yield curve analyze the performance of our approach relative to other forecasting methods.
    Keywords: Forecasting; structural breaks; optimal weights; robust weights; exponential smoothing
    JEL: C22 C53
    Date: 2011–12
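In the continuous-break case the optimal weights collapse to exponential smoothing: observation t of T receives weight proportional to λ^(T−t), so the forecast leans on recent (post-break) data. A minimal sketch of that limiting case, with an illustrative discount λ and toy data:

```python
import numpy as np

def exp_smoothing_weights(T, lam):
    # weight on observation t (1-indexed) proportional to lam**(T - t),
    # normalized to sum to one over the available sample
    w = lam ** np.arange(T - 1, -1, -1).astype(float)
    return w / w.sum()

def weighted_forecast(y, lam):
    # one-step forecast of the level as the weighted sample mean
    w = exp_smoothing_weights(len(y), lam)
    return w @ y

y = np.array([1.0, 1.2, 0.9, 1.1, 2.0, 2.1, 1.9])  # level shift mid-sample
print(weighted_forecast(y, lam=0.7))  # recent, post-break data dominate
```

An equal-weighted mean of this sample would be dragged toward the pre-break level; geometrically decaying weights discount the pre-break observations without requiring the break date to be known, which is the robustness the paper exploits.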
  7. By: Cristin Buescu; Michael Taksar; Fatoumata J. Koné
    Abstract: We use the expectation of the range of an arithmetic Brownian motion and the method of moments on the daily high, low, opening and closing prices to estimate the volatility of the stock price. The daily price jump at the opening is considered to be the result of the unobserved evolution of an after-hours virtual trading day. The annualized volatility is used to calculate Black-Scholes prices for European options, and a trading strategy is devised to profit when these prices differ markedly from the market prices.
    Date: 2011–12
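The first moment being exploited is the expected range of an arithmetic Brownian motion: for driftless log prices with annualized volatility σ observed over a day of length Δ years, E[ln H − ln L] = σ√(8Δ/π), which can be inverted on the sample mean range. The sketch below uses only highs and lows and assumes zero drift; the authors' estimator additionally uses opening and closing prices:

```python
import numpy as np

rng = np.random.default_rng(3)

def range_vol(high, low, dt):
    # E[range of BM over dt] = sigma * sqrt(8*dt/pi)  =>  invert the mean range
    mean_range = np.mean(np.log(high) - np.log(low))
    return mean_range / np.sqrt(8.0 * dt / np.pi)

# simulate a year of intraday geometric Brownian motion, sigma = 20%
sigma, days, steps = 0.20, 252, 390
dt = 1.0 / days
z = rng.standard_normal((days, steps)) * sigma * np.sqrt(dt / steps)
logpaths = np.cumsum(z, axis=1)          # daily log-price paths from log-price 0
high = np.exp(np.maximum(logpaths.max(axis=1), 0.0))
low = np.exp(np.minimum(logpaths.min(axis=1), 0.0))
est = range_vol(high, low, dt)
print(est)  # near 0.20, biased slightly down by discrete sampling of the extremes
```

Because the range uses the whole intraday path rather than just two prices, this moment is considerably more informative per day than a close-to-close squared return.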
  8. By: Vladimir Filimonov; Didier Sornette
    Abstract: The observation of power laws in the time to extrema of volatility, volume and intertrade times, from milliseconds to years, is shown to result straightforwardly from the selection of biased statistical subsets of realizations in otherwise featureless processes such as random walks. The bias stems from the selection of price peaks, which imposes a condition on the statistics of price changes and trade volumes that skews their distributions. For the intertrade times, the extrema and power laws result from the format of the transaction data.
    Date: 2011–12
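The selection mechanism is easy to reproduce: increments of a pure random walk are mean zero, yet the increment entering a window's maximum is positively biased once one conditions on that point being a peak. An illustration of the bias, not the paper's analysis:

```python
import numpy as np

rng = np.random.default_rng(4)

n_walks, length = 20_000, 50
steps = rng.standard_normal((n_walks, length))
walks = np.cumsum(steps, axis=1)

# select walks whose final point is the maximum of the window ("price peak")
peak_at_end = walks[:, -1] == walks.max(axis=1)
last_step_all = steps[:, -1].mean()
last_step_peak = steps[peak_at_end, -1].mean()
print(last_step_all, last_step_peak)  # ~0 unconditionally, clearly positive at peaks
```

Nothing in the process changes between the two averages; only the conditioning does, which is the sense in which the apparent "trend switching" statistics are spurious.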
  9. By: Pierre Perron (Department of Economics, Boston University); Tomoyoshi Yabu (Faculty of Business and Commerce, Keio University)
    Abstract: Roy, Falk and Fuller (2004) presented a procedure aimed at providing a test for the value of the slope of a trend function that has (nearly) controlled size in autoregressive models whether the noise component is stationary or has a unit root. In this note, we document errors in both their theoretical results and the simulations they reported. Once these are corrected for, their procedure delivers a test that has very liberal size in the case with a unit root so that the stated goal is not achieved. Interestingly, the mistakes in the code used to generate the simulated results (which is the basis for the evidence about the reliability of the method) are such that what they report is essentially equivalent to the size and power of the test proposed by Perron and Yabu (2009), which was shown to have the standard Normal distribution whether the noise is stationary or has a unit root.
    Date: 2011–10

This nep-ets issue is ©2012 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.