nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒10‒31
six papers chosen by
Yong Yin
SUNY at Buffalo

  1. Evaluation of Nonlinear time-series models for real-time business cycle analysis of the Euro area By Monica Billio; Laurent Ferrara; Dominique Guegan; Gian Luigi Mazzi
  2. The Weak Instrument Problem of the System GMM Estimator in Dynamic Panel Data Models By Maurice J.G. Bun; Frank Windmeijer
  3. Stochastic volatility By Torben G. Andersen; Luca Benzoni
  4. Nested forecast model comparisons: a new approach to testing equal accuracy By Todd E. Clark; Michael W. McCracken
  5. A blocking and regularization approach to high dimensional realized covariance estimation By Nikolaus Hautsch; Lada M. Kyj; Roel C.A. Oomen
  6. Nonparametric methods for volatility density estimation By Bert van Es; Peter Spreij; Harry van Zanten

  1. By: Monica Billio (Università Ca' Foscari di Venezia - Dipartimento di Scienze Economiche); Laurent Ferrara (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, DGEI-DAMEP - Banque de France); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Panthéon-Sorbonne - Paris I, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Gian Luigi Mazzi (Eurostat - Office Statistique des Communautés Européennes)
    Abstract: In this paper, we assess the ability of Markov-switching and threshold models to identify turning points of economic cycles. Using vintage data updated on a monthly basis, we compare the models' ability to detect ex post the occurrence of turning points of the classical business cycle, evaluate the stability over time of the signals they emit, and assess their ability to detect recession signals in real time. To this end, we have built a historical vintage database for the Euro area going back to 1970 for two monthly macroeconomic variables of major importance for the short-term economic outlook, namely the Industrial Production Index and the Unemployment Rate.
    Keywords: Business cycle, Euro zone, Markov switching model, SETAR model, unemployment, industrial production.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00423890_v1&r=ets
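The regime-identification machinery behind such Markov-switching models can be illustrated with a minimal Hamilton filter. The sketch below is not the authors' specification; the two-regime transition matrix, regime means, and all other values are illustrative. It simulates a series that alternates between an "expansion" and a "recession" regime and recovers filtered recession probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- simulate a two-regime series (regime 1 = "recession": lower mean) ---
T = 400
P = np.array([[0.95, 0.05],          # P[i, j] = Pr(s_t = j | s_{t-1} = i)
              [0.10, 0.90]])
mu, sigma = np.array([0.5, -1.0]), 1.0
s = np.zeros(T, dtype=int)
for t in range(1, T):
    s[t] = rng.choice(2, p=P[s[t - 1]])
y = mu[s] + sigma * rng.standard_normal(T)

# --- Hamilton filter: filtered regime probabilities given data up to t ---
def hamilton_filter(y, P, mu, sigma):
    probs = np.empty((len(y), 2))
    filt = np.array([0.5, 0.5])               # flat initial regime distribution
    for t, yt in enumerate(y):
        pred = P.T @ filt                     # one-step-ahead regime probabilities
        lik = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        filt = pred * lik / (pred * lik).sum()
        probs[t] = filt
    return probs

rec_prob = hamilton_filter(y, P, mu, sigma)[:, 1]
hit_rate = np.mean((rec_prob > 0.5) == (s == 1))
print(f"share of periods correctly classified: {hit_rate:.2f}")
```

Thresholding the filtered probability at 0.5 gives a simple turning-point signal of the kind the paper evaluates on vintage data.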
  2. By: Maurice J.G. Bun (University of Amsterdam); Frank Windmeijer (University of Bristol)
    Abstract: The system GMM estimator for dynamic panel data models combines moment conditions for the model in first differences with moment conditions for the model in levels. It has been shown to improve on the GMM estimator in the first-differenced model in terms of bias and root mean squared error. However, we show in this paper that in the covariance-stationary panel data AR(1) model the expected values of the concentration parameters in the differenced and levels equations for the cross section at time t are the same when the variances of the individual heterogeneity and idiosyncratic errors are the same. This indicates a weak instrument problem also for the equation in levels. We show that the biases of 2SLS relative to those of OLS are then similar for the equations in differences and levels, as are the size distortions of the Wald tests. These results are shown to extend to the panel data GMM estimators.
    Keywords: Dynamic Panel Data; System GMM; Weak Instruments
    JEL: C12 C13 C23
    Date: 2009–10–09
    URL: http://d.repec.org/n?u=RePEc:dgr:uvatin:20090086&r=ets
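The weak-instrument problem in the differenced equation can be seen in a small simulation. The sketch below is illustrative, not the paper's analysis: it uses a single Anderson-Hsiao-style first stage rather than full system GMM, and all parameter values are assumptions. It shows how the correlation between the lagged difference and its level instrument collapses as the autoregressive parameter approaches one:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_panel(rho, N=2000, T=6, s_eta=1.0, s_eps=1.0):
    """Covariance-stationary panel AR(1): y_it = rho * y_{i,t-1} + eta_i + eps_it."""
    eta = s_eta * rng.standard_normal(N)
    y = np.empty((N, T))
    # draw y_{i0} from the stationary distribution given eta_i
    y[:, 0] = eta / (1 - rho) + s_eps / np.sqrt(1 - rho**2) * rng.standard_normal(N)
    for t in range(1, T):
        y[:, t] = rho * y[:, t - 1] + eta + s_eps * rng.standard_normal(N)
    return y

def first_stage_corr(rho):
    """First-stage correlation between Delta y_{i,t-1} and its level
    instrument y_{i,t-2} in the equation in first differences."""
    y = simulate_panel(rho)
    d_lag = y[:, 4] - y[:, 3]          # Delta y_{i,t-1}
    z = y[:, 3]                        # instrument y_{i,t-2}
    return abs(np.corrcoef(z, d_lag)[0, 1])

strong = first_stage_corr(0.2)
weak = first_stage_corr(0.9)
print(f"|first-stage corr| at rho=0.2: {strong:.3f}, at rho=0.9: {weak:.3f}")
```

The near-zero correlation at high persistence is the classic weak-instrument symptom for the differenced equation; the paper's point is that, under the stated variance conditions, the levels equation fares no better.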
  3. By: Torben G. Andersen; Luca Benzoni
    Abstract: Given the importance of return volatility on a number of practical financial management decisions, the efforts to provide good real-time estimates and forecasts of current and future volatility have been extensive. The main framework used in this context involves stochastic volatility models. In a broad sense, this model class includes GARCH, but we focus on a narrower set of specifications in which volatility follows its own random process, as is common in models originating within financial economics. The distinguishing feature of these specifications is that volatility, being inherently unobservable and subject to independent random shocks, is not measurable with respect to observable information. In what follows, we refer to these models as genuine stochastic volatility models. Much modern asset pricing theory is built on continuous-time models. The natural concept of volatility within this setting is that of genuine stochastic volatility. For example, stochastic-volatility (jump-) diffusions have provided a useful tool for a wide range of applications, including the pricing of options and other derivatives, the modeling of the term structure of risk-free interest rates, and the pricing of foreign currencies and defaultable bonds. The increased use of intraday transaction data for construction of so-called realized volatility measures provides additional impetus for considering genuine stochastic volatility models. As we demonstrate below, the realized volatility approach is closely associated with the continuous-time stochastic volatility framework of financial economics. There are some unique challenges in dealing with genuine stochastic volatility models. For example, volatility is truly latent and this feature complicates estimation and inference. Further, the presence of an additional state variable - volatility - renders the model less tractable from an analytic perspective. We examine how such challenges have been addressed through development of new estimation methods and imposition of model restrictions allowing for closed-form solutions while remaining consistent with the dominant empirical features of the data.
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedhwp:wp-09-04&r=ets
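A minimal discrete-time example of a "genuine" stochastic volatility model, in which log-variance follows its own AR(1) process driven by shocks independent of the return innovations, can be sketched as follows (all parameter values are illustrative, not from the survey):

```python
import numpy as np

rng = np.random.default_rng(2)

# log-variance h_t follows its own AR(1), independent of the return shocks,
# so volatility is latent: not measurable from observed returns alone
T, mu, phi, sig_v = 100_000, -1.0, 0.95, 0.25
h = np.empty(T)
h[0] = mu
for t in range(1, T):
    h[t] = mu + phi * (h[t - 1] - mu) + sig_v * rng.standard_normal()
r = np.exp(h / 2) * rng.standard_normal(T)     # returns: r_t = sigma_t * eps_t

# two hallmarks of SV data: heavy tails and persistent squared returns
kurt = np.mean(r**4) / np.mean(r**2) ** 2
acf1 = np.corrcoef(r[:-1] ** 2, r[1:] ** 2)[0, 1]
print(f"kurtosis {kurt:.2f} (Gaussian: 3), lag-1 autocorr of r^2: {acf1:.3f}")
```

Even this toy specification reproduces excess kurtosis and volatility clustering, the empirical features the survey's model restrictions are meant to remain consistent with.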
  4. By: Todd E. Clark; Michael W. McCracken
    Abstract: This paper develops bootstrap methods for testing whether, in a finite sample, competing out-of-sample forecasts from nested models are equally accurate. Most prior work on forecast tests for nested models has focused on a null hypothesis of equal accuracy in population - basically, whether coefficients on the extra variables in the larger, nesting model are zero. We instead use an asymptotic approximation that treats the coefficients as non-zero but small, such that, in a finite sample, forecasts from the small model are expected to be as accurate as forecasts from the large model. Under that approximation, we derive the limiting distributions of pairwise tests of equal mean square error, and develop bootstrap methods for estimating critical values. Monte Carlo experiments show that our proposed procedures have good size and power properties for the null of equal finite-sample forecast accuracy. We illustrate the use of the procedures with applications to forecasting stock returns and inflation.
    Keywords: Economic forecasting
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedlwp:2009-050&r=ets
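The flavor of a null-imposed bootstrap comparison of nested forecasts can be sketched as follows. This is a stripped-down illustration, not the authors' procedure: it generates iid data under the null, computes recursive out-of-sample errors for an intercept-only model and a nesting model that adds one (irrelevant) predictor, and bootstraps the MSE difference with the null imposed. All parameter choices are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

T, R = 200, 100                                   # sample size, initial window
x = rng.standard_normal(T)
y = 0.5 + rng.standard_normal(T)                  # null holds: x is irrelevant

def recursive_errors(y, x, R):
    """Recursive out-of-sample forecast errors of two nested models:
    small = intercept only, large = intercept plus x_t as predictor of y_{t+1}."""
    e0, e1 = [], []
    for t in range(R, len(y) - 1):
        e0.append(y[t + 1] - y[1 : t + 1].mean())
        b1, b0 = np.polyfit(x[:t], y[1 : t + 1], 1)   # fit y_{s+1} = b0 + b1 * x_s
        e1.append(y[t + 1] - (b0 + b1 * x[t]))
    return np.asarray(e0), np.asarray(e1)

e0, e1 = recursive_errors(y, x, R)
d_hat = np.mean(e0**2) - np.mean(e1**2)

# bootstrap with the null imposed: rebuild y from the small model, keep x fixed
resid = y - y.mean()
d_boot = []
for _ in range(199):
    yb = y.mean() + rng.choice(resid, size=T, replace=True)
    eb0, eb1 = recursive_errors(yb, x, R)
    d_boot.append(np.mean(eb0**2) - np.mean(eb1**2))
p_value = np.mean(np.asarray(d_boot) >= d_hat)
print(f"MSE(small) - MSE(large) = {d_hat:+.4f}, bootstrap p-value = {p_value:.2f}")
```

Note that under nesting the larger model's extra estimated coefficient adds forecast noise, so the loss differential has a nonstandard distribution; that is why critical values come from a bootstrap rather than a standard normal table.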
  5. By: Nikolaus Hautsch; Lada M. Kyj; Roel C.A. Oomen
    Abstract: We introduce a regularization and blocking estimator for well-conditioned high-dimensional daily covariances using high-frequency data. Using the Barndorff-Nielsen, Hansen, Lunde, and Shephard (2008a) kernel estimator, we estimate the covariance matrix block-wise and regularize it. A data-driven grouping of assets of similar trading frequency ensures the reduction of data loss due to refresh time sampling. In an extensive simulation study mimicking the empirical features of the S&P 1500 universe we show that the 'RnB' estimator yields efficiency gains and outperforms competing kernel estimators for varying liquidity settings, noise-to-signal ratios, and dimensions. An empirical application of forecasting daily covariances of the S&P 500 index confirms the simulation results.
    Keywords: covariance estimation, blocking, realized kernel, regularization, microstructure, asynchronous trading
    JEL: C14 C22
    Date: 2009–10
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2009-049&r=ets
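The basic ingredients, realized covariance built from intraday returns plus a regularization step to keep the matrix well-conditioned, can be sketched as follows. This uses plain realized covariance and a simple shrinkage toward the diagonal, not the paper's blocked realized-kernel 'RnB' estimator; dimensions and the shrinkage weight are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# simulate one day of synchronous intraday returns for d assets, n observations
d, n = 50, 78                                  # e.g. 5-minute returns over 6.5h
A = rng.standard_normal((d, d)) / np.sqrt(d)
true_cov = A @ A.T + 0.1 * np.eye(d)           # assumed daily integrated covariance
L = np.linalg.cholesky(true_cov / n)
returns = rng.standard_normal((n, d)) @ L.T

# realized covariance: sum of outer products of intraday return vectors
rcov = returns.T @ returns

# with d large relative to n, rcov is ill-conditioned; shrink toward its diagonal
def shrink_to_diag(S, w=0.5):
    return (1 - w) * S + w * np.diag(np.diag(S))

rcov_reg = shrink_to_diag(rcov, w=0.5)
cond_raw = np.linalg.cond(rcov)
cond_reg = np.linalg.cond(rcov_reg)
print(f"condition number: raw {cond_raw:.1e} -> regularized {cond_reg:.1e}")
```

A well-conditioned (invertible with a moderate condition number) estimate is what downstream uses such as portfolio optimization require, which motivates regularization in high dimensions.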
  6. By: Bert van Es; Peter Spreij; Harry van Zanten
    Abstract: Stochastic volatility modelling of financial processes has become increasingly popular. The proposed models usually contain a stationary volatility process. We will motivate and review several nonparametric methods for estimation of the density of the volatility process. Both models based on discretely sampled continuous-time processes and discrete-time models will be discussed. The key insight for the analysis is a transformation of the volatility density estimation problem to a deconvolution model for which standard methods exist. Three types of nonparametric density estimators are reviewed: the Fourier-type deconvolution kernel density estimator, a wavelet deconvolution density estimator, and a penalized projection estimator. The performance of these estimators will be compared.
    Keywords: stochastic volatility models, deconvolution, density estimation, kernel estimator, wavelets, minimum contrast estimation, mixing
    Date: 2009–10
    URL: http://d.repec.org/n?u=RePEc:arx:papers:0910.5185&r=ets
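The key transformation mentioned in the abstract can be sketched directly: squaring and taking logs turns the multiplicative volatility model into an additive signal-plus-noise problem, i.e. a deconvolution model with a known noise law. The sketch below (illustrative parameters; it shows only the transformation, not the three estimators the paper reviews) verifies this decomposition numerically:

```python
import numpy as np

rng = np.random.default_rng(5)

# SV model: Y_t = sigma_t * eps_t  =>  log Y_t^2 = log sigma_t^2 + log eps_t^2,
# a deconvolution problem: latent signal plus noise with a known distribution
T = 100_000
h = np.empty(T)                        # h_t = log sigma_t^2, stationary AR(1)
h[0], phi, sig = 0.0, 0.9, 0.3
for t in range(1, T):
    h[t] = phi * h[t - 1] + sig * rng.standard_normal()
eps = rng.standard_normal(T)
logy2 = h + np.log(eps**2)             # observable transform of the data

# the "noise" log eps_t^2 is log chi-squared(1), with mean -(gamma + log 2)
noise_mean = -(np.euler_gamma + np.log(2.0))   # about -1.2704
print(f"mean(log Y^2) = {logy2.mean():.3f}, "
      f"mean(h) + noise mean = {h.mean() + noise_mean:.3f}")
```

Because the characteristic function of the log chi-squared(1) noise is known in closed form, standard deconvolution machinery (kernel, wavelet, or projection based) can then recover the density of the latent volatility, which is the subject of the paper.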

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.