nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒04‒23
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. Estimation of long memory in integrated variance By Eduardo Rossi; Paolo Santucci de Magistris
  2. Modelling and Forecasting Noisy Realized Volatility By Manabu Asai; Michael McAleer; Marcelo C. Medeiros
  3. A statistical test for forecast evaluation under a discrete loss function By Francisco J. Eransus; Alfonso Novales Cinca
  4. Modelling Regime Switching and Structural Breaks with an Infinite Dimension Markov Switching Model By Yong Song
  5. Generalized Cointegration: A New Concept with an Application to Health Expenditure and Health Outcomes By Stephen Hall; P. A. V. B. Swamy; George S. Tavlas
  6. Classical time-varying FAVAR models - estimation, forecasting and structural analysis By Eickmeier, Sandra; Lemke, Wolfgang; Marcellino, Massimiliano
  7. On some problems in discrete wavelet analysis of bivariate spectra with an application to business cycle synchronization in the euro zone By Bruzda, Joanna
  8. A Monte Carlo Analysis of the VAR-Based Indirect Inference Estimation of DSGE Models By David Dubois
  9. Multiplicative Error Models By Christian T. Brownlees; Fabrizio Cipollini; Giampiero M. Gallo

  1. By: Eduardo Rossi (University of Pavia); Paolo Santucci de Magistris (University of Padova and CREATES)
    Abstract: A stylized fact is that realized variance has long memory. We show that, when the instantaneous volatility is driven by a fractional Brownian motion, the integrated variance is characterized by long-range dependence. As a consequence, the realized variance inherits this property when prices are observed continuously and without microstructure noise, and the spectral densities of integrated and realized variance coincide. However, prices are not observed continuously, so that the realized variance is affected by a measurement error. Discrete sampling and market microstructure noise induce a finite-sample bias in semiparametric estimates of the fractional integration parameter. A Monte Carlo simulation analysis provides evidence of such a bias for common sampling frequencies.
    Keywords: Realized variance, Long memory, fractional Brownian Motion, Measurement error, Whittle estimator.
    JEL: C10 C22 C80
    Date: 2011–04–12
  2. By: Manabu Asai (Faculty of Economics Soka University, Japan); Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics) Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute).); Marcelo C. Medeiros (Department of Economics Pontifical Catholic University of Rio de Janeiro(PUC-Rio))
    Abstract: Several methods have recently been proposed in the ultra high frequency financial literature to remove the effects of microstructure noise and to obtain consistent estimates of the integrated volatility (IV) as a measure of ex-post daily volatility. Even bias-corrected and consistent realized volatility (RV) estimates of IV can contain residual microstructure noise and other measurement errors. Such noise is called “realized volatility error”. Because such errors are typically ignored, we need to take account of them in estimating and forecasting IV. This paper investigates through Monte Carlo simulations the effects of RV errors on estimating and forecasting IV with RV data. It is found that: (i) neglecting RV errors can lead to serious bias in estimators; (ii) the effects of RV errors on one-step-ahead forecasts are minor when consistent estimators are used and when the number of intraday observations is large; and (iii) even the partially corrected R² recently proposed in the literature should be fully corrected for evaluating forecasts. This paper proposes a full correction of R². An empirical example for S&P 500 data is used to demonstrate the techniques developed in the paper.
    Keywords: realized volatility; diffusion; financial econometrics; measurement errors; forecasting; model evaluation; goodness-of-fit.
    JEL: G32 G11 C53 C22
    Date: 2011
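The noise bias the abstract describes is easy to reproduce. The sketch below is a deliberately simplified illustration, not the paper's design: volatility is held constant and the noise level is an arbitrary assumption, but the upward bias of RV at high sampling frequencies emerges clearly.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.01                  # constant spot volatility (a simplifying assumption)
n_tick = 23400                # one observation per second over a 6.5-hour day
# efficient log-price: Brownian motion with integrated variance sigma**2
p = np.cumsum(sigma / np.sqrt(n_tick) * rng.standard_normal(n_tick))
p_obs = p + 0.0005 * rng.standard_normal(n_tick)   # add microstructure noise

def rv(prices, step):
    """Realized variance from prices sampled every `step` ticks."""
    r = np.diff(prices[::step])
    return float(r @ r)

iv = sigma ** 2               # integrated variance of the efficient price
for step in (1, 60, 300):     # tick-by-tick, 1-minute, 5-minute sampling
    print(step, rv(p_obs, step) / iv)   # a ratio of 1 would be unbiased
```

Sparser sampling reduces the noise-induced bias but raises the discretization error, which is exactly the "realized volatility error" trade-off the paper models.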
  3. By: Francisco J. Eransus (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid); Alfonso Novales Cinca (Departamento de Economía Cuantitativa (Department of Quantitative Economics), Facultad de Ciencias Económicas y Empresariales (Faculty of Economics and Business), Universidad Complutense de Madrid)
    Abstract: We propose a new approach to evaluating the usefulness of a set of forecasts, based on the use of a discrete loss function defined on the space of data and forecasts. Existing procedures for such an evaluation either do not allow for formal testing, or use test statistics based just on the frequency distribution of (data, forecast) pairs. They can easily lead to misleading conclusions in some reasonable situations, because of the way they formalize the underlying null hypothesis that the set of forecasts is not useful. Even though the ambiguity of the underlying null hypothesis precludes us from performing a standard analysis of the size and power of the tests, we obtain results suggesting that the proposed DISC test performs better than its competitors.
    Keywords: Forecasting Evaluation, Loss Function.
    Date: 2011
  4. By: Yong Song
    Abstract: This paper proposes an infinite dimension Markov switching model to accommodate regime switching and structural break dynamics or a combination of both in a Bayesian framework. Two parallel hierarchical structures, one governing the transition probabilities and another governing the parameters of the conditional data density, keep the model parsimonious and improve forecasts. This nonparametric approach allows for regime persistence and estimates the number of states automatically. A global identification algorithm for structural changes versus regime switching is presented. Applications to U.S. real interest rates and inflation compare the new model to existing parametric alternatives. Besides identifying episodes of regime switching and structural breaks, the hierarchical distribution governing the parameters of the conditional data density provides significant gains to forecasting precision.
    Keywords: hidden Markov model; Bayesian nonparametrics; Dirichlet process
    JEL: C51 C53 C22 C11
    Date: 2011–04–15
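The infinite-dimension model generalizes finite-state hidden Markov models. As background, a minimal Hamilton-type forward filter for a two-state switching-mean model is sketched below; it assumes known parameters, which the paper's Bayesian sampler does not.

```python
import numpy as np

def hamilton_filter(y, mu, sigma, P):
    """Forward filter for a K-state Gaussian switching-mean model.

    mu, sigma: per-state mean and sd; P: K x K transition matrix.
    Returns the argmax filtered state path and the log-likelihood.
    """
    K = len(mu)
    xi = np.full(K, 1.0 / K)          # filtered state probabilities
    loglik, path = 0.0, []
    for yt in y:
        pred = P.T @ xi               # one-step-ahead state probabilities
        dens = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        joint = pred * dens
        loglik += np.log(joint.sum())
        xi = joint / joint.sum()      # Bayes update of the state probabilities
        path.append(int(np.argmax(xi)))
    return np.array(path), loglik

# two persistent regimes, as in a regime-switching mean model
rng = np.random.default_rng(2)
P = np.array([[0.98, 0.02], [0.02, 0.98]])
mu, sigma = np.array([0.0, 3.0]), np.array([1.0, 1.0])
states = np.zeros(300, dtype=int)
for t in range(1, 300):
    states[t] = rng.choice(2, p=P[states[t - 1]])
y = mu[states] + sigma[states] * rng.standard_normal(300)
path, ll = hamilton_filter(y, mu, sigma, P)
print((path == states).mean())        # fraction of correctly filtered states
```

The paper's contribution is to let K be unbounded via a Dirichlet-process prior, so the number of states is estimated rather than fixed in advance as here.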
  5. By: Stephen Hall; P. A. V. B. Swamy; George S. Tavlas
    Abstract: We propose a new generalization of the concept of cointegration that allows for the possibility that a set of variables are involved in an unknown nonlinear relationship. Although these variables may be unit-root non-stationary, there exists a nonlinear combination of them that takes account of such non-stationarity. We then introduce an estimation technique that allows us to test for the presence of this generalized cointegration in the absence of knowledge as to the true nonlinear functional form and the full set of regressors. We outline the basic stages of the technique and discuss how the issue of unit-root non-stationarity and cointegration affects each stage of the estimation procedure. We then apply this technique to the relationship between health expenditure and health outcomes, which is an important but controversial issue. A number of studies have found very little or no relationship between the level of health expenditure and outcomes. In econometric terms, if there is such a relationship then there should exist a cointegrating relationship between these two variables and possibly many others. The problem that arises is that we may be either unable to measure these other variables or that we do not know about them, in which case we may incorrectly find no relationship between health expenditures and outcomes. We then apply the concept of generalized cointegration; we obtain a highly significant relationship between health expenditure and health outcomes.
    Keywords: Generalized cointegration; non-stationarity; time-varying coefficient model; coefficient driver
    JEL: C13 C19 C22
    Date: 2011–03
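For contrast with the generalized concept, standard linear cointegration can be checked with the residual-based Engle-Granger procedure. A minimal sketch follows; the statistic must be compared with Engle-Granger critical values, which are omitted here, and the data-generating process is an assumption for illustration.

```python
import numpy as np

def engle_granger_stat(y, x):
    """Residual-based cointegration statistic: OLS of y on x, then a
    Dickey-Fuller regression on the residuals. This is the linear
    special case that generalized cointegration extends."""
    X = np.column_stack([np.ones_like(x), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ b                      # cointegrating residuals
    du, ul = np.diff(u), u[:-1]
    rho = (ul @ du) / (ul @ ul)        # DF coefficient, no constant
    resid = du - rho * ul
    se = np.sqrt(resid @ resid / (len(du) - 1) / (ul @ ul))
    return rho / se                    # compare with Engle-Granger critical values

rng = np.random.default_rng(6)
x = np.cumsum(rng.standard_normal(500))          # I(1) regressor
y = 2.0 + 0.5 * x + rng.standard_normal(500)     # linearly cointegrated with x
print(engle_granger_stat(y, x))                  # large negative => cointegration
```

The paper's point is that when the true relationship is nonlinear or omits unobservable regressors, this linear test can miss a relationship that generalized cointegration detects.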
  6. By: Eickmeier, Sandra; Lemke, Wolfgang; Marcellino, Massimiliano
    Abstract: We propose a classical approach to estimate factor-augmented vector autoregressive (FAVAR) models with time variation in the factor loadings, in the factor dynamics, and in the variance-covariance matrix of innovations. When the time-varying FAVAR is estimated using a large quarterly dataset of US variables from 1972 to 2007, the results indicate some changes in the factor dynamics, and more marked variation in the factors' shock volatility and their loading parameters. Forecasts from the time-varying FAVAR are more accurate than those from a constant-parameter FAVAR for most variables and horizons when computed in sample, and for some variables, mostly financial indicators, in pseudo real time. Finally, we use the time-varying FAVAR to assess how monetary transmission to the economy has changed. We find substantial time variation in the volatility of monetary policy shocks, and we observe that the reaction of GDP, the GDP deflator, inflation expectations and long-term interest rates to an equally-sized monetary policy shock has decreased since the early 1980s.
    Keywords: FAVAR,time-varying parameters,monetary transmission,forecasting
    JEL: C3 C53 E52
    Date: 2011
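The constant-parameter skeleton that the paper extends with time variation is principal-components factor extraction followed by a VAR on the factors. A minimal sketch on simulated data; the one-factor design and sample sizes are assumptions for illustration.

```python
import numpy as np

def extract_factors(X, k):
    """Principal-components factors from a T x N panel: standardize,
    take the k leading eigenvectors of the sample covariance."""
    Z = (X - X.mean(0)) / X.std(0)
    vals, vecs = np.linalg.eigh(Z.T @ Z / len(Z))
    load = vecs[:, ::-1][:, :k]        # leading eigenvectors as loadings
    return Z @ load, load

def var1_ols(F):
    """OLS estimate of a VAR(1), F_t = A F_{t-1} + u_t (factors are demeaned)."""
    Y, X = F[1:], F[:-1]
    return np.linalg.lstsq(X, Y, rcond=None)[0].T

# panel driven by one persistent latent factor plus idiosyncratic noise
rng = np.random.default_rng(4)
T, N = 400, 50
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal()
X = np.outer(f, rng.uniform(0.5, 1.5, N)) + rng.standard_normal((T, N))
F, _ = extract_factors(X, 1)
A = var1_ols(F)
print(abs(A[0, 0]))                    # AR(1) coefficient of the estimated factor
```

The time-varying FAVAR replaces the constant loadings and the constant VAR matrix above with smoothly evolving parameters and a time-varying innovation covariance.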
  7. By: Bruzda, Joanna
    Abstract: The paper considers some of the problems emerging from discrete wavelet analysis of popular bivariate spectral quantities like the coherence and phase spectra and the frequency-dependent time delay. The approach taken here, introduced by Whitcher and Craigmile (2004), is based on the maximal overlap discrete Hilbert wavelet transform (MODHWT). Firstly, we point out a deficiency in the implementation of the MODHWT and suggest using a modified implementation scheme resembling the one applied in the context of the dual-tree complex wavelet transform of Kingsbury (see Selesnick et al., 2005). Secondly, via a broad set of simulation experiments we examine small and large sample properties of two wavelet estimators of the scale-dependent time delay: the wavelet cross-correlator and the wavelet phase angle-based estimator. Our results provide some practical guidelines for empirical examination of short- and medium-term lead-lag relations for octave frequency bands. In addition, we show how the MODHWT-based wavelet quantities can serve to approximate the Fourier bivariate spectra and discuss certain issues connected with building confidence intervals for them. The discrete wavelet analysis of coherence and phase angle is illustrated with a scale-dependent examination of business cycle synchronization between 11 euro zone member countries. The study is supplemented with wavelet analysis of variance and covariance of the euro zone business cycles. The empirical examination underlines good localization properties and high computational efficiency of the wavelet transformations applied, and provides new arguments in favour of the endogeneity hypothesis of the optimum currency area criteria as well as wavelet evidence on dating the Great Moderation in the euro zone.
    Keywords: Hilbert wavelet pair,MODHWT,wavelet coherence,wavelet phase angle,business cycle synchronization
    JEL: C19 E32 E58 O52
    Date: 2011
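One of the two delay estimators compared, the cross-correlator, has a simple time-domain analogue: pick the lag that maximizes the cross-correlation between the two series. The sketch below uses that analogue on simulated data, since the MODHWT itself is not available in standard libraries; the smoothing window and noise level are assumptions.

```python
import numpy as np

def cross_corr_delay(x, y, max_lag):
    """Delay estimate: the lag l maximizing corr(x[t - l], y[t]), a
    time-domain analogue of the wavelet cross-correlator."""
    n = len(x)
    y0 = y[max_lag: n - max_lag]
    best, best_lag = -2.0, 0
    for l in range(-max_lag, max_lag + 1):
        c = np.corrcoef(x[max_lag - l: n - max_lag - l], y0)[0, 1]
        if c > best:
            best, best_lag = c, l
    return best_lag

rng = np.random.default_rng(5)
# a smooth "cycle-like" series and a lagged, noisy copy of it
x = np.convolve(rng.standard_normal(1200), np.ones(8) / 8, mode="same")
y = np.roll(x, 3) + 0.1 * rng.standard_normal(1200)   # y lags x by 3 periods
print(cross_corr_delay(x, y, 10))
```

The wavelet version applies the same idea scale by scale, so the lead-lag relation can differ across octave frequency bands, which is the paper's object of interest.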
  8. By: David Dubois
    Abstract: In this paper we study the estimation of DSGE models. More specifically, within the indirect inference framework, we analyze how critical the choice of the reduced-form model is for estimation purposes. As it turns out, matching simple VAR parameters performs better than matching the commonly used impulse response functions. This can be attributed to the fact that IRFs worsen identification issues for models that are already plagued by that phenomenon.
    Date: 2011
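The indirect inference logic under study can be sketched in a toy setting: a plain AR(1) stands in for the DSGE, the auxiliary (reduced-form) statistic is an OLS AR(1) coefficient, and common random numbers are used across the parameter grid. The grid search and sample sizes are assumptions for illustration.

```python
import numpy as np

def ar1_ols(y):
    """Auxiliary (reduced-form) statistic: OLS AR(1) coefficient."""
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

def simulate(theta, n, rng):
    """'Structural' model: an AR(1) standing in for a DSGE."""
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = theta * y[t - 1] + rng.standard_normal()
    return y

# indirect inference: choose theta whose simulated auxiliary statistic
# matches the statistic computed on the observed data
rng = np.random.default_rng(7)
data = simulate(0.6, 2000, rng)
target = ar1_ols(data)
grid = np.linspace(0.0, 0.95, 96)
# common random numbers (fixed seed) keep the binding function smooth
sims = [ar1_ols(simulate(th, 2000, np.random.default_rng(42))) for th in grid]
theta_hat = grid[np.argmin([(s - target) ** 2 for s in sims])]
print(round(theta_hat, 2))
```

The paper's Monte Carlo question is which auxiliary statistic to match; in this framework, swapping `ar1_ols` for an IRF-based statistic changes the identification properties of the estimator.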
  9. By: Christian T. Brownlees (Stern School of Business, New York University); Fabrizio Cipollini (Dipartimento di Statistica, Università di Firenze); Giampiero M. Gallo (Dipartimento di Statistica, Università di Firenze)
    Abstract: Financial time series analysis has focused on data related to market trading activity. Next to the modeling of the conditional variance of returns within the GARCH family of models, recent attention has been devoted to other variables: first and foremost, volatility measured on the basis of ultra-high frequency data, but also volumes, number of trades, and durations. In this paper, we examine a class of models, named Multiplicative Error Models, which are particularly suited to model such non-negative time series. We discuss the univariate specification by considering the base choices for the conditional expectation and the error term. We also provide a general framework allowing for richer specifications of the conditional mean. The outcome is a novel MEM (called Composite MEM) which is reminiscent of the short- and long-run component GARCH model by Engle and Lee (1999). Inference issues are discussed relative to Maximum Likelihood and Generalized Method of Moments estimation. In the application, we show the regularity in parameter estimates and forecasting performance obtainable by applying the MEM to the realized kernel volatility of components of the S&P100 index. We suggest extensions of the base model by enlarging the information set and adopting a multivariate specification.
    Keywords: Multiplicative Error Models, Realized Volatility, Financial Time Series, Composite MEM
    JEL: C22 C51 C52
    Date: 2011–02
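The base MEM described, a non-negative series written as x_t = mu_t * eps_t with a GARCH-type recursion for mu_t and unit-mean positive errors, can be simulated in a few lines. A minimal sketch; the Gamma shape parameter and the MEM(1,1) coefficients are arbitrary assumptions.

```python
import numpy as np

def simulate_mem(omega, alpha, beta, n, rng):
    """Simulate a base MEM(1,1): x_t = mu_t * eps_t with unit-mean Gamma
    errors and mu_t = omega + alpha * x_{t-1} + beta * mu_{t-1}."""
    x, mu = np.empty(n), np.empty(n)
    mu[0] = omega / (1 - alpha - beta)       # unconditional mean of x
    x[0] = mu[0] * rng.gamma(shape=4.0, scale=0.25)   # E[eps] = 1
    for t in range(1, n):
        mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
        x[t] = mu[t] * rng.gamma(shape=4.0, scale=0.25)
    return x, mu

rng = np.random.default_rng(3)
x, mu = simulate_mem(0.05, 0.15, 0.80, 5000, rng)
print(x.mean(), 0.05 / (1 - 0.15 - 0.80))    # sample vs unconditional mean
```

Because the error is multiplicative and positive, the simulated series stays non-negative by construction, which is what makes the class suitable for volumes, durations, and realized volatility.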

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on its homepage. For comments, please write to the director of NEP, Marco Novarese. Put “NEP” in the subject line, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.