nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒12‒19
eleven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Combining VAR and DSGE forecast densities By Ida Wolden Bache; Anne Sofie Jore; James Mitchell; Shaun P. Vahey
  2. Monitoring Structural Changes in Regression with Long Memory Processes By Wen-Jen Tsay
  3. A mixed splicing procedure for economic time series By Ángel de la Fuente
  4. Bayesian estimation of an extended local scale stochastic volatility model By Philippe J. Deschamps
  5. Detrending Bootstrap Unit Root Tests By Smeekes Stephan
  6. Characteristic function estimation of Ornstein-Uhlenbeck-based stochastic volatility models. By Emanuele Taufer; Nikolai Leonenko; Marco Bee
  7. A solution to the problem of too many instruments in dynamic panel data GMM By Mehrhoff, Jens
  8. High and Low Frequency Correlations in Global Equity Markets By Robert F. Engle; José Gonzalo Rangel
  9. High-Frequency and Model-Free Volatility Estimators By Robert Ślepaczuk; Grzegorz Zakrzewski
  10. Testing for Common Autocorrelation in Data Rich Environments By Gianluca Cubadda; Alain Hecq
  11. Detecting Common Dynamics in Transitory Components By Tim M Christensen; Stan Hurn; Adrian Pagan

  1. By: Ida Wolden Bache (Norges Bank); Anne Sofie Jore (Norges Bank); James Mitchell (NIESR); Shaun P. Vahey (Melbourne Business School)
    Abstract: A popular macroeconomic forecasting strategy takes combinations across many models to hedge against instabilities of unknown timing; see (among others) Stock and Watson (2004), Clark and McCracken (2010), and Jore et al. (2010). Existing studies of this forecasting strategy exclude Dynamic Stochastic General Equilibrium (DSGE) models, despite the widespread use of these models by monetary policymakers. In this paper, we combine inflation forecast densities utilizing an ensemble system comprising many Vector Autoregressions (VARs), and a policymaking DSGE model. The DSGE receives substantial weight (for short horizons) provided the VAR components exclude structural breaks. In this case, the inflation forecast densities exhibit calibration failure. Allowing for structural breaks in the VARs reduces the weight on the DSGE considerably, and produces well-calibrated forecast densities for inflation.
    Keywords: Ensemble modeling, Forecast densities, Forecast evaluation, VAR models, DSGE models
    JEL: C32 C53 E37
    Date: 2009–11–05
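The density-combination strategy this abstract describes can be illustrated with a linear opinion pool whose weights come from (hypothetical) log predictive scores; this is a minimal sketch of the general idea, not the authors' ensemble system:

```python
import numpy as np

def log_score_weights(log_scores):
    """Turn cumulative log predictive scores into combination weights:
    exponentiate (shifting by the max for numerical stability) and normalise."""
    s = np.asarray(log_scores, dtype=float)
    w = np.exp(s - s.max())
    return w / w.sum()

def linear_opinion_pool(densities, weights):
    """Weighted mixture of component forecast densities on a common grid."""
    return np.asarray(weights, dtype=float) @ np.asarray(densities, dtype=float)

# toy example: pool two Gaussian predictive densities on a grid
grid = np.linspace(-5.0, 5.0, 401)
f1 = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)            # N(0, 1)
f2 = np.exp(-0.5 * (grid - 1.0)**2) / np.sqrt(2 * np.pi)    # N(1, 1)
w = log_score_weights([-310.2, -312.7])   # hypothetical cumulative log scores
pooled = linear_opinion_pool([f1, f2], w)
```

The pooled density is again a proper density (it integrates to one), which is what makes calibration evaluation of the combined forecast meaningful.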
  2. By: Wen-Jen Tsay (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: This paper extends the fluctuation monitoring test of Chu et al. (1996) to the regression model involving stationary or nonstationary long memory regressors and errors by proposing two innovative on-line detectors. Despite the general framework covered by these detectors, their computational cost is extremely mild: they do not depend on a bootstrap procedure and do not involve the difficult issues of choosing a kernel function, a bandwidth parameter, or an autoregressive lag length for long-run variance estimation. Moreover, under suitable regularity conditions and the null hypothesis of no structural change, the asymptotic distributions of these two detectors are identical to that of the corresponding counterpart in Chu et al. (1996), which considers short memory processes.
    Keywords: Structural stability, Long memory process, Fluctuation monitoring
    Date: 2009–08
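The on-line monitoring idea can be sketched with a toy CUSUM detector: after a training sample, track a scaled cumulative sum of incoming residuals against a growing boundary. The boundary shape and critical value below are illustrative placeholders; the paper's detectors are the ones valid under long memory:

```python
import numpy as np

def cusum_monitor(residuals, n_train, crit=0.948):
    """Toy on-line fluctuation monitor in the spirit of Chu et al. (1996):
    after a training sample of size n_train, track the scaled cumulative sum
    of incoming residuals and signal a break at the first boundary crossing.
    The boundary formula here is an illustrative placeholder, not the
    paper's long-memory detector."""
    e = np.asarray(residuals, dtype=float)
    sigma = e[:n_train].std(ddof=1)      # scale estimated on the training sample
    for t in range(n_train, len(e)):
        s = e[n_train:t + 1].sum() / (sigma * np.sqrt(n_train))
        k = (t - n_train + 1) / n_train  # fraction of the monitoring sample seen
        bound = crit * np.sqrt(k * (1.0 + np.log(1.0 + 1.0 / k)))
        if abs(s) > bound:
            return t                     # index of the first alarm
    return None                          # no structural change detected
```

Because the detector updates with each new observation, no bootstrap, kernel, or bandwidth choice is needed at monitoring time, which is the computational advantage the abstract emphasises.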
  3. By: Ángel de la Fuente
    Abstract: This note develops a flexible methodology for splicing economic time series that avoids the extreme assumptions implicit in the procedures most commonly used in the literature. It allows the user to split the required correction to the older of the two linked series between its levels and growth rates on the basis of what he or she knows or conjectures about the persistence of the factors that account for the discrepancy between the two series at their linking point. The time profile of the correction is derived from the assumption that the error in the older series reflects the inadequate coverage of emerging sectors or activities that grow faster than the aggregate.
    Keywords: linking, splicing, economic series
    JEL: C82 E01
    Date: 2009–12–03
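The splicing trade-off described here can be sketched numerically: a single parameter interpolates between a pure level shift (retropolation, which keeps the old growth rates) and a gradual phase-in of the correction (which keeps the earliest level). The parameterisation below is illustrative, not the paper's exact formula:

```python
import numpy as np

def mixed_splice(old, new_level_at_link, theta=0.5):
    """Mixed splicing sketch: distribute the discrepancy factor observed at
    the linking point between the level and the growth rates of the older
    series. theta = 1 reproduces a uniform level shift (old growth rates
    preserved); theta = 0 phases the correction in geometrically over the
    old sample (earliest level preserved). Illustrative parameterisation."""
    old = np.asarray(old, dtype=float)
    T = len(old) - 1                    # index of the linking observation
    R = new_level_at_link / old[-1]     # discrepancy factor at the link
    t = np.arange(len(old))
    return old * R ** (theta + (1.0 - theta) * t / T)

# older series ending at the linking point, where the new series reads 110
spliced = mixed_splice([90.0, 95.0, 100.0], new_level_at_link=110.0, theta=0.0)
```

For any theta, the spliced series matches the new series at the link; intermediate theta values split the correction between levels and growth rates, which is the note's central idea.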
  4. By: Philippe J. Deschamps (Department of Quantitative Economics)
    Abstract: A new version of the local scale model of Shephard (1994) is presented. Its features are identically distributed evolution equation disturbances, the incorporation of in-the-mean effects, and the incorporation of variance regressors. A Bayesian posterior simulator and an exact simulation smoother are presented. The model is applied to simulated data and to publicly available exchange rate and asset return data. Simulation smoothing turns out to be essential for the accurate interval estimation of volatilities. Bayes factors show that the new model is competitive with GARCH and Lognormal stochastic volatility formulations. Its forecasting performance is comparable to GARCH.
    Keywords: State space models; Markov chain Monte Carlo; simulation smoothing; generalized error distribution; generalized t distribution
    JEL: C11 C13 C15 C22
    Date: 2009–08–04
  5. By: Smeekes Stephan (METEOR)
    Abstract: The role of detrending in bootstrap unit root tests is investigated. When bootstrapping, detrending must not only be done for the construction of the test statistic, but also in the first step of the bootstrap algorithm. It is argued that the two points should be treated separately. Asymptotic validity of sieve bootstrap ADF unit root tests is shown for test statistics based on full sample and recursive OLS and GLS detrending. It is also shown that the detrending method in the first step of the bootstrap may differ from the one used in the construction of the test statistic. A simulation study is conducted to analyze the effects of detrending on finite sample performance of the bootstrap test. It is found that full sample detrending should be preferred in the first step of the bootstrap algorithm and that the decision about the detrending method used to obtain the test statistic should be based on the power properties of the corresponding asymptotic tests.
    Keywords: econometrics;
    Date: 2009
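The bootstrap recursion at issue can be sketched in a stripped-down form: resample (recentred) first differences under the unit-root null, rebuild pseudo random walks, and compare the sample Dickey-Fuller statistic with the bootstrap distribution. The paper's sieve bootstrap fits an autoregressive sieve to the differences and studies detrending choices; the i.i.d. version below only illustrates the mechanics:

```python
import numpy as np

def df_stat(y):
    """Bare-bones Dickey-Fuller t-statistic on rho in dy_t = rho*y_{t-1} + e_t
    (no deterministic terms, no lag augmentation)."""
    y = np.asarray(y, dtype=float)
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)
    e = dy - rho * ylag
    s2 = (e @ e) / (len(e) - 1)
    return rho / np.sqrt(s2 / (ylag @ ylag))

def bootstrap_unit_root_test(y, B=199, seed=0):
    """i.i.d. residual bootstrap of the DF statistic under the unit-root null:
    resample recentred first differences, rebuild random walks, and locate the
    sample statistic in the bootstrap distribution (left-tailed test)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, dtype=float)
    stat = df_stat(y)
    dy = np.diff(y)
    dy = dy - dy.mean()                     # impose the driftless null
    boot = np.empty(B)
    for b in range(B):
        e = rng.choice(dy, size=len(dy), replace=True)
        y_star = np.concatenate(([y[0]], y[0] + np.cumsum(e)))
        boot[b] = df_stat(y_star)
    p_value = float(np.mean(boot <= stat))
    return stat, p_value
```

In the paper's setting, detrending would be applied both when computing `df_stat` and when building `y_star`, and those two choices need not coincide — which is precisely the separation the abstract argues for.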
  6. By: Emanuele Taufer (DISA, Faculty of Economics, Trento University); Nikolai Leonenko; Marco Bee
    Abstract: Continuous-time stochastic volatility models are becoming increasingly popular in finance because of their flexibility in accommodating most stylized facts of financial time series. However, their estimation is difficult because the likelihood function does not have a closed-form expression. In this paper we propose a characteristic function-based estimation method for non-Gaussian Ornstein-Uhlenbeck-based stochastic volatility models. After deriving explicit expressions of the characteristic functions for various cases of interest we analyze the asymptotic properties of the estimators and evaluate their performance by means of a simulation experiment. Finally, a real-data application shows that the superposition of two Ornstein-Uhlenbeck processes gives a good approximation to the dependence structure of the process.
    Keywords: Ornstein-Uhlenbeck process; Lévy process; stochastic volatility; characteristic function estimation
    Date: 2009–12
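The general mechanics of characteristic-function estimation can be sketched as follows: match the empirical characteristic function of the data to a parametric model CF by minimising a squared distance over a frequency grid. A Gaussian CF stands in here purely for illustration; the paper derives the model CFs for OU-based stochastic volatility:

```python
import numpy as np

def ecf(x, u):
    """Empirical characteristic function of sample x at frequencies u."""
    x = np.asarray(x, dtype=float)
    u = np.asarray(u, dtype=float)
    return np.exp(1j * np.outer(u, x)).mean(axis=1)

def cf_objective(x, model_cf, u, theta):
    """Average squared modulus of the gap between the empirical CF and a
    parametric model CF; minimising over theta gives a CF-based estimator."""
    return float(np.mean(np.abs(ecf(x, u) - model_cf(u, theta)) ** 2))

# illustration: recover the mean of a N(mu, 1) sample by grid search on mu
gauss_cf = lambda u, mu: np.exp(1j * u * mu - 0.5 * u**2)
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=1.0, size=2000)
u = np.linspace(-2.0, 2.0, 21)
mus = np.linspace(0.0, 2.0, 81)
mu_hat = mus[np.argmin([cf_objective(x, gauss_cf, u, m) for m in mus])]
```

This sidesteps the missing closed-form likelihood: only the model's characteristic function is needed, which is exactly why the approach suits OU-based stochastic volatility models.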
  7. By: Mehrhoff, Jens
    Abstract: The well-known problem of too many instruments in dynamic panel data GMM is dealt with in detail in Roodman (2009, Oxford Bull. Econ. Statist.). The present paper goes one step further by providing a solution to this problem: factorisation of the standard instrument set is shown to be a valid transformation for ensuring consistency of GMM. Monte Carlo simulations show that this new estimation technique outperforms other possible transformations by having a lower bias and RMSE as well as greater robustness of overidentifying restrictions. The researcher's choice of a particular transformation can be replaced by a data-driven statistical decision.
    Keywords: Dynamic panel data, generalised method of moments, instrument proliferation, factor analysis
    JEL: C13 C15 C23 C81
    Date: 2009
  8. By: Robert F. Engle; José Gonzalo Rangel
    Abstract: This study models high and low frequency variation in global equity correlations using a comprehensive sample of 43 countries that includes developed and emerging markets, during the period 1995-2008. These two types of variations are modeled following the semi-parametric Factor-Spline-GARCH approach of Rangel and Engle (2008). This framework is extended and modified to incorporate the effect of multiple factors and to address the issue of non-synchronicity in international markets. Our empirical analysis suggests that the slow-moving dynamics of global correlations can be described by the Factor-Spline-GARCH specifications using either weekly or daily data. The analysis shows that the low frequency component of global correlations increased in the current financial turmoil; however, this increase was not equally distributed across countries. The countries that experienced the largest increase in correlations were mainly emerging markets.
    Keywords: Dynamic conditional correlations, high and low frequency variation, global markets, non-synchronicity.
    JEL: C32 C51 C52 G12 G15
    Date: 2009–12
  9. By: Robert Ślepaczuk (Faculty of Economic Sciences, University of Warsaw); Grzegorz Zakrzewski (Deutsche Bank PBC S.A.)
    Abstract: This paper focuses on the volatility of financial markets, one of the most important issues in finance, especially with regard to modeling high-frequency data. Risk management, asset pricing and option valuation are the areas where the concept of volatility estimators (consistent, unbiased and maximally efficient) is of crucial concern. Our intention was to find the best estimator of true volatility, taking into account the latest investigations in the finance literature. Based on the methodology presented in Parkinson (1980), Garman and Klass (1980), Rogers and Satchell (1991), Yang and Zhang (2000), Andersen et al. (1997, 1998, 1999a, 1999b), Hansen and Lunde (2005, 2006b) and Martens (2007), we computed various model-free volatility estimators and compared them with the classical volatility estimator most often used in financial models. In order to reveal the information set hidden in high-frequency data, we utilized the concepts of realized volatility and realized range. In calculating our estimators, we focused carefully on Δ (the interval used in calculation), n (the memory of the process) and q (the scaling factor for scaled estimators). Our results revealed that the appropriate selection of Δ and n plays a crucial role for both the efficiency and the accuracy of an estimator. Among nine volatility estimators, we found that for optimal n (measured in days) and Δ (in minutes) we obtain the most efficient estimator. Our findings confirmed that the best estimator should include information contained not only in closing prices but in the price range as well (range estimators). More importantly, we examined the properties of the formulas themselves, independently of the interval used, by comparing estimators with the same Δ, n and q parameters. We observed that the formula of the volatility estimator matters less than the selection of the optimal parameters n and Δ. Finally, we examined the asymmetry between market turmoil and adjustments of volatility, and we stressed the implications of our results for well-known financial models that use the classical volatility estimator as their main input variable.
    Keywords: financial market volatility, high-frequency financial data, realized volatility and correlation, volatility forecasting, microstructure bias, the opening jump effect, the bid-ask bounce, autocovariance bias, daily patterns of volatility, emerging markets
    JEL: G14 G15 C61 C22
    Date: 2009
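Two of the estimator families compared in the abstract are simple enough to state directly: realized variance built from intraday returns, and the Parkinson (1980) range estimator built from daily highs and lows. These are textbook forms, shown here as a minimal sketch without the Δ, n and q tuning the paper studies:

```python
import numpy as np

def realized_variance(prices):
    """Realized variance: sum of squared intraday log returns over the day
    (a model-free estimator of integrated variance)."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))
    return float(np.sum(r ** 2))

def parkinson_variance(high, low):
    """Parkinson (1980) range-based daily variance estimator from
    high/low prices; the range carries more information than close-to-close
    returns alone."""
    hl = np.log(np.asarray(high, dtype=float) / np.asarray(low, dtype=float))
    return float(np.mean(hl ** 2) / (4.0 * np.log(2.0)))
```

In the paper's terms, Δ governs how finely `prices` is sampled inside the day and n how many days enter the average — and the abstract's headline finding is that those choices matter more than the estimator formula itself.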
  10. By: Gianluca Cubadda (Faculty of Economics, University of Rome "Tor Vergata"); Alain Hecq (Maastricht University)
    Abstract: This paper proposes a strategy to detect the presence of common serial correlation in high-dimensional systems. We show by simulations that univariate autocorrelation tests on the factors obtained by partial least squares outperform traditional tests based on canonical correlations.
    Keywords: Serial correlation common feature; high-dimensional systems; partial least squares.
    JEL: C32
    Date: 2009–12–04
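The two-step logic — extract a factor that concentrates the panel's serial correlation, then run a univariate autocorrelation test on it — can be sketched as below. The factor extraction here (dominant singular vector of the lag-one cross-covariance) is a stylised stand-in for the partial-least-squares factors of the paper:

```python
import numpy as np

def first_pls_factor(Y):
    """One-factor sketch: weight the series by the dominant left singular
    vector of the lag-one cross-covariance matrix, so the extracted factor
    concentrates the panel's serial correlation. A stylised stand-in for
    the paper's PLS factors."""
    Y = np.asarray(Y, dtype=float)
    Y = Y - Y.mean(axis=0)
    C = Y[1:].T @ Y[:-1] / (len(Y) - 1.0)   # lag-1 cross-covariance matrix
    u, _, _ = np.linalg.svd(C)
    return Y @ u[:, 0]

def ljung_box(x, m=4):
    """Ljung-Box Q statistic on the first m autocorrelations; compare with
    a chi-squared(m) critical value to test for serial correlation."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    denom = x @ x
    q = sum(((x[k:] @ x[:-k]) / denom) ** 2 / (n - k) for k in range(1, m + 1))
    return n * (n + 2.0) * q
```

The appeal in high dimensions is that only a univariate test is run on each factor, avoiding the canonical-correlation machinery that degrades as the system grows.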
  11. By: Tim M Christensen (Yale); Stan Hurn (QUT); Adrian Pagan (QUT)
    Abstract: This paper considers VAR/VECM models for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration reduces the rank of the long-run multiplier matrix, other types of common features lead to rank reduction in the short-run dynamics. These common transitory components arise when linear combinations of the first-differenced variables in a cointegrated VAR are white noise. This paper offers a reinterpretation of the traditional approach to testing for common feature dynamics, namely checking for a singular covariance matrix for the transitory components. Instead, the matrix of short-run coefficients becomes the focus of the testing procedure, thus allowing a wide range of tests for reduced rank in parameter matrices to be potentially relevant tests of common transitory components. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to reexamine an existing empirical study. Finally, this approach is applied to analyze whether one would observe common dynamics in standard DSGE models.
    Keywords: Transitory components, common features, reduced rank, cointegration.
    JEL: C14 C52
    Date: 2009–11–17
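The object at the centre of the reinterpretation — a rank deficit in the short-run coefficient matrix — is easy to inspect numerically. The SVD check below ignores estimation error, which the paper's formal reduced-rank tests account for; it is only a first diagnostic:

```python
import numpy as np

def numerical_rank(M, rtol=1e-10):
    """Numerical rank via singular values: count those above a relative
    tolerance. A rank deficit in the short-run coefficient matrix is what
    signals common transitory components; formal tests must also account
    for sampling error, which this sketch does not."""
    s = np.linalg.svd(np.asarray(M, dtype=float), compute_uv=False)
    return int(np.sum(s > rtol * s[0]))

# toy short-run coefficient matrix for a 3-variable system, built rank-2
A = (np.outer([1.0, 2.0, 3.0], [1.0, 0.0, 1.0])
     + np.outer([0.0, 1.0, 1.0], [0.0, 1.0, 0.0]))
```

Here `A` has rank 2, so one linear combination of the first-differenced variables would be white noise — a common transitory component in the paper's sense.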

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found on the NEP website. For comments please write to the director of NEP, Marco Novarese. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.