nep-ets New Economics Papers
on Econometric Time Series
Issue of 2007‒08‒08
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. An Alternative System GMM Estimation in Dynamic Panel Models By Housung Jung; Hyeog Ug Kwon
  2. Modelling Volatilities and Conditional Correlations in Futures Markets with a Multivariate t Distribution By Bahram Pesaran; M. Hashem Pesaran
  3. Estimating High-Frequency Based (Co-) Variances: A Unified Approach By Ingmar Nolte; Valeri Voev
  4. Two canonical VARMA forms: Scalar component models vis-à-vis the Echelon form By George Athanasopoulos; D.S. Poskitt; Farshid Vahid
  5. Optimal combination forecasts for hierarchical time series By Rob J. Hyndman; Roman A. Ahmed; George Athanasopoulos
  6. "The Conditional Limited Information Maximum Likelihood Approach to Dynamic Panel Structural Equations" By Naoto Kunitomo; Kentaro Akashi
  7. "Block Sampler and Posterior Mode Estimation for A Nonlinear and Non-Gaussian State-space Model with Correlated Errors" By Yasuhiro Omori; Toshiaki Watanabe
  8. Forecasting VARMA processes using VAR models and subspace-based state space models By Izquierdo, Segismundo S.; Hernández, Cesáreo; del Hoyo, Juan
  9. Robust M-estimation of multivariate conditionally heteroscedastic time series models with elliptical innovations By Boudt, Kris; Croux, Christophe

  1. By: Housung Jung; Hyeog Ug Kwon
    Abstract: The system GMM estimator in dynamic panel data models, which combines two sets of moment conditions, i.e., those for the differenced equation and those for the model in levels, is known to be more efficient than the first-difference GMM estimator. However, an initial optimal weight matrix is not known for the system estimation procedure. We therefore suggest the use of 'a suboptimal weight matrix', which may reduce the finite sample bias whilst increasing efficiency. Using the Kantorovich inequality, we find that the potential efficiency gain becomes large when the variance of the individual effects increases relative to the variance of the idiosyncratic errors. Our Monte Carlo experiments show that the small sample properties of the suboptimal system estimator are much more reliable than those of any other conventional system GMM estimator in terms of bias and efficiency.
    Keywords: Dynamic panel data, sub-optimal weighting matrix, KI upper bound
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d07-217&r=ets
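    Illustrative sketch (editor's addition, not the paper's estimator or code): the Python fragment below shows, for a simulated panel AR(1) with individual effects, how system GMM stacks moment conditions for the differenced equation (instrumented by a lagged level) with moment conditions for the level equation (instrumented by a lagged difference). The single-instrument-per-period choice, the identity first-step weight matrix and the grid search are simplifying assumptions made only to keep the sketch short.
      import numpy as np

      rng = np.random.default_rng(0)
      N, T, rho = 500, 6, 0.5
      eta = rng.normal(0, 1, N)                              # individual effects
      y = np.zeros((N, T))
      y[:, 0] = eta / (1 - rho) + rng.normal(0, 1, N)        # start at the stationary mean given eta
      for t in range(1, T):
          y[:, t] = rho * y[:, t - 1] + eta + rng.normal(0, 1, N)

      def moments(rho_hat):
          # For each t >= 2, stack a differenced-equation moment (instrument y_{i,t-2})
          # and a level-equation moment (instrument dy_{i,t-1}).
          g = []
          for t in range(2, T):
              dy_t, dy_t1 = y[:, t] - y[:, t - 1], y[:, t - 1] - y[:, t - 2]
              g.append(y[:, t - 2] * (dy_t - rho_hat * dy_t1))       # difference moments
              g.append(dy_t1 * (y[:, t] - rho_hat * y[:, t - 1]))    # level moments
          return np.column_stack(g)                                  # N x 2(T-2)

      def objective(rho_hat, W):
          gbar = moments(rho_hat).mean(axis=0)
          return gbar @ W @ gbar

      grid = np.linspace(-0.9, 0.95, 371)
      W1 = np.eye(2 * (T - 2))                                       # crude first-step weight matrix
      rho1 = grid[np.argmin([objective(r, W1) for r in grid])]
      W2 = np.linalg.inv(np.cov(moments(rho1), rowvar=False))        # second-step weight matrix
      rho2 = grid[np.argmin([objective(r, W2) for r in grid])]
      print(f"one-step rho: {rho1:.3f}, two-step rho: {rho2:.3f}")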
  2. By: Bahram Pesaran (Wadhwani Asset Management, LLP); M. Hashem Pesaran (CIMF, Cambridge University, GSA Capital and IZA)
    Abstract: This paper considers a multivariate t version of the Gaussian dynamic conditional correlation (DCC) model proposed by Engle (2002), and suggests the use of devolatized returns computed as returns standardized by realized volatilities rather than by GARCH type volatility estimates. The t-DCC estimation procedure is applied to a portfolio of daily returns on currency futures, government bonds and equity index futures. The results strongly reject the normal-DCC model in favour of a t-DCC specification. The t-DCC model also passes a number of VaR diagnostic tests over an evaluation sample. The estimation results suggest a general trend towards a lower level of return volatility, accompanied by a rising trend in conditional cross correlations in most markets; possibly reflecting the advent of the euro in 1999 and the increased interdependence of financial markets.
    Keywords: volatilities and correlations, futures market, multivariate t, financial interdependence, VaR diagnostics
    JEL: C51 C52 G11
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp2906&r=ets
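    Illustrative sketch (editor's addition, not the authors' code): the fragment below computes "devolatized" returns in the sense described above, i.e. returns standardized by a realized-volatility estimate rather than by a fitted GARCH volatility. Here realized volatility is proxied by a rolling mean of squared daily returns on simulated data, whereas the paper builds it from higher-frequency information; the column names are purely illustrative.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(1)
      dates = pd.bdate_range("2000-01-03", periods=1500)
      # Simulated fat-tailed daily returns for two futures contracts with slowly varying volatility.
      vol = 0.01 * np.exp(0.3 * np.sin(np.linspace(0, 20, len(dates))))[:, None]
      returns = pd.DataFrame(rng.standard_t(df=6, size=(len(dates), 2)) * vol,
                             index=dates, columns=["currency_fut", "equity_fut"])

      window = 20                                            # trading days in the realized-volatility proxy
      realized_vol = returns.pow(2).rolling(window).mean().pow(0.5)
      devolatized = (returns / realized_vol).dropna()

      # Devolatized returns are roughly unit-variance and closer to Gaussian; the
      # t-DCC correlation dynamics are then fitted to these standardized series.
      print(devolatized.std())
      print(devolatized.corr())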
  3. By: Ingmar Nolte (University of Konstanz); Valeri Voev (University of Konstanz)
    Abstract: We propose a unified framework for estimating integrated variances and covariances based on simple OLS regressions, allowing for a general market microstructure noise specification. We show that our estimators can outperform, in terms of the root mean squared error criterion, the most recent and commonly applied estimators, such as the realized kernels of Barndorff-Nielsen, Hansen, Lunde & Shephard (2006), the two-scales realized variance of Zhang, Mykland & Aït-Sahalia (2005), the Hayashi & Yoshida (2005) covariance estimator, and the realized variance and covariance with the optimal sampling frequency chosen according to Bandi & Russell (2005a) and Bandi & Russell (2005b). The power of our methodology stems from the fact that, instead of trying to correct the realized quantities for the noise, we identify both the true underlying integrated moments and the moments of the noise, which are also estimated within our framework. Apart from being simple to implement, an important property of our estimators is that they are quite robust to misspecifications of the noise process.
    Keywords: High frequency data, Realized volatility and covariance, Market microstructure
    JEL: G10 F31 C32
    Date: 2007–07–26
    URL: http://d.repec.org/n?u=RePEc:knz:cofedp:0707&r=ets
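    Illustrative sketch (editor's addition, not the authors' estimator): with i.i.d. market microstructure noise, the expected realized variance built from m intraday returns equals the integrated variance plus 2m times the noise variance, so a simple OLS regression of realized variance on m across sampling frequencies identifies both quantities at once. The fragment below demonstrates this regression idea on simulated noisy prices; the parameter values are arbitrary.
      import numpy as np

      rng = np.random.default_rng(2)
      n_days, n = 50, 23400                  # 50 trading days, one observed price per second
      iv = 1e-4                              # daily integrated variance of the efficient price
      noise_sd = 5e-4                        # standard deviation of the microstructure noise

      steps = np.array([1, 2, 5, 10, 20, 30, 60, 120, 300])
      rv = np.zeros((n_days, len(steps)))
      for d in range(n_days):
          efficient = np.cumsum(rng.normal(0.0, np.sqrt(iv / n), n))
          observed = efficient + rng.normal(0.0, noise_sd, n)   # noisy observed log-price
          for j, s in enumerate(steps):
              rv[d, j] = np.sum(np.diff(observed[::s]) ** 2)    # realized variance at sampling step s

      m = np.array([(n - 1) // s for s in steps])               # number of returns per day at each step
      slope, intercept = np.polyfit(m, rv.mean(axis=0), 1)      # OLS of average RV_m on m
      print(f"estimated integrated variance: {intercept:.2e} (true {iv:.2e})")
      print(f"estimated noise variance     : {slope / 2:.2e} (true {noise_sd ** 2:.2e})")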
  4. By: George Athanasopoulos; D.S. Poskitt; Farshid Vahid
    Abstract: In this paper we study two methodologies which identify and specify canonical form VARMA models: (i) an extension of the scalar component methodology, which specifies canonical VARMA models by identifying scalar components through canonical correlation analysis, and (ii) the Echelon form methodology, which specifies canonical VARMA models through the estimation of Kronecker indices. We compare the two canonical forms and their methodologies on three levels. First, we present a theoretical comparison. Second, we present a Monte Carlo simulation study that compares the performance of the two methodologies in identifying some pre-specified data generating processes. Lastly, we compare the out-of-sample forecast performance of the two forms when models are fitted to real macroeconomic data.
    Keywords: Echelon form, Identification, Multivariate time series, Scalar component, VARMA model.
    JEL: C32 C51
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2007-10&r=ets
  5. By: Rob J. Hyndman; Roman A. Ahmed; George Athanasopoulos
    Abstract: In many applications, there are multiple time series that are hierarchically organized and can be aggregated at several different levels in groups based on products, geography or some other features. We call these "hierarchical time series". They are commonly forecast using either a "bottom-up" or a "top-down" method. In this paper we propose a new approach to hierarchical forecasting which provides optimal forecasts that are better than forecasts produced by either a top-down or a bottom-up approach. Our method is based on independently forecasting all series at all levels of the hierarchy and then using a regression model to optimally combine and reconcile these forecasts. The resulting revised forecasts add up appropriately across the hierarchy, are unbiased and have minimum variance amongst all combination forecasts under some simple assumptions. We show in a simulation study that our method performs well compared to the top-down approach and the bottom-up method. It also allows us to construct prediction intervals for the resultant forecasts. Finally, we apply the method to forecasting Australian tourism demand where the data are disaggregated by purpose of visit and geographical region.
    Keywords: Bottom-up forecasting, combining forecasts, GLS regression, hierarchical forecasting, Moore-Penrose inverse, reconciling forecasts, top-down forecasting.
    JEL: C53 C32 C23
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2007-9&r=ets
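    Illustrative sketch (editor's addition): the fragment below shows the combination step for the simplest possible hierarchy (Total = A + B), using ordinary least squares to project independently produced base forecasts onto the set of forecasts that add up across the hierarchy; the paper's estimator is the GLS version of this projection, computed with a Moore-Penrose inverse. The numbers are made up for illustration.
      import numpy as np

      # Summing matrix S: rows are [Total, A, B], columns are the bottom-level series [A, B].
      S = np.array([[1.0, 1.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])

      # Independent base forecasts for [Total, A, B]; note they are incoherent (100 != 58 + 45).
      base = np.array([100.0, 58.0, 45.0])

      # OLS combination: reconciled bottom-level forecasts beta = (S'S)^{-1} S' base,
      # revised forecasts for all levels = S beta, which add up by construction.
      beta = np.linalg.solve(S.T @ S, S.T @ base)
      reconciled = S @ beta

      print("bottom level:", beta)          # [57., 44.]
      print("all levels  :", reconciled)    # [101., 57., 44.]  -> Total equals A + B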
  6. By: Naoto Kunitomo (Faculty of Economics, University of Tokyo); Kentaro Akashi (Graduate School of Economics, University of Tokyo)
    Abstract: We propose the conditional limited information maximum likelihood (CLIML) approach for estimating dynamic panel structural equation models. When there are dynamic effects and endogenous variables with individual effects at the same time, the CLIML estimation method applied to the doubly-filtered data not only gives consistent estimates, but also attains asymptotic efficiency when the number of orthogonality conditions is large. Our formulation includes Alvarez and Arellano (2003), Blundell and Bond (2000) and other linear dynamic panel models as special cases.
    Date: 2007–07
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2007cf503&r=ets
  7. By: Yasuhiro Omori (Faculty of Economics, University of Tokyo); Toshiaki Watanabe (Institute of Economic Research, Hitotsubashi University)
    Abstract: This article introduces a new efficient simulation smoother and disturbance smoother for general state-space models in which the error terms of the measurement and state equations are correlated. The state vector is divided into several blocks, each consisting of many state variables. For each block, the corresponding disturbances are sampled simultaneously from their conditional posterior distribution. The algorithm is based on a multivariate normal approximation of the conditional posterior density and exploits a conventional simulation smoother for a linear and Gaussian state-space model. The performance of our method is illustrated using two examples: (1) stochastic volatility models with leverage effects and (2) stochastic volatility models with leverage effects and state-dependent variances. The popular single-move sampler, which samples one state variable at a time, is also run for comparison in the first example. Our proposed sampler is shown to produce a considerable improvement in the mixing of the Markov chain Monte Carlo sampler.
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2007cf508&r=ets
  8. By: Izquierdo, Segismundo S.; Hernández, Cesáreo; del Hoyo, Juan
    Abstract: VAR modelling is a frequently used technique in econometrics for linear processes. It offers desirable features such as relatively simple procedures for model specification (order selection) and the possibility of obtaining quick, non-iterative maximum likelihood estimates of the system parameters. However, if the process under study follows a finite-order VARMA structure, it cannot be represented exactly by any finite-order VAR model. A finite-order state space model, on the other hand, can represent a finite-order VARMA process exactly, and, for state-space modelling, subspace algorithms allow for quick, non-iterative estimates of the system parameters as well as simple specification procedures. Given these facts, we check in this paper whether subspace-based state space models provide better forecasts than VAR models when working with VARMA data generating processes. In a simulation study we generate samples from different VARMA data generating processes, obtain VAR-based and state-space-based models for each generating process, and compare the predictive power of the resulting models. Different specification and estimation algorithms are considered; in particular, within the subspace family, the CCA (canonical correlation analysis) algorithm is the selected option for obtaining state-space models. Our results indicate that when the MA parameter of an ARMA process is close to 1, the CCA state space models are likely to provide better forecasts than the AR models. We also conduct a practical comparison (for two cointegrated economic time series) of the predictive power of Johansen restricted-VAR (VEC) models with that of state space models obtained by the CCA subspace algorithm, including a density forecasting analysis.
    Keywords: subspace algorithms; VAR; forecasting; cointegration; Johansen; CCA
    JEL: C53 C5 C51
    Date: 2006–10
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:4235&r=ets
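    Illustrative sketch (editor's addition, not the paper's experiment): the fragment below reproduces the flavour of the comparison for a univariate ARMA(1,1) whose MA parameter is close to 1, pitting a truncated AR approximation against an exact state-space (ARMA) fit in a rolling one-step forecast exercise. The CCA subspace algorithm used in the paper is not available in statsmodels, so a maximum-likelihood state-space fit stands in for it; the lag order, sample sizes and parameter values are arbitrary choices.
      import numpy as np
      from statsmodels.tsa.arima_process import ArmaProcess
      from statsmodels.tsa.ar_model import AutoReg
      from statsmodels.tsa.arima.model import ARIMA

      rng = np.random.default_rng(3)
      # ARMA(1,1): y_t = 0.5 y_{t-1} + e_t + 0.95 e_{t-1}   (MA parameter near 1)
      proc = ArmaProcess(ar=[1, -0.5], ma=[1, 0.95])
      y = proc.generate_sample(nsample=550, distrvs=rng.standard_normal)
      n_train = 500

      err_ar, err_ss = [], []
      for h in range(n_train, len(y)):
          history = y[:h]
          ar_fit = AutoReg(history, lags=4).fit()            # truncated AR(4) approximation
          ss_fit = ARIMA(history, order=(1, 0, 1)).fit()     # exact state-space ARMA(1,1)
          err_ar.append(y[h] - ar_fit.forecast(1)[0])
          err_ss.append(y[h] - ss_fit.forecast(1)[0])

      print("AR(4)     one-step RMSE:", np.sqrt(np.mean(np.square(err_ar))))
      print("ARMA(1,1) one-step RMSE:", np.sqrt(np.mean(np.square(err_ss))))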
  9. By: Boudt, Kris; Croux, Christophe
    Abstract: This paper proposes new methods for the econometric analysis of outlier contaminated multivariate conditionally heteroscedastic time series. Robust alternatives to the Gaussian quasi-maximum likelihood estimator are presented. Under elliptical symmetry of the innovation vector, consistency results for M-estimation of the general conditional heteroscedasticity model are obtained. We also propose a robust estimator for the cross-correlation matrix and a diagnostic check for correct specification of the innovation density function. In a Monte Carlo experiment, the effect of outliers on different types of M-estimators is studied. We conclude with a financial application in which these new tools are used to analyse and estimate the symmetric BEKK model for the 1980-2006 series of weekly returns on the Nasdaq and NYSE composite indices. For this dataset, robust estimators are needed to cope with the outlying returns corresponding to the stock market crash in 1987 and the burst of the dotcom bubble in 2000.
    Keywords: conditional heteroscedasticity; M-estimators; multivariate time series; outliers; quasi-maximum likelihood; robust methods
    JEL: C51 C13 C53 C32
    Date: 2007–07–27
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:4271&r=ets
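    Illustrative sketch (editor's addition, not the paper's estimator): the fragment below contrasts the outlier-sensitive sample covariance with a simple M-type scatter estimate that uses Student-t weights to downweight observations with large Mahalanobis distances. The paper embeds this kind of reweighting in the estimation of multivariate conditional heteroscedasticity (BEKK) models rather than applying it to an unconditional covariance, so this is only the static analogue of the idea.
      import numpy as np

      rng = np.random.default_rng(4)
      true_cov = np.array([[1.0, 0.8],
                           [0.8, 1.0]])
      x = rng.multivariate_normal([0.0, 0.0], true_cov, size=1000)
      x[:10] *= 12.0                                   # a handful of crash-like outliers

      def t_m_scatter(data, df=4, n_iter=100):
          """Fixed-point iteration for an M-estimate of scatter with Student-t weights."""
          n, p = data.shape
          cov = np.cov(data, rowvar=False)
          for _ in range(n_iter):
              d2 = np.einsum("ij,jk,ik->i", data, np.linalg.inv(cov), data)  # squared Mahalanobis distances
              w = (df + p) / (df + d2)                 # small weights for outlying points
              cov = (w[:, None] * data).T @ data / n
          return cov

      print("sample covariance:\n", np.cov(x, rowvar=False))
      print("robust M scatter :\n", t_m_scatter(x))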

This nep-ets issue is ©2007 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.