nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010‒05‒29
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. On the Forecasting Accuracy of Multivariate GARCH Models By Sébastien Laurent; Jeroen V.K. Rombouts; Francesco Violante
  2. VARs, Cointegration and Common Cycle Restrictions By Heather M Anderson; Farshid Vahid
  3. Likelihood inference for a fractionally cointegrated vector autoregressive model By Søren Johansen; Morten Ørregaard Nielsen
  4. Are Forecast Updates Progressive? By Chang, C-L.; Franses, Ph.H.B.F.; McAleer, M.J.
  5. Decomposing bias in expert forecast By Franses, Ph.H.B.F.
  6. Block Structure Multivariate Stochastic Volatility Models By Manabu Asai; Massimiliano Caporin; Michael McAleer
  7. Modelling and Forecasting Noisy Realized Volatility By Manabu Asai; Michael McAleer; Marcelo C. Medeiros
  8. Forecasting Realized Volatility with Linear and Nonlinear Univariate Models By Michael McAleer; Marcelo C. Medeiros
  9. Time Varying Dimension Models By Joshua C.C. Chan; Garry Koop; Roberto Leon Gonzales; Rodney W. Strachan
  10. Econometric analysis of high dimensional VARs featuring a dominant unit By M. Hashem Pesaran; Alexander Chudik
  11. Maximum likelihood estimation of factor models on data sets with arbitrary pattern of missing data By Marta Bańbura; Michele Modugno
  12. Multivariate heavy-tailed models for Value-at-Risk estimation By Carlo Marinelli; Stefano d'Addona; Svetlozar T. Rachev

  1. By: Sébastien Laurent; Jeroen V.K. Rombouts; Francesco Violante
    Abstract: This paper addresses the question of the selection of multivariate GARCH models in terms of variance matrix forecasting accuracy, with a particular focus on relatively large scale problems. We consider 10 assets from the NYSE and NASDAQ and compare 125 model-based one-step-ahead conditional variance forecasts over a period of 10 years using the model confidence set (MCS) and Superior Predictive Ability (SPA) tests. Model performance is evaluated using four statistical loss functions that account for different types and degrees of asymmetry with respect to over/under prediction. When considering the full sample, MCS results are strongly driven by short periods of high market instability during which multivariate GARCH models appear to be inaccurate. Over relatively unstable periods, e.g. the dot-com bubble, the set of superior models is composed of more sophisticated specifications such as orthogonal and dynamic conditional correlation (DCC), both with leverage effects in the conditional variances. However, unlike the DCC models, our results show that the orthogonal specifications tend to underestimate the conditional variance. Over calm periods, a simple assumption like constant conditional correlation and symmetry in the conditional variances cannot be rejected. Finally, during the 2007-2008 financial crisis, accounting for non-stationarity in the conditional variance process generates superior forecasts. The SPA test suggests that, independently of the period, the best models do not provide significantly better forecasts than the DCC model of Engle (2002) with leverage in the conditional variances of the returns.
    Keywords: Variance matrix, forecasting, multivariate GARCH, loss function, model confidence set, superior predictive ability
    JEL: C10 C32 C51 C52 C53 G10
    Date: 2010
    URL: http://d.repec.org/n?u=RePEc:lvl:lacicr:1021&r=ets
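Loss functions of the type the paper uses can be sketched directly; the function names and matrices below are illustrative, not the paper's exact definitions. A Frobenius-type loss is symmetric in the forecast error, while a QLIKE-type loss is asymmetric and penalizes under-prediction of the variance matrix more heavily:

```python
import numpy as np

def frobenius_loss(S, H):
    """Symmetric loss: squared Frobenius distance between proxy S and forecast H."""
    D = S - H
    return float(np.trace(D @ D.T))

def qlike_loss(S, H):
    """QLIKE-style loss: asymmetric, penalizing under-prediction more heavily."""
    return float(np.log(np.linalg.det(H)) + np.trace(np.linalg.inv(H) @ S))

S = np.array([[1.0, 0.3],      # illustrative realized-covariance proxy
              [0.3, 1.5]])
H_under = 0.5 * S              # forecast that under-predicts the variance matrix
H_over = 2.0 * S               # forecast that over-predicts it by the same factor
# qlike_loss ranks H_under as worse than H_over, whereas the Frobenius loss
# ranks whichever forecast has the larger absolute error as worse.
```

Ranking many candidate forecasts by such losses is the input to MCS/SPA-style comparisons.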
  2. By: Heather M Anderson; Farshid Vahid
    Abstract: This paper argues that VAR models with cointegration and common cycles can be usefully viewed as observable factor models. The factors are linear combinations of lagged levels and lagged differences, and as such, these observable factors have potential for forecasting. We illustrate this forecast potential in both a Monte Carlo and empirical setting, and demonstrate the difficulties in developing forecasting "rules of thumb" for forecasting in multivariate systems.
    Keywords: Common factors, Cross equation restrictions, Multivariate forecasting, Reduced rank models.
    JEL: C32 C53 E37
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2010-14&r=ets
  3. By: Søren Johansen (University of Copenhagen and CREATES); Morten Ørregaard Nielsen (Queen's University and CREATES)
    Abstract: We consider model-based inference in a fractionally cointegrated (or cofractional) vector autoregressive model based on the conditional Gaussian likelihood. The model allows the process X_{t} to be fractional of order d and cofractional of order d-b; that is, there exist vectors β for which β′X_{t} is fractional of order d-b. The parameters d and b satisfy either d≥b≥1/2, d=b≥1/2, or d=d_{0}≥b≥1/2. Our main technical contribution is the proof of consistency of the maximum likelihood estimators on the set 1/2≤b≤d≤d_{1} for any d_{1}≥d_{0}. To this end, we consider the conditional likelihood as a stochastic process in the parameters, and prove that it converges in distribution when errors are i.i.d. with suitable moment conditions and initial values are bounded. We then prove that the estimator of β is asymptotically mixed Gaussian and estimators of the remaining parameters are asymptotically Gaussian. We also find the asymptotic distribution of the likelihood ratio test for cointegration rank, which is a functional of fractional Brownian motion of type II.
    Keywords: Cofractional processes, cointegration rank, fractional cointegration, likelihood inference, vector autoregressive model
    JEL: C32
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1237&r=ets
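The fractional order d in such models acts through the fractional difference operator (1-L)^d, whose coefficients follow a simple recursion; a minimal sketch under the type II convention of zero pre-sample values (the helper names are mine, not the paper's):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Coefficients w_k of (1-L)^d = sum_k w_k L^k, truncated at lag n."""
    w = np.empty(n + 1)
    w[0] = 1.0
    for k in range(1, n + 1):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1-L)^d to a series x, treating pre-sample values as zero (type II)."""
    w = frac_diff_weights(d, len(x) - 1)
    return np.array([w[: t + 1][::-1] @ x[: t + 1] for t in range(len(x))])

x = np.array([1.0, 3.0, 6.0])
d1 = frac_diff(x, 1.0)   # -> [1., 2., 3.]: d = 1 reduces to the ordinary first difference
```

For 0 < d < 1 the weights decay hyperbolically, which is what generates the long-memory behaviour these models exploit.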
  4. By: Chang, C-L.; Franses, Ph.H.B.F.; McAleer, M.J.
    Abstract: Macro-economic forecasts typically involve both a model component, which is replicable, and intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, they should become more accurate, on average, as the actual value is approached; otherwise, forecast updates are neutral. The paper proposes a methodology to test whether forecast updates are progressive and whether econometric models are useful in updating forecasts. The data set for the empirical analysis is for Taiwan, where three decades of quarterly data are available on forecasts and updates of the inflation rate and the real GDP growth rate. The actual series for both the inflation rate and the real GDP growth rate are always released by the government one quarter after the release of the revised forecast, and the actual values are not revised after they have been released. Our empirical results suggest that the forecast updates for Taiwan are progressive, and can be explained predominantly by intuition. Additionally, the one-, two- and three-quarter forecast errors are predictable using publicly available information for both the inflation rate and the real GDP growth rate, which suggests that the forecasts can be improved.
    Keywords: macro-economic forecasts;econometric models;intuition;initial forecast;primary forecast;revised forecast;actual value;progressive forecast updates;forecast errors
    Date: 2010–04–29
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765019358&r=ets
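The notion of progressive updates can be checked mechanically: on simulated data where each revision shrinks the error variance, the mean squared error should fall monotonically as the actual value is approached. A stylized sketch (not the paper's actual test; all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 120                                   # e.g. 30 years of quarterly observations
actual = rng.normal(2.0, 1.0, n)          # actual values, never revised

# Forecast updates whose error variance shrinks as the release date approaches.
f3 = actual + rng.normal(0.0, 0.9, n)     # three quarters ahead
f2 = actual + rng.normal(0.0, 0.6, n)     # two quarters ahead
f1 = actual + rng.normal(0.0, 0.3, n)     # one quarter ahead (revised forecast)

mse = [float(np.mean((f - actual) ** 2)) for f in (f3, f2, f1)]
progressive = mse[0] > mse[1] > mse[2]    # monotonically improving updates
```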
  5. By: Franses, Ph.H.B.F.
    Abstract: Forecasts in the airline industry are often based in part on statistical models but mostly on expert judgment. It is frequently documented in the forecasting literature that expert forecasts are biased but that their accuracy is higher than that of model forecasts. If an expert forecast can be approximated by the weighted sum of a part that can be replicated by an analyst and a non-replicable part containing managerial intuition, the question arises which of the two causes the bias. This paper advocates a simple regression-based strategy to decompose bias in expert forecasts. An illustration of the method on a unique database of airline revenues shows how it can be used to improve the experts’ forecasts.
    Keywords: expert forecasts;forecast bias;airline revenues
    Date: 2010–04–29
    URL: http://d.repec.org/n?u=RePEc:dgr:eureir:1765019359&r=ets
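The decomposition idea can be illustrated with a stylized additive version: if the replicable part is identified with an observable model forecast m, the expert's bias splits exactly into a model part and an intuition part. The paper's regression-based strategy is more general; all names and numbers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
y = rng.normal(0.0, 1.0, n)               # actual values
m = y + rng.normal(0.0, 0.5, n)           # replicable model forecast (unbiased)
e = 0.3 + m + rng.normal(0.0, 0.3, n)     # expert forecast: model part plus biased intuition

intuition = e - m                          # non-replicable part of the expert forecast
total_bias = float(np.mean(e - y))         # bias of the expert forecast
model_bias = float(np.mean(m - y))         # bias attributable to the replicable part
intuition_bias = float(np.mean(intuition)) # bias attributable to intuition
# By construction: total_bias == model_bias + intuition_bias.
```

Here the bias is traced to intuition, which mirrors the kind of diagnosis the paper's method delivers.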
  6. By: Manabu Asai; Massimiliano Caporin; Michael McAleer (University of Canterbury)
    Abstract: Most multivariate variance models suffer from a common problem, the “curse of dimensionality”. For this reason, most are fitted under strong parametric restrictions that reduce the interpretation and flexibility of the models. Recently, the literature has focused on multivariate models with milder restrictions, whose purpose is to balance the need for interpretability and efficiency faced by model users against the computational problems that may emerge when the number of assets is quite large. We contribute to this strand of the literature by proposing a block-type parameterization for multivariate stochastic volatility models.
    Keywords: Block structures; multivariate stochastic volatility; curse of dimensionality
    JEL: C32 C51 C10
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/24&r=ets
  7. By: Manabu Asai; Michael McAleer (University of Canterbury); Marcelo C. Medeiros
    Abstract: Several methods have recently been proposed in the ultra high frequency financial literature to remove the effects of microstructure noise and to obtain consistent estimates of the integrated volatility (IV) as a measure of ex-post daily volatility. Even bias-corrected and consistent (modified) realized volatility (RV) estimates of the integrated volatility can contain residual microstructure noise and other measurement errors. Such noise is called “realized volatility error”. Because such measurement errors are typically ignored, we need to take account of them in estimating and forecasting IV. This paper investigates through Monte Carlo simulations the effects of RV errors on estimating and forecasting IV with RV data. It is found that: (i) neglecting RV errors can lead to serious bias in estimators due to model misspecification; (ii) the effects of RV errors on one-step-ahead forecasts are minor when consistent estimators are used and when the number of intraday observations is large; and (iii) even the partially corrected R² recently proposed in the literature should be fully corrected for evaluating forecasts. This paper proposes a full correction of R², which can be applied to linear and nonlinear, short and long memory models. An empirical example for S&P 500 data is used to demonstrate that neglecting RV errors can lead to serious bias in estimating the model of integrated volatility, and that the new method proposed here can eliminate the effects of the RV noise. The empirical results also show that the full correction of R² is necessary for an accurate description of goodness-of-fit.
    Keywords: Realized volatility; diffusion; financial econometrics; measurement errors; forecasting; model evaluation; goodness-of-fit
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/21&r=ets
  8. By: Michael McAleer (University of Canterbury); Marcelo C. Medeiros
    Abstract: In this paper we consider a nonlinear model based on neural networks as well as linear models to forecast the daily volatility of the S&P 500 and FTSE 100 futures. As a proxy for daily volatility, we consider a consistent and unbiased estimator of the integrated volatility that is computed from high frequency intra-day returns. We also consider a simple algorithm based on bagging (bootstrap aggregation) in order to specify the models analyzed.
    Keywords: Financial econometrics; volatility forecasting; neural networks; nonlinear models; realized volatility; bagging
    Date: 2010–05–01
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:10/28&r=ets
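The realized volatility proxy referred to above is simply the square root of the sum of squared intraday returns; a minimal sketch on simulated returns (the sampling frequency and volatility level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)
n_days, n_intraday = 5, 78                # e.g. 78 five-minute returns per trading day
sigma_day = 0.01                          # true daily volatility of the simulated price

# Simulated intraday log-returns, scaled so each day's variance totals sigma_day**2.
r = rng.normal(0.0, sigma_day / np.sqrt(n_intraday), size=(n_days, n_intraday))

rv = (r ** 2).sum(axis=1)                 # realized variance, one value per day
realized_vol = np.sqrt(rv)                # realized volatility proxy used as the target
```

A series like `realized_vol` is what the linear and nonlinear (e.g. neural network) models in the paper are fitted to.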
  9. By: Joshua C.C. Chan; Garry Koop; Roberto Leon Gonzales; Rodney W. Strachan
    Abstract: Time varying parameter (TVP) models have enjoyed increasing popularity in empirical macroeconomics. However, TVP models are parameter-rich and risk over-fitting unless the dimension of the model is small. Motivated by this concern, this paper proposes several time varying dimension (TVD) models in which the dimension of the model can change over time, allowing the model to automatically choose a more parsimonious TVP representation, or to switch between different parsimonious representations. Our TVD models all fall in the category of dynamic mixture models. We discuss the properties of these models and present methods for Bayesian inference. An application involving US inflation forecasting illustrates and compares the different TVD models. We find that our TVD approaches exhibit better forecasting performance than several standard benchmarks and shrink towards parsimonious specifications.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:acb:cbeeco:2010-523&r=ets
  10. By: M. Hashem Pesaran (Cambridge University, Faculty of Economics, Austin Robinson Building, Sidgwick Avenue, Cambridge, CB3 9DD, United Kingdom.); Alexander Chudik (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: This paper extends the analysis of infinite dimensional vector autoregressive models (IVAR) proposed in Chudik and Pesaran (2010) to the case where one of the variables or the cross section units in the IVAR model is dominant or pervasive. This extension is not straightforward and involves several technical difficulties. The dominant unit influences the rest of the variables in the IVAR model both directly and indirectly, and its effects do not vanish even as the dimension of the model (N) tends to infinity. The dominant unit acts as a dynamic factor in the regressions of the non-dominant units and yields an infinite order distributed lag relationship between the two types of units. Despite this, it is shown that the effects of the dominant unit as well as those of the neighborhood units can be consistently estimated by running augmented least squares regressions that include distributed lag functions of the dominant unit. The asymptotic distribution of the estimators is derived and their small sample properties investigated by means of Monte Carlo experiments. JEL Classification: C10, C33, C51.
    Keywords: IVAR Models, Dominant Units, Large Panels, Weak and Strong Cross Section Dependence, Factor Models.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20101194&r=ets
  11. By: Marta Bańbura (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.); Michele Modugno (European Central Bank, Kaiserstrasse 29, 60311 Frankfurt am Main, Germany.)
    Abstract: In this paper we propose a methodology to estimate a dynamic factor model on data sets with an arbitrary pattern of missing data. We modify the Expectation Maximisation (EM) algorithm as proposed for a dynamic factor model by Watson and Engle (1983) to the case of a general pattern of missing data. We also extend the model to the case of a serially correlated idiosyncratic component. The framework allows one to handle efficiently and automatically sets of indicators characterized by different publication delays, frequencies and sample lengths. This can be relevant, e.g., for young economies for which many indicators have been compiled only recently. We also show how to extract model-based news from a statistical data release within our framework, and we derive the relationship between the news and the resulting forecast revision. This can be used for interpretation, e.g., in nowcasting applications, as it allows one to determine the sign and size of the news as well as its contribution to the revision, in particular in the case of simultaneous data releases. We evaluate the methodology in a Monte Carlo experiment and apply it to nowcasting and backdating of euro area GDP. JEL Classification: C53, E37.
    Keywords: Factor Models, Forecasting, Large Cross-Sections, Missing data, EM algorithm.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:ecb:ecbwps:20101189&r=ets
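As a stylized illustration of EM-type estimation under an arbitrary missing-data pattern, the sketch below alternates between imputing missing entries and fitting a rank-1 factor approximation on simulated data. The authors' method is a full state-space EM with a Kalman smoother, so this is only a minimal analogue; all names and settings here are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 8
f = rng.normal(size=T)                     # latent factor
lam = rng.normal(size=N)                   # factor loadings
X = np.outer(f, lam) + 0.1 * rng.normal(size=(T, N))

mask = rng.random((T, N)) < 0.15           # arbitrary pattern: ~15% missing entries
X_obs = np.where(mask, np.nan, X)

# EM-style iteration: impute, fit a rank-1 factor approximation, re-impute.
col_means = np.nanmean(X_obs, axis=0)
Z = np.where(mask, col_means, X)           # start from column-mean imputation
err_start = np.sqrt(np.mean((Z[mask] - X[mask]) ** 2))
for _ in range(50):
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    fit = s[0] * np.outer(U[:, 0], Vt[0])  # rank-1 reconstruction (E-step analogue)
    Z = np.where(mask, fit, X)             # keep observed data, update missing entries
err_em = np.sqrt(np.mean((Z[mask] - X[mask]) ** 2))
```

Because the data have a one-factor structure, the iterated fit recovers the missing entries far better than the mean-imputation starting point.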
  12. By: Carlo Marinelli; Stefano d'Addona; Svetlozar T. Rachev
    Abstract: For purposes of Value-at-Risk estimation, we consider three multivariate families of heavy-tailed distributions, which can be seen as multidimensional versions of Paretian stable and Student's t distributions allowing different marginals to have different tail thickness. After a discussion of relevant estimation and simulation issues, we conduct a backtesting study on a set of portfolios containing derivative instruments, using historical US stock price data.
    Date: 2010–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1005.2862&r=ets
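A minimal Monte Carlo sketch of Value-at-Risk under a multivariate Student's t distribution, assuming an illustrative correlation matrix and a single common degrees-of-freedom parameter (the families in the paper are richer and allow each marginal its own tail thickness):

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, nu = 100_000, 5                    # 5 degrees of freedom: heavy tails

corr = np.array([[1.0, 0.3, 0.2],          # illustrative correlation matrix
                 [0.3, 1.0, 0.4],
                 [0.2, 0.4, 1.0]])
vols = np.array([0.010, 0.015, 0.020])     # illustrative daily volatilities
cov = corr * np.outer(vols, vols)

# Multivariate t draws: correlated Gaussians scaled by a chi-square mixing variable.
L = np.linalg.cholesky(cov)
z = rng.normal(size=(n_sims, 3)) @ L.T
mix = np.sqrt(nu / rng.chisquare(nu, size=n_sims))
returns = z * mix[:, None]

weights = np.array([0.4, 0.3, 0.3])        # portfolio weights
pnl = returns @ weights
var_95 = -np.quantile(pnl, 0.05)           # 95% one-period Value-at-Risk
var_99 = -np.quantile(pnl, 0.01)
```

Backtesting then amounts to checking how often realized portfolio losses exceed these VaR levels.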

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.