nep-ets New Economics Papers
on Econometric Time Series
Issue of 2011‒07‒02
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. Ranking Multivariate GARCH Models by Problem Dimension: An Empirical Evaluation By Massimiliano Caporin; Michael McAleer
  2. On the Nonparametric Tests of Univariate GARCH Regression Models By Wasel Shadat
  3. Forecasting Contemporaneous Aggregates with Stochastic Aggregation Weights By Ralf Brüggemann; Helmut Lütkepohl
  4. Large Vector Auto Regressions By Song Song; Peter J. Bickel
  5. Dynamic Large Spatial Covariance Matrix Estimation in Application to Semiparametric Model Construction via Variable Clustering: the SCE approach By Song Song
  6. Hierarchical shrinkage in time-varying parameter models By Miguel Belmonte; Gary Koop; Dimitris Korobilis
  7. Parametric inference and forecasting in continuously invertible volatility models By Olivier Wintenberger; Sixiang Cai
  8. Modelling Long Memory in REITs By John Cotter
  9. Multivariate Modelling of Daily REIT Volatility By John Cotter; Simon Stevenson

  1. By: Massimiliano Caporin (Dipartimento di Scienze Economiche "Marco Fanno" (Department of Economics and Management), Università degli Studi di Padova); Michael McAleer (Econometrisch Instituut (Econometric Institute), Faculteit der Economische Wetenschappen (Erasmus School of Economics), Erasmus Universiteit, Tinbergen Instituut (Tinbergen Institute).)
    Abstract: In the last 15 years, several Multivariate GARCH (MGARCH) models have appeared in the literature. Recent research has begun to examine MGARCH specifications in terms of their out-of-sample forecasting performance. In this paper, we provide an empirical comparison of a set of models, namely BEKK, DCC, Corrected DCC (cDCC) of Aielli (2008), CCC, Exponentially Weighted Moving Average, and covariance shrinking, using historical data on 89 US equities. Our methods follow part of the approach described in Patton and Sheppard (2009), and the paper contributes to the literature in several directions. First, we consider a wide range of models, including the recent cDCC model and covariance shrinking. Second, we use a range of tests and approaches for direct and indirect model comparison, including the Weighted Likelihood Ratio test of Amisano and Giacomini (2007). Third, we examine how the model rankings are influenced by the cross-sectional dimension of the problem.
    Keywords: Covariance forecasting, model confidence set, model ranking, MGARCH, model comparison.
    JEL: C32 C53 C52
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1120&r=ets
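    Among the specifications compared above, the Exponentially Weighted Moving Average is the simplest to reproduce. A minimal sketch of a RiskMetrics-style EWMA covariance forecast (a generic illustration, not the authors' implementation; the decay factor 0.94 is the conventional daily choice, not a value from the paper):

```python
import numpy as np

def ewma_covariance(returns, lam=0.94):
    """One-step-ahead EWMA covariance forecast.

    Recursion: S_t = lam * S_{t-1} + (1 - lam) * r_t r_t'.
    `lam` is the decay factor; 0.94 is the usual daily convention."""
    T, N = returns.shape
    S = np.cov(returns, rowvar=False)   # initialize with the sample covariance
    for t in range(T):
        r = returns[t][:, None]         # column vector of period-t returns
        S = lam * S + (1.0 - lam) * (r @ r.T)
    return S                            # forecast for period T+1

```

Each update blends the previous forecast with the latest outer product of returns, so recent observations dominate while old shocks decay geometrically.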
  2. By: Wasel Shadat
    Date: 2011
    URL: http://d.repec.org/n?u=RePEc:man:sespap:1115&r=ets
  3. By: Ralf Brüggemann (Department of Economics, University of Konstanz, Germany); Helmut Lütkepohl (Department of Economics, European University Institute, Italy)
    Abstract: Many contemporaneously aggregated variables have stochastic aggregation weights. We compare different forecasts for such variables: (i) univariate forecasts of the aggregate; (ii) a multivariate forecast of the aggregate that uses information from the disaggregate components; (iii) a forecast which aggregates a multivariate forecast of the disaggregate components and the aggregation weights; and (iv) a forecast which aggregates univariate forecasts for the individual disaggregate components and the aggregation weights. In empirical illustrations based on aggregate GDP and money growth rates, we find forecast efficiency gains from using the information in the stochastic aggregation weights. A Monte Carlo study confirms that using the information on stochastic aggregation weights explicitly may result in forecast mean squared error reductions.
    Keywords: Aggregation, autoregressive process, mean squared error
    JEL: C32
    Date: 2011–04–21
    URL: http://d.repec.org/n?u=RePEc:knz:dpteco:1123&r=ets
  4. By: Song Song; Peter J. Bickel
    Abstract: One popular approach for nonstructural economic and financial forecasting is to include a large number of economic and financial variables, which has been shown to lead to significant forecasting improvements, for example, by dynamic factor models. A challenging issue is to determine which variables and (their) lags are relevant, especially when there is a mixture of serial correlation (temporal dynamics), a high-dimensional (spatial) dependence structure and a moderate sample size (relative to the dimensionality and the number of lags). To this end, an integrated solution that addresses these three challenges simultaneously is appealing. We study large vector autoregressions here with three types of estimates. We treat each variable's own lags differently from other variables' lags, distinguish various lags over time, and are able to select the variables and lags simultaneously. We first show the consequences of using Lasso-type estimates directly for time series without considering the temporal dependence. In contrast, our proposed method can still produce an estimate as efficient as an oracle under such scenarios. The tuning parameters are chosen via a data-driven "rolling scheme" method to optimize the forecasting performance. A macroeconomic and financial forecasting problem is considered to illustrate its superiority over existing estimators.
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1106.3915&r=ets
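    The baseline the paper starts from — a Lasso fit on a lagged design matrix, one VAR equation at a time — can be sketched as follows. This is a plain coordinate-descent Lasso for illustration only; the authors' estimator additionally reweights own lags versus other variables' lags and distinguishes lags over time, which is omitted here:

```python
import numpy as np

def build_var_design(Y, p):
    """Stack p lags of all J series into a design matrix.
    Rows correspond to t = p..T-1; columns to [Y_{t-1}, ..., Y_{t-p}]."""
    T, J = Y.shape
    X = np.hstack([Y[p - k : T - k] for k in range(1, p + 1)])
    return X, Y[p:]

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso for one VAR equation:
    minimize 0.5 * ||y - X b||^2 + lam * ||b||_1."""
    n, d = X.shape
    beta = np.zeros(d)
    col_sq = (X ** 2).sum(axis=0)          # per-column squared norms
    for _ in range(n_iter):
        for j in range(d):
            # partial residual excluding coordinate j
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # soft-thresholding update
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return beta

```

Running `lasso_cd` on each column of the lagged targets gives a sparse VAR coefficient matrix; the penalty `lam` would be tuned by the paper's rolling forecasting scheme rather than fixed a priori.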
  5. By: Song Song
    Abstract: To better understand the spatial structure of large panels of economic and financial time series and provide a guideline for constructing semiparametric models, this paper first considers estimating a large spatial covariance matrix of the generalized $m$-dependent and $\beta$-mixing time series (with $J$ variables and $T$ observations) by hard thresholding regularization as long as $\log J \, \mathcal{X}^*(\mathcal{T})/T = \mathcal{O}(1)$ (the former scheme, with some time dependence measure $\mathcal{X}^*(\mathcal{T})$) or $\log J/T = \mathcal{O}(1)$ (the latter scheme, with some upper-bounded mixing coefficient). We quantify the interplay between the estimators' consistency rate and the time dependence level, discuss an intuitive resampling scheme for threshold selection, and also prove a general cross-validation result justifying this. Given a consistently estimated covariance (correlation) matrix, by utilizing its natural links with graphical models and semiparametrics, after "screening" the (explanatory) variables, we implement a novel forward (and backward) label permutation procedure to cluster the "relevant" variables and construct the corresponding semiparametric model, which is further estimated by the groupwise dimension reduction method with sign constraints. We call this the SCE (screen - cluster - estimate) approach for modeling high dimensional data with complex spatial structure. Finally we apply this method to study the spatial structure of large panels of economic and financial time series and find the proper semiparametric structure for estimating the consumer price index (CPI) to illustrate its superiority over the linear models.
    Date: 2011–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1106.3921&r=ets
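    The hard-thresholding step at the core of the abstract is simple to state in code. A minimal sketch (the threshold is a user-supplied constant here; in the paper it would be chosen by the resampling/cross-validation scheme):

```python
import numpy as np

def hard_threshold_cov(X, thr):
    """Hard-thresholding estimator of a large covariance matrix:
    compute the sample covariance and zero out off-diagonal entries
    whose absolute value falls below `thr`. Variances on the
    diagonal are never thresholded."""
    S = np.cov(X, rowvar=False)
    mask = np.abs(S) >= thr
    np.fill_diagonal(mask, True)   # keep the diagonal intact
    return S * mask

```

Surviving entries point to genuinely dependent variable pairs, which is what the subsequent screen-cluster-estimate steps build on.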
  6. By: Miguel Belmonte; Gary Koop; Dimitris Korobilis
    Abstract: In this paper, we forecast EU-area inflation with many predictors using time-varying parameter models. The facts that time-varying parameter models are parameter-rich and the time span of our data is relatively short motivate a desire for shrinkage. In constant coefficient regression models, the Bayesian Lasso is gaining increasing popularity as an effective tool for achieving such shrinkage. In this paper, we develop econometric methods for using the Bayesian Lasso with time-varying parameter models. Our approach allows for the coefficient on each predictor to be: i) time varying, ii) constant over time or iii) shrunk to zero. The econometric methodology decides automatically which category each coefficient belongs in. Our empirical results indicate the benefits of such an approach.
    Keywords: Forecasting; hierarchical prior; time-varying parameters; Bayesian Lasso
    JEL: C52 E37 C11 E47
    Date: 2011–06–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31827&r=ets
  7. By: Olivier Wintenberger; Sixiang Cai
    Abstract: We introduce the notion of continuously invertible volatility models, which relies on a Lyapunov condition and a regularity condition. We show that it is almost equivalent to the volatility forecasting efficiency of the parametric inference approach based on the Stochastic Recurrence Equation (SRE) given in Straumann (2005). Under very weak assumptions, we prove the strong consistency and the asymptotic normality of an estimator based on the SRE. From this parametric estimation, we deduce a natural forecast of the volatility that is strongly consistent. We successfully apply this approach to recover known results on univariate and multivariate GARCH-type models, where our estimator coincides with the QMLE. In the EGARCH(1,1) model, we apply this approach to find a strongly consistent forecast and to prove that our estimator is asymptotically normal when the limiting covariance matrix exists. Finally, we give some encouraging empirical results of our approach on simulations and real data.
    Keywords: Invertibility; volatility models; parametric estimation; strong consistency; asymptotic normality; asymmetric GARCH; exponential GARCH; stochastic recurrence equation; stationarity.
    JEL: C13 C32 C53 C01
    Date: 2011–06–20
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:31767&r=ets
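    In the simplest GARCH(1,1) case, the stochastic recurrence equation underlying the forecast is just the familiar variance recursion, and the invertibility (Lyapunov) condition reduces to the recursion being contracting in the past volatility. A rough sketch, assuming the parameters are already known (in practice they would come from the SRE-based estimator or the QMLE):

```python
import numpy as np

def garch11_vol_forecast(returns, omega, alpha, beta, sigma2_0=None):
    """Iterate the GARCH(1,1) recursion (an SRE driven by the data)

        sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t

    to produce a one-step-ahead conditional variance. The map is a
    contraction in sigma2 whenever beta < 1, which is the invertibility
    condition in this simple case; the effect of the (unknown) starting
    value then dies out geometrically."""
    sigma2 = sigma2_0 if sigma2_0 is not None else np.var(returns)
    for r in returns:
        sigma2 = omega + alpha * r ** 2 + beta * sigma2
    return sigma2

```

Because the recursion forgets its initial condition when `beta < 1`, forecasts computed from an arbitrary starting variance converge to the true conditional variance, which is the intuition behind the strong consistency results in the paper.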
  8. By: John Cotter (University College Dublin, Ireland)
    Abstract: One stylized feature of financial volatility impacting the modeling process is long memory. This paper examines long memory for alternative risk measures, observed absolute and squared returns, for daily REITs and compares the findings with a non-REIT equity index. The paper utilizes a variety of tests for long memory, finding evidence that REIT volatility does display persistence, in contrast to the actual return series. Trading volume is found to be strongly associated with long memory. The results do, however, suggest differences in the findings for REITs in comparison to the broader equity sector, which may be due to relatively thin trading during the sample period.
    Keywords: Long Memory, FGARCH, REITs
    Date: 2011–06–24
    URL: http://d.repec.org/n?u=RePEc:ucd:wpaper:2006/14&r=ets
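    One standard diagnostic from the family of long memory tests the abstract mentions is the Geweke-Porter-Hudak (GPH) log-periodogram regression, which estimates the long memory parameter d from the slope of the log periodogram at low frequencies. A minimal sketch (the paper uses a variety of tests; this is one generic choice, not necessarily the authors'):

```python
import numpy as np

def gph_estimate(x, m=None):
    """Geweke-Porter-Hudak estimate of the long memory parameter d.

    Regress the log periodogram at the first m Fourier frequencies on
    log(4 sin^2(lambda/2)); d is minus the OLS slope. d ~ 0 indicates
    short memory, 0 < d < 0.5 stationary long memory."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if m is None:
        m = int(n ** 0.5)                      # a common bandwidth choice
    freqs = 2.0 * np.pi * np.arange(1, m + 1) / n
    fft = np.fft.fft(x - x.mean())
    I = (np.abs(fft[1 : m + 1]) ** 2) / (2.0 * np.pi * n)  # periodogram
    y = np.log(I)
    z = np.log(4.0 * np.sin(freqs / 2.0) ** 2)
    slope = np.polyfit(z, y, 1)[0]
    return -slope

```

Applied to absolute or squared return series, a significantly positive d is the kind of volatility persistence the paper reports for REITs.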
  9. By: John Cotter (University College Dublin, Ireland); Simon Stevenson (University College Dublin, Ireland)
    Abstract: This paper examines volatility in REITs using a multivariate GARCH-based model. The multivariate VAR-GARCH technique documents the return and volatility linkages between REIT sub-sectors and also examines the influence of other US equity series. The motivation is to allow investors to incorporate time-varying volatility and correlations in their portfolio selection. The results illustrate how the findings differ when higher-frequency daily data are tested, in comparison to the monthly data that has been commonly used in the existing literature. The linkages both within the REIT sector and between REITs and related sectors such as value stocks are weaker than those commonly found in monthly studies. The broad market would appear to be more influential in the daily case.
    Date: 2011–06–24
    URL: http://d.repec.org/n?u=RePEc:ucd:wpaper:2005/17&r=ets

This nep-ets issue is ©2011 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.