nep-ets New Economics Papers
on Econometric Time Series
Issue of 2009‒08‒22
five papers chosen by
Yong Yin
SUNY at Buffalo

  1. High order discretization schemes for stochastic volatility models By Benjamin Jourdain; Mohamed Sbai
  2. Wald Tests for Detecting Multiple Structural Changes in Persistence By Mohitosh Kejriwal; Pierre Perron; Jing Zhou
  3. Frequentist inference in weakly identified DSGE models By Pablo Guerron-Quintana; Atsushi Inoue; Lutz Kilian
  4. "Do We Really Need Both BEKK and DCC? A Tale of Two Covariance Models" By Massimiliano Caporin; Michael McAleer
  5. Bias Correction and Out-of-Sample Forecast Accuracy By Kim, Hyeongwoo; Durmaz, Nazif

  1. By: Benjamin Jourdain (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées); Mohamed Sbai (CERMICS - Centre d'Enseignement et de Recherche en Mathématiques, Informatique et Calcul Scientifique - INRIA - Ecole Nationale des Ponts et Chaussées)
    Abstract: In usual stochastic volatility models, the process driving the volatility of the asset price evolves according to an autonomous one-dimensional stochastic differential equation. We assume that the coefficients of this equation are smooth. Using Itô's formula, we eliminate from the asset price dynamics the stochastic integral with respect to the Brownian motion driving this SDE. Taking advantage of this structure, we propose (i) a scheme, based on the Milstein discretization of this SDE, with order one of weak trajectorial convergence for the asset price, and (ii) a scheme, based on the Ninomiya-Victoir discretization of this SDE, with order two of weak convergence for the asset price. We also propose a specific scheme with improved convergence properties when the volatility of the asset price is driven by an Ornstein-Uhlenbeck process. We confirm the theoretical rates of convergence by numerical experiments and show that our schemes are well adapted to the multilevel Monte Carlo method introduced by Giles [2008a,b].
    Keywords: discretization schemes, stochastic volatility models, weak trajectorial convergence, multilevel Monte Carlo
    Date: 2009–08–07
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-00409861_v1&r=ets
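    Illustration: the Python sketch below is not the authors' order-one or order-two scheme; it only shows the general mechanics of discretizing a stochastic volatility model whose volatility factor follows an autonomous scalar SDE. The toy model (CIR-type factor handled with a Milstein step, plain Euler step for the log-price), the parameter values, and all function names are assumptions made for illustration.

      # Minimal, generic Milstein/Euler sketch (not the paper's schemes):
      # the volatility factor Y follows an autonomous scalar SDE, discretized
      # with Milstein; the log-price gets a plain Euler step.
      import numpy as np

      def milstein_step(y, dt, dB, drift, diff, diff_prime):
          # Milstein update for dY = drift(Y) dt + diff(Y) dB
          return (y + drift(y) * dt + diff(y) * dB
                  + 0.5 * diff(y) * diff_prime(y) * (dB**2 - dt))

      def simulate_paths(s0=100.0, y0=0.04, T=1.0, n_steps=250, n_paths=10_000,
                         kappa=2.0, theta=0.04, nu=0.3, rho=-0.7, seed=0):
          rng = np.random.default_rng(seed)
          dt = T / n_steps
          drift = lambda y: kappa * (theta - y)
          diff = lambda y: nu * np.sqrt(np.maximum(y, 0.0))      # CIR-type factor
          diff_prime = lambda y: 0.5 * nu / np.sqrt(np.maximum(y, 1e-12))
          log_s = np.full(n_paths, np.log(s0))
          y = np.full(n_paths, y0)
          for _ in range(n_steps):
              dB = np.sqrt(dt) * rng.standard_normal(n_paths)
              dZ = np.sqrt(dt) * rng.standard_normal(n_paths)
              dW = rho * dB + np.sqrt(1.0 - rho**2) * dZ          # correlated drivers
              vol = np.sqrt(np.maximum(y, 0.0))
              log_s += -0.5 * vol**2 * dt + vol * dW              # Euler for log-price
              y = milstein_step(y, dt, dB, drift, diff, diff_prime)
          return np.exp(log_s)

      paths = simulate_paths()
      print("MC price of an at-the-money call:",
            np.mean(np.maximum(paths - 100.0, 0.0)))

    For an Ornstein-Uhlenbeck factor the Milstein correction term vanishes (the diffusion coefficient is constant), so that factor could equally be simulated exactly.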
  2. By: Mohitosh Kejriwal; Pierre Perron; Jing Zhou
    Abstract: This paper considers the problem of testing for multiple structural changes in the persistence of a univariate time series. We propose sup-Wald tests of the null hypothesis that the process has an autoregressive unit root against the alternative hypothesis that the process alternates between stationary and unit root regimes. Both non-trending and trending cases are analyzed. We derive the limit distributions of the tests under the null and establish their consistency under the relevant alternatives. The computation of the test statistics as well as asymptotic critical values is facilitated by the dynamic programming algorithm proposed in Perron and Qu (2006) which allows the minimization of the sum of squared residuals under the alternative hypothesis while imposing within and cross regime restrictions on the parameters. Finally, we present Monte Carlo evidence to show that the proposed tests perform quite well in finite samples relative to those available in the literature.
    Keywords: structural change, persistence, Wald tests, unit root, parameter restrictions
    JEL: C22
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:pur:prukra:1223&r=ets
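    Illustration: the statistic computed below is a generic sup-Wald test for a single break in the AR(1) slope at an unknown date, not the multiple-regime tests with unit-root restrictions studied in the paper, and its null distribution is not the one tabulated there. The data-generating process and trimming fraction are illustrative assumptions.

      import numpy as np

      def sup_wald_persistence(y, trim=0.15):
          # Sup over candidate break dates of the Wald statistic for equality
          # of the AR(1) slope across the pre- and post-break regimes.
          y = np.asarray(y, dtype=float)
          ylag, ycur = y[:-1], y[1:]
          T = len(ycur)
          lo, hi = int(trim * T), int((1 - trim) * T)
          stats = []
          for tau in range(lo, hi):
              d = (np.arange(T) >= tau).astype(float)          # post-break dummy
              X = np.column_stack([np.ones(T), d, ylag, d * ylag])
              beta, *_ = np.linalg.lstsq(X, ycur, rcond=None)
              resid = ycur - X @ beta
              sigma2 = resid @ resid / (T - X.shape[1])
              XtX_inv = np.linalg.inv(X.T @ X)
              # Wald test of no change in the AR coefficient: beta[3] = 0
              stats.append(beta[3] ** 2 / (sigma2 * XtX_inv[3, 3]))
          return max(stats)

      rng = np.random.default_rng(1)
      e = rng.standard_normal(400)
      y = np.empty(400)
      y[0] = e[0]
      for t in range(1, 400):                  # persistence rises at t = 200
          rho = 0.5 if t < 200 else 0.95
          y[t] = rho * y[t - 1] + e[t]
      print("sup-Wald:", round(sup_wald_persistence(y), 2))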
  3. By: Pablo Guerron-Quintana; Atsushi Inoue; Lutz Kilian
    Abstract: The authors show that in weakly identified models (1) the posterior mode will not be a consistent estimator of the true parameter vector, (2) the posterior distribution will not be Gaussian even asymptotically, and (3) Bayesian credible sets and frequentist confidence sets will not coincide asymptotically. This means that Bayesian DSGE estimation should not be interpreted merely as a convenient device for obtaining asymptotically valid point estimates and confidence sets from the posterior distribution. As an alternative, the authors develop a new class of frequentist confidence sets for structural DSGE model parameters that remains asymptotically valid regardless of the strength of the identification. The proposed set correctly reflects the uncertainty about the structural parameters even when the likelihood is flat, it protects the researcher from spurious inference, and it is asymptotically invariant to the prior in the case of weak identification.
    Keywords: Stochastic analysis ; Macroeconomics - Econometric models
    Date: 2009
    URL: http://d.repec.org/n?u=RePEc:fip:fedpwp:09-13&r=ets
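    Illustration: the sketch below is not the authors' procedure for DSGE models; it only conveys the general idea of identification-robust inference by test inversion, namely collecting every parameter value at which a likelihood-ratio test does not reject. When the likelihood is nearly flat, the resulting set becomes wide rather than spuriously tight. The toy Gaussian model, the constant c, and all function names are assumptions.

      import numpy as np

      def lr_confidence_set(y, grid, loglik, crit=3.841):
          # Keep all grid points whose likelihood-ratio statistic against the
          # (grid-)maximum is below the 95% chi-square(1) critical value.
          ll = np.array([loglik(theta, y) for theta in grid])
          lr = 2.0 * (ll.max() - ll)
          return grid[lr <= crit]

      # Toy model: y_i ~ N(c * theta, 1).  A small |c| makes theta weakly
      # identified, and the confidence set widens accordingly.
      c, theta0, n = 0.05, 1.0, 200
      rng = np.random.default_rng(0)
      y = c * theta0 + rng.standard_normal(n)
      loglik = lambda theta, y: -0.5 * np.sum((y - c * theta) ** 2)
      grid = np.linspace(-50.0, 50.0, 4001)
      cs = lr_confidence_set(y, grid, loglik)
      print("95% confidence set spans [%.1f, %.1f]" % (cs.min(), cs.max()))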
  4. By: Massimiliano Caporin (Dipartimento di Scienze Economiche "Marco Fanno", Universita degli Studi di Padova); Michael McAleer (Econometric Institute, Erasmus School of Economics Erasmus University Rotterdam and Tinbergen Institute and Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics, University of Tokyo)
    Abstract: Large and very large portfolios of financial assets are routine for many individuals and organizations. The two most widely used models of conditional covariances and correlations are BEKK and DCC. BEKK suffers from the archetypal "curse of dimensionality", whereas DCC does not. The paper argues that this is a misleading guide to the suitability of the two models in practice. The primary purposes of the paper are: (1) to define targeting as an aid in estimating matrices associated with large numbers of financial assets; (2) to analyze the similarities and dissimilarities between BEKK and DCC, both with and without targeting, on the basis of their structural derivation, the analytical forms of the sufficient conditions for the existence of moments, the sufficient conditions for consistency and asymptotic normality, and computational tractability for very large (that is, ultra high) numbers of financial assets; (3) to present a consistent two-step estimation method for the DCC model; and (4) to determine whether BEKK or DCC should be preferred in practical applications.
    Date: 2009–08
    URL: http://d.repec.org/n?u=RePEc:tky:fseres:2009cf638&r=ets
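    Illustration: a minimal sketch of the DCC correlation recursion with correlation targeting, assuming the standardized residuals from the first-step univariate GARCH fits are already available. The parameters a and b, which would normally come from the second estimation step, are fixed here, and all names and values are illustrative.

      import numpy as np

      def dcc_correlations(eps, a=0.05, b=0.90):
          # eps: (T, N) standardized residuals from first-step univariate
          # GARCH fits (assumed already computed).
          T, N = eps.shape
          Q_bar = eps.T @ eps / T                 # targeting: sample covariance of eps
          Q = Q_bar.copy()
          R = np.empty((T, N, N))
          for t in range(T):
              d = 1.0 / np.sqrt(np.diag(Q))
              R[t] = Q * np.outer(d, d)           # rescale Q_t to a correlation matrix
              Q = (1 - a - b) * Q_bar + a * np.outer(eps[t], eps[t]) + b * Q
          return R

      rng = np.random.default_rng(2)
      eps = rng.standard_normal((500, 3))         # stand-in for standardized residuals
      R = dcc_correlations(eps)
      print("last conditional correlation matrix:\n", np.round(R[-1], 3))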
  5. By: Kim, Hyeongwoo; Durmaz, Nazif
    Abstract: The least squares (LS) estimator suffers from significant downward bias in autoregressive models that include an intercept. By construction, the LS estimator yields the best in-sample fit among a class of linear estimators notwithstanding its bias. Why, then, do we need to correct for the bias? To answer this question, we evaluate the usefulness of two popular bias correction methods, proposed by Hansen (1999) and So and Shin (1999), by comparing their out-of-sample forecast performance with that of the LS estimator. We find that the bias-corrected estimators overall outperform the LS estimator. In particular, Hansen's grid bootstrap estimator combined with a rolling window method performs best.
    Keywords: Small-Sample Bias; Grid Bootstrap; Recursive Mean Adjustment; Out-of-Sample Forecast; Diebold-Mariano Test
    JEL: C53
    Date: 2009–05
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:16780&r=ets
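    Illustration: the sketch below uses a plain residual-bootstrap bias correction of the AR(1) slope as a simpler stand-in for Hansen's (1999) grid bootstrap; it is not the paper's procedure, and the sample size, persistence level, and function names are illustrative assumptions.

      import numpy as np

      def ols_ar1(y):
          # LS estimate of (intercept, slope) in y_t = c + rho*y_{t-1} + e_t
          X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
          beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
          return beta

      def bootstrap_bias_corrected_rho(y, n_boot=999, seed=0):
          # Simulate from the fitted AR(1), measure the average bias of LS
          # across bootstrap samples, and subtract it from the LS estimate.
          rng = np.random.default_rng(seed)
          c, rho = ols_ar1(y)
          resid = y[1:] - c - rho * y[:-1]
          boot_rhos = np.empty(n_boot)
          for b in range(n_boot):
              e = rng.choice(resid, size=len(y), replace=True)
              yb = np.empty(len(y))
              yb[0] = y[0]
              for t in range(1, len(y)):
                  yb[t] = c + rho * yb[t - 1] + e[t]
              boot_rhos[b] = ols_ar1(yb)[1]
          return rho - (boot_rhos.mean() - rho)    # LS estimate minus estimated bias

      rng = np.random.default_rng(3)
      true_rho, n = 0.95, 80
      e = rng.standard_normal(n)
      y = np.empty(n)
      y[0] = e[0]
      for t in range(1, n):
          y[t] = true_rho * y[t - 1] + e[t]
      print("LS:", round(ols_ar1(y)[1], 3),
            " bias-corrected:", round(bootstrap_bias_corrected_rho(y), 3))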

This nep-ets issue is ©2009 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.