nep-ets New Economics Papers
on Econometric Time Series
Issue of 2012‒07‒23
seven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Testing DSGE models by Indirect inference and other methods: some Monte Carlo experiments By Le, Vo Phuong Mai; Meenagh, David; Minford, Patrick; Wickens, Michael
  2. Residual test for cointegration with GLS detrended data By Pierre Perron; Gabriel Rodriguez
  3. Invariance properties of random vectors and stochastic processes based on the zonoid concept By Ilga Molchanov; Michael Schmutz; Kaspar Stucki
  4. Testing Causality Between Two Vectors in Multivariate GARCH Models By Tomasz Wozniak
  5. Copula-Based Dynamic Conditional Correlation Multiplicative Error Processes By Taras Bodnar; Nikolaus Hautsch
  6. Realized Volatility and Change of Regimes By Giampiero M. Gallo; Edoardo Otranto
  7. A higher order correlation unscented Kalman filter By Oliver Grothe

  1. By: Le, Vo Phuong Mai (Cardiff Business School); Meenagh, David (Cardiff Business School); Minford, Patrick (Cardiff Business School); Wickens, Michael (Cardiff Business School)
    Abstract: Using Monte Carlo experiments, we examine the performance of Indirect Inference tests of DSGE models, usually versions of the Smets-Wouters New Keynesian model of the US postwar period. We compare these with tests based on direct inference (using the Likelihood Ratio), and on the Del Negro–Schorfheide DSGE–VAR weight. We find that the power of all three tests is substantial, so that a false model will tend to be rejected by all three; but the power of the indirect inference tests is by far the greatest, necessitating re-estimation by indirect inference to ensure that the model is tested in its fullest sense.
    Keywords: Bootstrap; DSGE; New Keynesian; New Classical; indirect inference; Wald statistic; likelihood ratio; DSGE-VAR weight
    JEL: C12 C32 C52 E1
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:cdf:wpaper:2012/15&r=ets
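    The indirect-inference logic can be sketched in miniature. This is not the authors' DSGE/VAR setup: a toy AR(1) data-generating process and the AR(1) coefficient as the auxiliary parameter are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_fit(x):
    # OLS estimate of the AR(1) coefficient: our auxiliary model
    return np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1])

def simulate_ar1(rho, n, rng):
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + rng.standard_normal()
    return x

def ii_wald(data, rho_model, n_sim=500, rng=rng):
    # Indirect-inference Wald statistic: compare the auxiliary estimate
    # on the observed data with its distribution under the candidate model
    beta_hat = ar1_fit(data)
    sims = np.array([ar1_fit(simulate_ar1(rho_model, len(data), rng))
                     for _ in range(n_sim)])
    return (beta_hat - sims.mean()) ** 2 / sims.var()

data = simulate_ar1(0.9, 400, rng)   # "observed" data generated with rho = 0.9
w_true = ii_wald(data, 0.9)          # correctly specified candidate model
w_false = ii_wald(data, 0.3)         # misspecified candidate model
```

    A misspecified model (rho = 0.3 against data generated with rho = 0.9) produces a far larger Wald statistic than the correctly specified one, which is the source of the test's power.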
  2. By: Pierre Perron (Boston University); Gabriel Rodriguez (Departamento de Economía - Pontificia Universidad Católica del Perú)
    Abstract: We analyze different residual-based tests for the null of no cointegration using GLS detrended data. We find and simulate the limiting distributions of these statistics when GLS demeaned and GLS detrended data are used. The distributions depend on the number of right-hand side variables, the type of deterministic components used in the cointegration equation, and a nuisance parameter R2 which measures the long-run correlation between xt and yt. We present an extensive set of figures showing the asymptotic power functions of the different statistics analyzed in this paper. The results show that GLS detrending yields greater asymptotic power than OLS detrending. The simpler residual-based tests (such as the ADF) show power gains for small values of R2 and for only one right-hand side variable; this evidence is valid for R2 less than 0.4. The figures show that when R2 is larger, the ECR statistics are better for any number of right-hand side variables. In particular, the evidence shows that the ECR statistic which assumes a known cointegration vector is the most powerful. A set of simulated asymptotic critical values is also presented. Unlike other references, in the present framework we use a different c for each number of right-hand side variables (xt variables) and for each set of deterministic components. In this selection, we use R2 = 0.4, which appears to be a sensible choice.
    Keywords: Cointegration, Residual-Based Unit Root Test, ECR Test, OLS and GLS Detrended Data, Hypothesis Testing
    JEL: C2 C3 C5
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:pcp:pucwps:wp00327&r=ets
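    A stylized illustration of a GLS-demeaned residual-based test (a sketch only, not the paper's full procedure: the quasi-differencing constant c = -7 is the Elliott-Rothenberg-Stock value for the constant-only case, and the lag-free ADF regression is a simplifying assumption):

```python
import numpy as np

def gls_demean(y, c_bar=-7.0):
    # Elliott-Rothenberg-Stock quasi-difference demeaning with rho = 1 + c/T;
    # c_bar = -7 is the standard ERS choice for the constant-only case
    T = len(y)
    rho = 1.0 + c_bar / T
    yq = np.concatenate(([y[0]], y[1:] - rho * y[:-1]))       # quasi-differenced y
    zq = np.concatenate(([1.0], np.full(T - 1, 1.0 - rho)))   # quasi-differenced constant
    mu = np.dot(zq, yq) / np.dot(zq, zq)                      # GLS estimate of the mean
    return y - mu

def adf_t(u):
    # ADF t-statistic with no lags or deterministics, applied to residuals
    du, ul = np.diff(u), u[:-1]
    b = np.dot(ul, du) / np.dot(ul, ul)
    s2 = np.sum((du - b * ul) ** 2) / (len(du) - 1)
    return b / np.sqrt(s2 / np.dot(ul, ul))

rng = np.random.default_rng(1)
x = np.cumsum(rng.standard_normal(500))         # I(1) right-hand side variable
y = 2.0 + 0.5 * x + rng.standard_normal(500)    # cointegrated with x
yd, xd = gls_demean(y), gls_demean(x)
beta = np.dot(xd, yd) / np.dot(xd, xd)          # cointegrating regression slope
u = yd - beta * xd                              # residuals: stationary under cointegration
t_stat = adf_t(u)                               # strongly negative for this cointegrated pair
```

    With cointegrated data the residual unit-root statistic is far into the rejection region; under the null of no cointegration it would concentrate near the nonstandard critical values the paper tabulates.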
  3. By: Ilga Molchanov; Michael Schmutz; Kaspar Stucki
    Abstract: Two integrable random vectors ξ and ξ* in IRd are said to be zonoid equivalent if, for each u ∈ IRd, the scalar products ⟨ξ, u⟩ and ⟨ξ*, u⟩ have the same first absolute moments. The paper analyses stochastic processes whose finite-dimensional distributions are zonoid equivalent with respect to time shift (zonoid stationarity) and permutation of time moments (swap-invariance). While the first concept is weaker than stationarity, the second is a weakening of the exchangeability property. It is shown that the ergodic theorem nonetheless holds for swap-invariant sequences, and the limits are characterized.
    Keywords: Invariance, Zonoid, Exchangeability, Ergodic theorem, Isometry
    Date: 2012–06
    URL: http://d.repec.org/n?u=RePEc:cte:wsrepe:ws122014&r=ets
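    Zonoid equivalence of two random vectors can be probed empirically by comparing the first absolute moments of their projections. The following Monte Carlo sketch (standard Gaussian samples are an illustrative assumption, not the paper's setting) estimates E|⟨X, u⟩| over a set of directions u:

```python
import numpy as np

def abs_moment_profile(sample, dirs):
    # Monte Carlo estimate of E|<X, u>| for each direction u (rows of dirs)
    return np.abs(sample @ dirs.T).mean(axis=0)

rng = np.random.default_rng(2)
d, n, m = 3, 200_000, 25
dirs = rng.standard_normal((m, d))        # directions u at which to compare

X1 = rng.standard_normal((n, d))          # two samples from the same law:
X2 = rng.standard_normal((n, d))          # zonoid equivalent by construction
gap_same = np.max(np.abs(abs_moment_profile(X1, dirs)
                         - abs_moment_profile(X2, dirs)))

Y = 2.0 * rng.standard_normal((n, d))     # scaled Gaussian: a different zonoid
gap_diff = np.max(np.abs(abs_moment_profile(X1, dirs)
                         - abs_moment_profile(Y, dirs)))
```

    The profiles of two samples from the same law agree up to Monte Carlo error, while the rescaled Gaussian separates clearly in every direction.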
  4. By: Tomasz Wozniak
    Abstract: Spillover and contagion effects have gained significant interest during the recent years of financial crisis. Attention has been directed not only to relations between returns of financial variables, but to spillovers in risk as well. I use the family of Constant Conditional Correlation GARCH models to model the risk associated with financial time series and to make inferences about Granger causal relations between second conditional moments. The restrictions for second-order Granger noncausality between two vectors of variables are derived. To assess the credibility of the noncausality hypotheses, I employ Bayes factors. Bayesian testing procedures have not yet been applied to the problem of testing Granger noncausality. Contrary to classical tests, Bayes factors make such testing possible, regardless of the form of the restrictions on the parameters of the model. Moreover, they relax the assumptions about the existence of higher-order moments of the processes required in classical tests.
    Keywords: Second-Order Causality; Volatility Spillovers; Bayes Factors; GARCH Models
    JEL: C11 C12 C32 C53
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:mlb:wpaper:1139&r=ets
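    The Bayes-factor approach to testing a parameter restriction can be illustrated on a deliberately simple case: a Savage-Dickey density ratio for the point restriction theta = 0 in a conjugate normal-mean model. This is a textbook toy, not the paper's GARCH noncausality test; the prior and noise variances and the stylized data are illustrative assumptions.

```python
import numpy as np
from math import exp, pi, sqrt

def normal_pdf(x, mu, var):
    return exp(-(x - mu) ** 2 / (2 * var)) / sqrt(2 * pi * var)

def savage_dickey_bf01(data, prior_var=1.0, noise_var=1.0):
    # Savage-Dickey ratio for the point restriction theta = 0:
    # BF01 = posterior density at 0 / prior density at 0 (conjugate normal model)
    n = len(data)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * np.sum(data) / noise_var
    return normal_pdf(0.0, post_mean, post_var) / normal_pdf(0.0, 0.0, prior_var)

# Stylized data: one sample consistent with theta = 0, one clearly not
bf_null = savage_dickey_bf01(np.zeros(100))      # evidence for the restriction
bf_alt = savage_dickey_bf01(np.full(100, 0.5))   # evidence against it
```

    The ratio exceeds one when the data support the restriction and collapses toward zero when they do not; no distributional assumption on higher-order moments of the process is needed, which is the advantage the abstract emphasizes.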
  5. By: Taras Bodnar; Nikolaus Hautsch
    Abstract: We introduce a copula-based dynamic model for multivariate processes of (non-negative) high-frequency trading variables revealing time-varying conditional variances and correlations. Modeling the variables’ conditional mean processes using a multiplicative error model, we map the resulting residuals into a Gaussian domain using a Gaussian copula. Based on high-frequency volatility, cumulative trading volumes, trade counts and market depth of various stocks traded at the NYSE, we show that the proposed copula-based transformation is supported by the data and allows disentangling (multivariate) dynamics in higher order moments. To capture the latter, we propose a DCC-GARCH specification. We suggest estimating the model by composite maximum likelihood, which is sufficiently flexible to be applicable in high dimensions. Strong empirical evidence for time-varying conditional (co-)variances in trading processes supports the usefulness of the approach. Taking these higher-order dynamics explicitly into account significantly improves the goodness-of-fit of the multiplicative error model and allows capturing time-varying liquidity risks.
    Keywords: multiplicative error model, trading processes, copula, DCC-GARCH, liquidity risk
    JEL: C32 C58 C46
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:hum:wpaper:sfb649dp2012-044&r=ets
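    The mapping of non-negative residuals into a Gaussian domain amounts to a probability integral transform followed by the inverse normal CDF. A minimal sketch (an empirical PIT on simulated exponential residuals is an assumption standing in for the fitted MEM residuals):

```python
import numpy as np
from statistics import NormalDist

def to_gaussian_domain(x):
    # Empirical probability integral transform followed by the inverse
    # normal CDF: maps a continuous marginal to (approximately) N(0, 1)
    # while preserving the rank dependence structure.
    n = len(x)
    ranks = np.argsort(np.argsort(x)) + 1   # 1..n; no ties for continuous data
    u = ranks / (n + 1)                     # empirical PIT values in (0, 1)
    inv = NormalDist().inv_cdf
    return np.array([inv(p) for p in u])

rng = np.random.default_rng(4)
eps = rng.exponential(1.0, 5000)            # stand-in for unit-mean MEM residuals
z = to_gaussian_domain(eps)                 # approximately standard normal scores
```

    Because the transform is monotone, rank correlations between variables are preserved, so multivariate dynamics can then be modeled on the Gaussian scores, as in the paper's DCC-GARCH step.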
  6. By: Giampiero M. Gallo (Dipartimento di Statistica, Universita` di Firenze); Edoardo Otranto (Università degli Studi di Messina, Dipartimento di Scienze Cognitive e della Formazione)
    Abstract: Persistence and occasional abrupt changes in the average level characterize the dynamics of high frequency based measures of volatility. Since the beginning of the 2000s, this pattern can be attributed to the dot com bubble, the quiet period of expansion of credit between 2003 and 2006, and then the harsh times after the burst of the subprime mortgage crisis. We conjecture that the inadequacy of many econometric volatility models (a very high level of estimated persistence, serially correlated residuals) can be solved with an adequate representation of such a pattern. We insert a Markovian dynamics in a Multiplicative Error Model to represent the conditional expectation of the realized volatility, allowing us to address the issues of a slowly moving average level of volatility and of different dynamics across regimes. We apply the model to the realized volatility of the S&P500 index and gauge the usefulness of such an approach by a more interpretable persistence, better residual properties, and an increased goodness of fit.
    Keywords: Multiplicative Error Models, regime switching, realized volatility, volatility persistence
    JEL: C22 C51 C52 C58
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:fir:econom:wp2012_02&r=ets
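    A Markov-switching level in a Multiplicative Error Model can be simulated in a few lines. The transition matrix, regime levels, and Gamma error below are illustrative assumptions, not the parameters estimated in the paper:

```python
import numpy as np

rng = np.random.default_rng(5)

# Two-regime Markov chain for the average volatility level
P = np.array([[0.98, 0.02],     # transition matrix: persistent regimes
              [0.05, 0.95]])
mu = np.array([1.0, 3.0])       # low- and high-volatility average levels

T, s = 2000, 0
levels = np.empty(T)
for t in range(T):
    levels[t] = mu[s]
    s = rng.choice(2, p=P[s])   # draw next regime

# MEM observation: realized volatility = regime-dependent level x unit-mean error
x = levels * rng.gamma(shape=4.0, scale=0.25, size=T)
```

    The simulated series shows long quiet stretches punctuated by abrupt shifts of the average level, the pattern that a single-regime model would misread as near-unit-root persistence.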
  7. By: Oliver Grothe
    Abstract: Many nonlinear extensions of the Kalman filter, e.g., the extended and the unscented Kalman filter, reduce the state densities to Gaussian densities. This approximation gives sufficient results in many cases. However, these filters only estimate states that are correlated with the observations. Therefore, sequential estimation of diffusion parameters (e.g., volatility) that are not correlated with the observations is not possible. While other filters overcome this problem with simulations, we extend the measurement update of the Gaussian two-moment filters by a higher order correlation measurement update. We explicitly state formulas for a higher order unscented Kalman filter within a continuous-discrete state space. We demonstrate the filter in the context of parameter estimation of an Ornstein-Uhlenbeck process.
    Date: 2012–07
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1207.4300&r=ets
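    The unscented transform underlying such filters propagates a set of deterministically chosen sigma points through the nonlinearity instead of linearizing it. A minimal sketch of the standard second-order transform (not the paper's higher-order extension; the test function and moments are illustrative assumptions):

```python
import numpy as np

def sigma_points(mean, cov, kappa=0.0):
    # Standard Julier-Uhlmann sigma points: mean plus/minus scaled
    # columns of a matrix square root of the covariance
    n = len(mean)
    S = np.linalg.cholesky((n + kappa) * cov)
    pts = [mean] + [mean + S[:, i] for i in range(n)] \
                 + [mean - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def unscented_transform(f, mean, cov, kappa=0.0):
    # Propagate sigma points through f and recover mean and covariance
    pts, w = sigma_points(mean, cov, kappa)
    y = np.array([f(p) for p in pts])
    m = w @ y
    c = (w[:, None] * (y - m)).T @ (y - m)
    return m, c

mean = np.array([1.0, 2.0])
cov = np.diag([0.1, 0.2])
m, c = unscented_transform(lambda x: np.array([x[0] ** 2, x[0] + x[1]]),
                           mean, cov)
```

    For the quadratic component the transformed mean is 1.1, matching the exact value E[x0^2] = 1 + 0.1; for the linear component both the mean (3.0) and the variance (0.3) are exact, which is the accuracy the sigma-point construction is designed to deliver.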

This nep-ets issue is ©2012 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.