nep-ets New Economics Papers
on Econometric Time Series
Issue of 2010‒01‒23
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. Structural change tests for GEL criteria By Alain Guay; Jean-Francois Lamarche
  2. A Gaussian Test for Cointegration By Gulasekaran Rajaguru; Tilak Abeysinghe
  3. GMM, Generalized Empirical Likelihood, and Time Series By Federico Crudu
  4. Improving the Forecasting of Dynamic Conditional Correlation: a Volatility Dependent Approach By Edoardo Otranto
  5. The impacts of outliers on different estimators for GARCH processes: an empirical study By Ardelean, Vlad
  6. Spatial-serial dependency in multivariate GARCH models and dynamic copulas: a simulation study By Klein, Ingo; Köck, Christian; Tinkl, Fabian
  7. Bayesian Inference in a Stochastic Volatility Nelson-Siegel Model By Nikolaus Hautsch; Fuyu Yang
  8. Discrete Time and Finite State Reflected Backward Stochastic Difference Equations By Lifen An; Shaolin Ji
  9. Consistency properties of a simulation-based estimator for dynamic processes By Manuel S. Santos
  10. Simple GMM Estimation of the Semi-Strong GARCH(1,1) Model By Todd, Prono
  11. GARCH-Based Identification and Estimation of Triangular Systems By Todd, Prono
  12. Impact of Model Specification Decisions on Unit Root Tests By Atiq-ur-Rehman, Atiq-ur-Rehman; Zaman, Asad

  1. By: Alain Guay (Department of Economics, Universite du Quebec a Montreal); Jean-Francois Lamarche (Department of Economics, Brock University)
    Abstract: This paper examines structural change tests based on generalized empirical likelihood methods in the time series context. Standard structural change tests for the generalized method of moments are adapted to the generalized empirical likelihood context. We show that, when moment conditions are properly smoothed, these test statistics converge to the same asymptotic distributions as in the generalized method of moments, in cases with both known and unknown breakpoints. We also suggest new structural change test statistics specific to generalized empirical likelihood estimation methods. A simulation study examines the small-sample properties of the tests.
    Keywords: Generalized empirical likelihood, generalized method of moments, parameter instability, structural change
    JEL: C12 C32
    Date: 2009–12
  2. By: Gulasekaran Rajaguru (School of Business, Bond University, Australia); Tilak Abeysinghe (Department of Economics, National University of Singapore)
    Abstract: We use a mixed-frequency regression technique to develop a test for cointegration under the null of stationarity of the deviations from a long-run relationship. What is noteworthy about this MA unit root test, based on a variance difference, is that, instead of having to deal with non-standard distributions, it takes the testing back to the normal distribution and offers a way to increase power without having to increase the sample size substantially. Monte Carlo simulations show minimal size distortions even when the AR root is close to unity, and substantial gains in power against near-null alternatives in moderate-sized samples. An empirical exercise further illustrates the usefulness of the test.
    Keywords: Null of stationarity, MA unit root, mixed-frequency regression, variance difference, normal distribution, power.
    JEL: C12 C22
    Date: 2009–12
  3. By: Federico Crudu
    Abstract: In this paper we extend the results of Kitamura (1997) for blockwise empirical likelihood (BEL) to the more general class of generalized empirical likelihood (GEL) estimators. The resulting blockwise GEL (BGEL) estimator is proved to be consistent and asymptotically normal and attains the semiparametric lower bound. In addition, we define the BGEL versions of the classical trinity of tests: the Wald, Lagrange Multiplier, and Likelihood Ratio tests. As expected, the resulting tests are chi-square distributed. Monte Carlo experiments show that the overidentification tests that stem from the BGEL estimator generally have better small-sample properties than the J test.
    JEL: C12 C14 C22
    Date: 2009
  4. By: Edoardo Otranto
    Abstract: Forecasting volatility in a multivariate framework has received many contributions in the recent literature, but problems in estimation are still frequently encountered when dealing with a large set of time series. The Dynamic Conditional Correlation (DCC) model is probably the most widely used approach; it has the advantage of separating the estimation of the volatility of each time series (with great flexibility, using single univariate models) from the correlation part (with the strong constraint of imposing the same dynamics on all the correlations). We propose a modification of the DCC model that provides different dynamics for each correlation by simply hypothesizing a dependence on the volatility structure of each time series. This new model adds only two parameters with respect to the original DCC model. Its performance is evaluated in terms of out-of-sample forecasts against the DCC model and other multivariate GARCH models. The results on four data sets seem to favor the new model.
    Keywords: Dynamic conditional correlation; GARCH distance; Multivariate
    JEL: C32 C53 G10
    Date: 2009
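    The correlation recursion the abstract builds on can be sketched with the textbook bivariate DCC(1,1) update (the paper's volatility-dependent modification is not reproduced here; parameter values and function names are illustrative assumptions):

    ```python
    import math

    def dcc_correlation_path(z1, z2, a=0.05, b=0.90, rho_bar=0.3):
        """Textbook bivariate DCC(1,1) recursion on standardized residuals:
        Q_t = (1 - a - b) * S + a * z_{t-1} z_{t-1}' + b * Q_{t-1},
        with the conditional correlation read off Q_t normalized to unit
        diagonal. All parameter values here are assumed, not the paper's."""
        # Start Q at the target matrix S (unit variances, correlation rho_bar).
        q11, q22, q12 = 1.0, 1.0, rho_bar
        rhos = []
        for t in range(1, len(z1)):
            q11 = (1 - a - b) * 1.0 + a * z1[t - 1] ** 2 + b * q11
            q22 = (1 - a - b) * 1.0 + a * z2[t - 1] ** 2 + b * q22
            q12 = (1 - a - b) * rho_bar + a * z1[t - 1] * z2[t - 1] + b * q12
            rhos.append(q12 / math.sqrt(q11 * q22))
        return rhos
    ```

    The single pair (a, b) governing every correlation is exactly the "strong constraint" the abstract mentions; the proposed model lets these dynamics vary with each series' volatility structure.
    
    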
  5. By: Ardelean, Vlad
    Abstract: Maximum likelihood estimation (MLE) is the most widely used method for estimating the parameters of a GARCH(p,q) process, owing to the fact that the MLE is, among other properties, asymptotically efficient. The MLE is, however, sensitive to outliers, which can occur in time series. To abate the influence of outliers, robust estimators are introduced, and a Monte Carlo study then compares them.
    Keywords: GARCH, Robust Estimates, M-Estimates
    Date: 2009
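    The outlier sensitivity at issue is visible already in the Gaussian quasi log-likelihood of a GARCH(1,1) model, sketched below (an illustrative recursion with assumed parameter names, not the paper's estimators):

    ```python
    import math

    def garch11_loglik(returns, omega, alpha, beta):
        """Gaussian quasi log-likelihood of a GARCH(1,1) process with
        conditional variance sigma2_t = omega + alpha * r_{t-1}^2
        + beta * sigma2_{t-1}, initialized at the unconditional variance.
        Requires alpha + beta < 1."""
        sigma2 = omega / (1.0 - alpha - beta)
        ll = 0.0
        for r in returns:
            ll += -0.5 * (math.log(2 * math.pi) + math.log(sigma2) + r * r / sigma2)
            sigma2 = omega + alpha * r * r + beta * sigma2
        return ll
    ```

    A single large observation enters the likelihood through r^2 / sigma2_t and can dominate the whole sum, which is why robust (e.g., M-type) estimators are attractive here.
    
    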
  6. By: Klein, Ingo; Köck, Christian; Tinkl, Fabian
    Abstract: The serial dependency of multivariate financial data is often filtered by considering the residuals of univariate GARCH models fitted to each single series. This is the correct filtering strategy if the multivariate process follows a so-called copula-based multivariate dynamic (CMD) model. These multivariate dynamic models combine univariate GARCH models in a linear or nonlinear way. In these models the parameters of the marginal distributions (the univariate GARCH models) and the dependence parameters are separable, in the sense that they can be estimated in two or more steps: in the first step the parameters of the marginal distributions are estimated, and in the second step the parameter(s) of dependence. Several multivariate GARCH models, such as the CCC and the DCC model, belong to the class of CMD models. In contrast, the BEKK model, for example, does not. If the BEKK model is correctly specified, the above-mentioned filtering strategy could fail from a theoretical point of view, and up to now it has not been known which dynamic copula is incorporated in a BEKK model. We show that if the distribution of the innovations (i.e., the residuals) of an MGARCH model is spherical, the conditional distribution of the whole MGARCH process belongs to the elliptical distribution family. Therefore, estimating the dependence of a BEKK model by copulas from the elliptical family should be an appropriate strategy to identify the dependence (i.e., correlation) between the univariate time series. Furthermore, we show that a diagonal BEKK model can be separated into its margins and a copula, but that this strategy falls short for full BEKK models.
    Date: 2009
  7. By: Nikolaus Hautsch; Fuyu Yang
    Abstract: In this paper, we develop and apply Bayesian inference for an extended Nelson-Siegel (1987) term structure model capturing interest rate risk. The so-called Stochastic Volatility Nelson-Siegel (SVNS) model allows for stochastic volatility in the underlying yield factors. We propose a Markov chain Monte Carlo (MCMC) algorithm to efficiently estimate the SVNS model using simulation-based inference. Applying the SVNS model to monthly U.S. zero-coupon yields, we find significant evidence for time-varying volatility in the yield factors. This is particularly true for the level and slope factors, whose volatilities also reveal the highest persistence. It turns out that the inclusion of stochastic volatility improves the model's goodness-of-fit and clearly reduces forecasting uncertainty, particularly in low-volatility periods. The proposed approach is shown to work efficiently and is easily adapted to alternative specifications of dynamic factor models revealing (multivariate) stochastic volatility.
    Keywords: term structure of interest rates, stochastic volatility, dynamic factor model, Markov chain Monte Carlo
    JEL: C5 C11 C32
    Date: 2010–01
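    The deterministic backbone of the model is the standard Nelson-Siegel (1987) factor decomposition of the yield curve, which can be written out directly (the decay value lam=0.0609, common for monthly maturities, is an assumption here, not the paper's calibration):

    ```python
    import math

    def nelson_siegel_yield(tau, beta0, beta1, beta2, lam=0.0609):
        """Nelson-Siegel yield at maturity tau: level (beta0), slope (beta1),
        and curvature (beta2) factors with the standard loadings
        1, (1 - e^{-lam*tau}) / (lam*tau), and the latter minus e^{-lam*tau}.
        The SVNS model of the paper puts stochastic volatility on the
        dynamics of these factors; that part is not sketched here."""
        x = lam * tau
        slope_load = (1.0 - math.exp(-x)) / x
        curv_load = slope_load - math.exp(-x)
        return beta0 + beta1 * slope_load + beta2 * curv_load
    ```

    As tau grows, the slope and curvature loadings decay to zero, so long yields approach the level factor beta0, while the short end approaches beta0 + beta1.
    
    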
  8. By: Lifen An; Shaolin Ji
    Abstract: In this paper, we first establish discrete time and finite state reflected backward stochastic difference equations (DF-RBSDEs for short). We then explore the corresponding basic properties and theorems, including the Existence and Uniqueness Theorem as well as the Comparison Theorem, in our framework by a "one step" method, and show the connections between DF-RBSDEs and optimal stopping time problems. For applications, we study the connection between DF-RBSDEs and the general theory of g-martingales and multiple prior martingales, including the Doob-Meyer Decomposition Theorem and the Optional Sampling Theorem in our framework. We then apply the theory of DF-RBSDEs to multiple prior martingales and optimal stopping problems under Knightian uncertainty. Finally, applying the above theories, we consider pricing models for American options in complete and incomplete markets.
    Date: 2010–01
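    The link between reflected backward equations, optimal stopping, and American option pricing has a simple discrete-time, finite-state illustration: the backward recursion V_t = max(payoff, discounted expectation of V_{t+1}) on a binomial tree. This sketch is a standard Snell-envelope computation, not the paper's construction, and all parameter values are assumed:

    ```python
    def american_put_binomial(s0, strike, up, down, p, r, steps):
        """Value an American put on a recombining binomial tree by backward
        induction: at each node, take the maximum of the immediate exercise
        payoff and the discounted one-step expectation. This 'reflection'
        at the payoff barrier mirrors the obstacle in a reflected equation."""
        disc = 1.0 / (1.0 + r)
        # Terminal payoffs over the steps+1 terminal nodes (j = number of up moves).
        prices = [s0 * (up ** j) * (down ** (steps - j)) for j in range(steps + 1)]
        values = [max(strike - s, 0.0) for s in prices]
        for t in range(steps - 1, -1, -1):
            prices = [s0 * (up ** j) * (down ** (t - j)) for j in range(t + 1)]
            values = [max(max(strike - prices[j], 0.0),
                          disc * (p * values[j + 1] + (1 - p) * values[j]))
                      for j in range(t + 1)]
        return values[0]
    ```

    The optimal stopping time is the first date at which the exercise payoff attains the maximum in the recursion.
    
    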
  9. By: Manuel S. Santos
    Abstract: This paper considers a simulation-based estimator for a general class of Markovian processes and explores some strong consistency properties of the estimator. The estimation problem is defined over a continuum of invariant distributions indexed by a vector of parameters. A key step in the method of proof is to show the uniform convergence (a.s.) of a family of sample distributions over the domain of parameters. This uniform convergence holds under mild continuity and monotonicity conditions on the dynamic process. The estimator is applied to an asset pricing model with technology adoption. A challenge for this model is to generate the observed high volatility of stock markets along with the much lower volatility of other real economic aggregates.
    Date: 2010–01
  10. By: Todd, Prono
    Abstract: Efficient GMM estimation of the semi-strong GARCH(1,1) model requires simultaneous estimation of the conditional third and fourth moments. This paper proposes a simple alternative to efficient GMM based upon the unconditional skewness of residuals and the autocovariances of squared residuals. An advantage of this simple alternative is that neither the third nor the fourth conditional moment needs to be estimated. A second advantage is that linear estimators apply to all of the parameters in the model, making estimation straightforward in practice. The proposed estimators are IV-like with potentially many instruments. Sequential estimation involves TSLS in a first step followed by linear GMM. Simultaneous estimation involves either two-step GMM or CUE. A Monte Carlo study of the proposed estimators is included.
    Keywords: GARCH; Time Series Heteroskedasticity; GMM; CUE; Many Moments; Conditional Moment Restrictions; Consistency; Robust Statistics
    JEL: C53 G12 C22
    Date: 2010–01
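    The unconditional moments the abstract mentions — autocovariances of squared residuals — are simple sample statistics. A minimal helper (names and defaults are assumptions, not the paper's code):

    ```python
    def squared_residual_autocovariances(resid, max_lag=3):
        """Sample autocovariances of squared residuals at lags 1..max_lag,
        the kind of unconditional moment condition the simple GMM estimator
        matches for the semi-strong GARCH(1,1) model."""
        sq = [r * r for r in resid]
        n = len(sq)
        mean = sum(sq) / n
        gammas = []
        for k in range(1, max_lag + 1):
            g = sum((sq[t] - mean) * (sq[t - k] - mean) for t in range(k, n)) / n
            gammas.append(g)
        return gammas
    ```

    Stacking such moments (together with the unconditional skewness terms) yields the IV-like, potentially many-instrument setup the abstract describes, without ever estimating conditional third or fourth moments.
    
    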
  11. By: Todd, Prono
    Abstract: The diagonal GARCH(1,1) model is shown to support identification of the triangular system and is argued to be a higher-moment analog to traditional exclusion restrictions. Estimators based on this result include QML and GMM. For the GMM estimator, only partial parameterization of the conditional covariance matrix is required. An alternative weighting matrix for the GMM estimator is also proposed.
    Keywords: Triangular Systems; Endogeneity; Identification; Heteroskedasticity; Quasi Maximum Likelihood; Generalized Method of Moments; GARCH; QML; GMM
    JEL: C13 C32
    Date: 2009–09
  12. By: Atiq-ur-Rehman, Atiq-ur-Rehman; Zaman, Asad
    Abstract: The performance of unit root tests depends on several specification decisions made prior to their application, e.g., whether or not to include a deterministic trend. Since there is no standard procedure for making such decisions, practitioners routinely make several arbitrary specification choices. In Monte Carlo studies, the design of the DGP supports these decisions, but for real data such specification decisions are often unjustifiable and sometimes incompatible with the data. We argue that the problems posed by the choice of initial specification are quite complex and that the existing voluminous literature on this issue treats only certain superficial aspects of this choice. We also show how these initial specifications affect the performance of unit root tests and argue that Monte Carlo studies should include these preliminary decisions in order to arrive at a better yardstick for evaluating such tests.
    Keywords: model specification; trend stationary; difference stationary
    JEL: C15 C22 C01
    Date: 2009
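    How much the deterministic-trend decision matters can be seen with a stripped-down Dickey-Fuller regression coefficient computed after two different preprocessing choices. This is an illustrative toy, not the paper's experiments; all function names are assumptions:

    ```python
    def df_coefficient(y):
        """Slope of the regression of the first difference on the lagged
        level (no deterministics): the raw Dickey-Fuller coefficient."""
        dy = [y[t] - y[t - 1] for t in range(1, len(y))]
        ylag = y[:-1]
        return sum(d * l for d, l in zip(dy, ylag)) / sum(l * l for l in ylag)

    def demean(y):
        """Specification choice 1: remove only the mean."""
        m = sum(y) / len(y)
        return [v - m for v in y]

    def detrend(y):
        """Specification choice 2: remove a fitted linear trend by OLS."""
        n = len(y)
        tbar = (n - 1) / 2.0
        ybar = sum(y) / n
        slope = (sum((t - tbar) * (y[t] - ybar) for t in range(n))
                 / sum((t - tbar) ** 2 for t in range(n)))
        return [y[t] - ybar - slope * (t - tbar) for t in range(n)]
    ```

    On trend-stationary data, demeaning leaves the trend in the series while detrending removes it, so the two choices can produce very different test statistics, which is the specification sensitivity the abstract highlights.
    
    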

This nep-ets issue is ©2010 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at For comments please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.