nep-ets New Economics Papers
on Econometric Time Series
Issue of 2017‒06‒25
five papers chosen by
Yong Yin
SUNY at Buffalo

  1. Bayesian Unit Root Test for Panel Data By Jitendra Kumar; Anoop Chaturvedi; Umme Afifa
  2. The perils of counterfactual analysis with integrated processes By Carvalho, Carlos Viana de; Masini, Ricardo Pereira; Medeiros, Marcelo C.
  3. Change-in-Mean Tests in Long-memory Time Series: A Review of Recent Developments By Wenger, Kai; Leschinski, Christian; Sibbertsen, Philipp
  4. Time-varying mixed frequency forecasting: A real-time experiment By Stefan Neuwirth
  5. The Correct Regularity Condition and Interpretation of Asymmetry in EGARCH By Chia-Lin Chang; Michael McAleer

  1. By: Jitendra Kumar; Anoop Chaturvedi; Umme Afifa
    Abstract: This paper studies the panel autoregressive (PAR) time series model for testing the unit root hypothesis. The posterior odds ratio (POR) is derived under appropriate prior assumptions, and an empirical analysis is then carried out to test the unit root hypothesis for the Net Asset Value of National Pension Scheme (NPS) funds under different fund managers. The hypothesis is tested for the model with a linear time trend and for the model with a linear time trend plus an augmentation term. With the linear time trend alone, the estimated autoregressive coefficient is far from one, so the test is not performed; with the augmentation term included, the coefficient is close to one, and we therefore carry out the unit root test using the derived POR. In all cases the unit root hypothesis is rejected, so all NPS series are concluded to be trend stationary.
    Keywords: Panel data, Stationarity, Autoregressive time series, Unit root, Posterior odds ratio, New Pension Scheme, Net Asset Value.
    JEL: C11 C12 C22 C23 C39
    Date: 2017–01–02
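The "estimated autoregressive coefficient close to one" diagnostic in the abstract above can be illustrated with a simple frequentist sketch (this is not the paper's Bayesian POR test; the pooled-OLS estimator, panel sizes, and AR(1) design below are illustrative assumptions):

```python
import numpy as np

def simulate_ar1(rho, T, rng):
    """Simulate an AR(1) series y_t = rho * y_{t-1} + e_t with N(0,1) shocks."""
    y = np.empty(T)
    y[0] = rng.standard_normal()
    for t in range(1, T):
        y[t] = rho * y[t - 1] + rng.standard_normal()
    return y

def pooled_ar1_estimate(panel):
    """Pooled OLS estimate of a common AR(1) coefficient across panel units."""
    num = sum(np.dot(y[:-1], y[1:]) for y in panel)
    den = sum(np.dot(y[:-1], y[:-1]) for y in panel)
    return num / den

rng = np.random.default_rng(0)
N, T = 10, 500  # hypothetical panel: 10 units, 500 observations each

stationary = [simulate_ar1(0.7, T, rng) for _ in range(N)]  # clearly mean-reverting
unit_root = [simulate_ar1(1.0, T, rng) for _ in range(N)]   # random walks

rho_stat = pooled_ar1_estimate(stationary)  # near 0.7, far from one
rho_unit = pooled_ar1_estimate(unit_root)   # near one
```

When the pooled estimate is far below one (as for the trend-only model in the abstract), a formal unit root test is uninformative; when it is close to one, a test such as the paper's POR is needed to discriminate.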
  2. By: Carvalho, Carlos Viana de; Masini, Ricardo Pereira; Medeiros, Marcelo C.
    Abstract: Recently, there has been growing interest in developing econometric tools to conduct counterfactual analysis with aggregate data when a "treated" unit suffers an intervention, such as a policy change, and there is no obvious control group. Usually, the proposed methods construct an artificial counterfactual from a pool of "untreated" peers organized in a panel data structure. In this paper, we investigate the consequences of applying such methodologies when the data are integrated processes of order one. We find that without a cointegrating relation (the spurious case) the intervention estimator diverges, so the hypothesis of no intervention effect is rejected whether or not an effect exists. When at least one cointegrating relation exists, the intervention-effect estimator is √T-consistent, albeit with a non-standard distribution; even in this case, however, the test of no intervention effect is extremely oversized if nonstationarity is ignored. When a drift is present in the data generating processes, the estimator in both cases (cointegrated and spurious) either diverges or is not well defined asymptotically. As a final recommendation, we suggest working in first differences to avoid spurious results.
    Date: 2017–06–13
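The spurious-regression mechanism behind the abstract above is easy to reproduce: regressing one random walk on an independent one yields a sizeable R² on average, while the same regression in first differences does not (a minimal simulation sketch; the sample size, replication count, and use of squared correlation as the no-intercept-free R² are illustrative assumptions, not the paper's design):

```python
import numpy as np

def avg_r2(T, reps, rng, first_difference=False):
    """Average R^2 from regressing one simulated series on an independent one.

    Each pair consists of two independent random walks (I(1) processes); with
    first_difference=True the regression uses their first differences instead.
    """
    total = 0.0
    for _ in range(reps):
        x = np.cumsum(rng.standard_normal(T))
        y = np.cumsum(rng.standard_normal(T))
        if first_difference:
            x, y = np.diff(x), np.diff(y)
        total += np.corrcoef(x, y)[0, 1] ** 2  # R^2 of simple OLS of y on x
    return total / reps

rng = np.random.default_rng(7)
r2_levels = avg_r2(T=500, reps=200, rng=rng)                         # sizeable
r2_diffs = avg_r2(T=500, reps=200, rng=rng, first_difference=True)   # near zero
```

The levels regression looks informative even though the series are unrelated, which is the same failure the paper documents for counterfactual estimators without cointegration, and first-differencing removes it.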
  3. By: Wenger, Kai; Leschinski, Christian; Sibbertsen, Philipp
    Abstract: It is well known that standard tests for a mean shift are invalid in long-range dependent time series. Therefore, several long-memory-robust extensions of standard testing principles for a change in mean have been proposed in the literature. These can be divided into two groups: tests that rely on consistent estimates of the long-run variance, and self-normalized test statistics. Here, we review this literature and complement it by deriving a new long-memory-robust version of the sup-Wald test. Beyond the systematic review, we conduct an extensive Monte Carlo study to compare the relative performance of these methods. Special attention is paid to the interaction of the test results with the estimation of the long-memory parameter. Furthermore, we show that the power of self-normalized test statistics can be improved considerably by using an estimator that is robust to mean shifts.
    Keywords: Fractional Integration; Structural Breaks; Long Memory
    JEL: C12 C22
    Date: 2017–06
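The baseline testing principle the abstract above builds on can be sketched with the classical CUSUM statistic (an illustrative textbook version, not one of the paper's robust tests; its naive i.i.d. variance normalization is precisely what fails under long memory and what the surveyed long-run-variance and self-normalized corrections replace):

```python
import numpy as np

def cusum_stat(y):
    """Classical CUSUM change-in-mean statistic: max_k |S_k| / (sigma_hat * sqrt(T)),
    where S_k are the partial sums of the demeaned series.

    The i.i.d. standard deviation estimate used here is invalid under
    long-range dependence, which motivates the robust versions in the paper.
    """
    T = len(y)
    partial_sums = np.cumsum(y - y.mean())
    return np.abs(partial_sums).max() / (y.std(ddof=1) * np.sqrt(T))

rng = np.random.default_rng(3)
T = 1000
noise = rng.standard_normal(T)          # no break
shifted = noise.copy()
shifted[T // 2:] += 1.0                 # mean shift of one at mid-sample

stat_no_break = cusum_stat(noise)       # small, below usual critical values
stat_break = cusum_stat(shifted)        # large, detects the shift
```

Under short memory the no-break statistic converges to the supremum of a Brownian bridge; under long memory its null distribution changes entirely, so the same critical values spuriously reject.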
  4. By: Stefan Neuwirth (KOF Swiss Economic Institute, ETH Zurich, Switzerland)
    Abstract: This paper tests the usefulness of time-varying parameters when forecasting with mixed-frequency data. To this end we compare the forecast performance of bridge equations and unrestricted MIDAS models with constant and time-varying parameters. An out-of-sample forecasting exercise with US real-time data shows that the use of time-varying parameters does not improve forecasts significantly over all vintages. Since the Great Recession, however, forecast errors are smaller when forecasting with bridge equations, owing to the ability of time-varying parameters to incorporate gradual structural changes faster.
    Date: 2017–06
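The unrestricted MIDAS (U-MIDAS) idea mentioned in the abstract above is simply to let each high-frequency observation within a low-frequency period enter as its own regressor (a minimal constant-parameter sketch with simulated data; the monthly/quarterly setup, weights, and noise level are illustrative assumptions, not the paper's real-time design):

```python
import numpy as np

def umidas_design(x_monthly, n_quarters):
    """Unrestricted MIDAS design matrix: quarter q maps to the row
    (x_{3q}, x_{3q+1}, x_{3q+2}), i.e. one column per monthly observation."""
    return np.asarray(x_monthly[:3 * n_quarters]).reshape(n_quarters, 3)

rng = np.random.default_rng(1)
Q = 200                                          # hypothetical quarters
x_monthly = rng.standard_normal(3 * Q)           # high-frequency indicator
X = umidas_design(x_monthly, Q)

beta_true = np.array([0.5, 0.3, 0.2])            # illustrative monthly weights
y = X @ beta_true + 0.1 * rng.standard_normal(Q) # low-frequency target

# OLS with an intercept: one free coefficient per monthly lag, no weighting
# function imposed -- the "unrestricted" part of U-MIDAS.
design = np.column_stack([np.ones(Q), X])
beta_hat, *_ = np.linalg.lstsq(design, y, rcond=None)
```

A bridge equation would instead aggregate the monthly indicator to quarterly frequency first; the time-varying-parameter versions compared in the paper let `beta_hat` drift over time.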
  5. By: Chia-Lin Chang (Department of Applied Economics and Department of Finance, National Chung Hsing University, Taichung, Taiwan); Michael McAleer (Department of Quantitative Finance, National Tsing Hua University, Taiwan; Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid, Spain; and Institute of Advanced Sciences, Yokohama National University, Japan)
    Abstract: In the class of univariate conditional volatility models, the three most popular are the generalized autoregressive conditional heteroskedasticity (GARCH) model of Engle (1982) and Bollerslev (1986), the GJR (or threshold GARCH) model of Glosten, Jagannathan and Runkle (1993), and the exponential GARCH (or EGARCH) model of Nelson (1990, 1991). For the purposes of deriving the mathematical regularity properties, including invertibility, determining the likelihood function for estimation, and establishing the statistical conditions for asymptotic properties, it is convenient to understand the stochastic processes underlying the three univariate models. A random coefficient autoregressive process was used by Tsay (1987) to obtain GARCH, and an extension of it was used by McAleer (2004) to obtain GJR. A random coefficient complex nonlinear moving average process was used by McAleer and Hafner (2014) to obtain EGARCH. These models can capture asymmetry, which denotes the different effects on conditional volatility of positive and negative shocks of equal magnitude, and possibly also leverage, which is the negative correlation between returns shocks and subsequent shocks to volatility (see Black 1976). McAleer (2014) showed that asymmetry is possible for GJR, but not leverage. McAleer and Hafner (2014) showed that leverage is not possible for EGARCH. Surprisingly, the conditions for asymmetry in EGARCH seem to have been ignored in the literature, or the literature has concentrated on incorrect conditions without clear explanation, leading to misleading interpretations. The purpose of this paper is to derive the regularity condition for asymmetry in EGARCH and to provide the correct interpretation. It is shown that, in practice, EGARCH always displays asymmetry, though not leverage.
    Keywords: Conditional volatility models, Random coefficient complex nonlinear moving average process, EGARCH, Asymmetry, Leverage, Regularity condition.
    JEL: C22 C52 C58 G32
    Date: 2017–06
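For reference, a common Nelson-style EGARCH(1,1) parameterization (a standard textbook form, not necessarily the exact specification analyzed in the paper) makes the asymmetry and leverage distinction in the abstract above concrete:

```latex
\log h_t = \omega + \alpha\,\lvert z_{t-1}\rvert + \gamma\, z_{t-1}
         + \beta \log h_{t-1},
\qquad z_t = \varepsilon_t / \sqrt{h_t}.
```

A shock of size $\lvert z \rvert$ moves $\log h_t$ by $\alpha + \gamma$ when positive and $\alpha - \gamma$ when negative, so the response is asymmetric whenever $\gamma \neq 0$. Leverage, i.e. volatility falling after good news and rising after bad news, would additionally require $\alpha + \gamma < 0 < \alpha - \gamma$ in this parameterization, a much stronger condition than mere asymmetry.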

This nep-ets issue is ©2017 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.