nep-ets New Economics Papers
on Econometric Time Series
Issue of 2013‒08‒10
six papers chosen by
Yong Yin
SUNY at Buffalo

  1. Asymmetric volatility spillovers: Revisiting the Diebold-Yilmaz (2009) spillover index with realized semivariance By Jozef Barunik; Evzen Kocenda; Lukas Vacha
  2. A comparison of sequential and information-based methods for determining the co-integration rank in heteroskedastic VAR models By Giuseppe Cavaliere; Luca De Angelis; Anders Rahbek; A.M. Robert Taylor
  3. A comparison between Tau-d and the procedure TRAMO-SEATS is also included. By Gabriel Rodriguez; Dionisio Ramirez
  4. A Comparative Note About Estimation of the Fractional Parameter under Additive Outliers By Gabriel Rodriguez
  5. A Note on the Size of the ADF Test with Additive Outliers and Fractional Errors. A Reappraisal about the (Non) Stationarity of the Latin-American Inflation Series. By Gabriel Rodriguez; Dionisio Ramirez
  6. Adapting the Hodrick-Prescott Filter for Very Small Open Economies By Grech, Aaron George

  1. By: Jozef Barunik; Evzen Kocenda; Lukas Vacha
    Abstract: Based on the negative and positive realized semivariances developed in Barndorff-Nielsen et al. (2010), we modify the volatility spillover index devised in Diebold and Yilmaz (2009). The resulting asymmetric volatility spillover indices are easy to compute and account well for the negative and positive parts of volatility. We apply the modified indices to the 30 U.S. stocks with the highest market capitalization over the period 2004-2011 to study intra-market spillovers. We provide evidence of sizable volatility-spillover asymmetries and a markedly different pattern of spillovers during periods of economic ups and downs.
    Date: 2013–08
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1308.1221&r=ets
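    The semivariance decomposition underlying the modified index can be sketched as follows (a minimal illustration of the Barndorff-Nielsen et al. idea; the function name and sample returns are hypothetical, not from the paper):

```python
import numpy as np

def realized_semivariances(returns):
    """Split realized variance into its negative and positive parts:
    RS- sums squared negative intraday returns, RS+ sums squared
    positive ones, so RS- + RS+ equals the realized variance."""
    r = np.asarray(returns, dtype=float)
    rs_neg = np.sum(r[r < 0] ** 2)
    rs_pos = np.sum(r[r > 0] ** 2)
    return rs_neg, rs_pos

# Example: five illustrative intraday returns
r = [0.01, -0.02, 0.005, -0.01, 0.015]
rs_neg, rs_pos = realized_semivariances(r)
```

    The asymmetric spillover indices are then obtained by feeding RS- and RS+, rather than total realized variance, into the Diebold-Yilmaz forecast-error-variance-decomposition machinery.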
  2. By: Giuseppe Cavaliere (Università di Bologna); Luca De Angelis (Università di Bologna); Anders Rahbek (University of Copenhagen); A.M. Robert Taylor (University of Nottingham)
    Abstract: In this paper we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular we compare the efficacy of the most widely used information criteria, such as AIC and BIC, with the commonly used sequential approach of Johansen (1996) based around the use of either asymptotic or wild bootstrap-based likelihood ratio type tests. Complementing recent work done for the latter in Cavaliere, Rahbek and Taylor (2013, Econometric Reviews, forthcoming), we establish the asymptotic properties of the procedures based on information criteria in the presence of heteroskedasticity (conditional or unconditional) of a quite general and unknown form. The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms of their frequency of selecting the correct co-integration rank across different values of the co-integration rank, sample size, stationary dynamics and models of heteroskedasticity. Of these the wild bootstrap procedure is perhaps the more reliable overall since it avoids a significant tendency seen in the BIC-based method to over-estimate the co-integration rank in relatively small sample sizes.
    Keywords: Co-integration; Wild bootstrap; Trace statistic; Information criteria; Rank determination; Heteroskedasticity.
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:bot:quadip:121&r=ets
  3. By: Gabriel Rodriguez (Departamento de Economía - Pontificia Universidad Católica del Perú); Dionisio Ramirez (Universidad Castilla La Mancha)
    Abstract: Perron and Rodríguez (2003) claimed that their procedure to detect additive outliers (Tau-d) is powerful even under departures from the unit root case. In this note, we use Monte Carlo simulations to show that Tau-d is powerful when we have ARFIMA(p, d, q) errors. Using simulations, we calculate the expected number of additive outliers found in this context and the number of times that Tau-d identifies the true location of the additive outliers. The results indicate that the power of the procedure Tau-d depends on the size of the additive outliers: when the DGP contains large additive outliers, Tau-d detects their location correctly 100.0% of the time.
    Keywords: Additive Outliers, ARFIMA Errors, Detection of Additive Outliers.
    JEL: C2 C3 C5
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:pcp:pucwps:wp00355&r=ets
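    The intuition behind a first-difference-based detector like Tau-d can be sketched as follows. This is a simplified stand-in, not the Perron-Rodríguez statistic: it uses a robust MAD scale and an illustrative threshold rather than their critical values, exploiting the fact that an additive outlier contaminates two consecutive first differences with opposite signs:

```python
import numpy as np

def detect_ao_first_diff(y, c=3.5):
    """Flag first differences that are unusually large relative to a
    robust scale estimate. An additive outlier at time t produces two
    opposite-signed spikes in Delta y (at t-1 and t in diff indexing).
    The threshold c is illustrative, not a tabulated critical value."""
    dy = np.diff(y)
    med = np.median(dy)
    scale = 1.4826 * np.median(np.abs(dy - med))  # robust sigma via MAD
    return np.flatnonzero(np.abs(dy - med) > c * scale)

# Random walk contaminated by one large additive outlier at t = 50
rng = np.random.default_rng(1)
y = np.cumsum(rng.standard_normal(100))
y[50] += 10.0  # a "large" outlier in the paper's sense
hits = detect_ao_first_diff(y)
```

    The two flagged differences bracketing t = 50 identify the outlier's location, mirroring the 100% detection rate the note reports for large outliers.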
  4. By: Gabriel Rodriguez (Departamento de Economía - Pontificia Universidad Católica del Perú)
    Abstract: In a recent paper, Fajardo et al. (2009) propose an alternative semiparametric estimator of the fractional parameter in ARFIMA models which is robust to the presence of additive outliers. The results are very interesting; however, they use samples of 300 or 800 observations, which are rarely available in macroeconomics or economics. To perform a comparison, I use the procedure for detecting additive outliers based on the statistic Tau-d suggested by Perron and Rodríguez (2003), and then use dummy variables associated with the location of the selected outliers to estimate the fractional parameter. I find better results for the mean and bias of this parameter when T = 100, while the results in terms of the standard deviation and the MSE are very similar. However, for larger sample sizes such as 300 or 800, the robust procedure performs better, especially on the standard deviation and MSE measures. Empirical applications to seven Latin American inflation series with very small sample sizes contaminated by additive outliers are discussed. We find that when no correction for additive outliers is performed, the fractional parameter is underestimated.
    Keywords: Additive Outliers, ARFIMA Errors, semiparametric estimation.
    JEL: C2 C3 C5
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:pcp:pucwps:wp00356&r=ets
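    A standard semiparametric estimator of the fractional parameter of the kind being compared here is the Geweke-Porter-Hudak (GPH) log-periodogram regression; a minimal sketch (bandwidth choice and helper name are illustrative, and this is the plain estimator, not Fajardo et al.'s robust variant):

```python
import numpy as np

def gph_estimate(y, alpha=0.5):
    """GPH log-periodogram estimate of the fractional parameter d:
    regress log I(lambda_j) on -log(4 sin^2(lambda_j / 2)) over the
    first m = T^alpha Fourier frequencies; the slope estimates d."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    m = int(T ** alpha)
    j = np.arange(1, m + 1)
    lam = 2 * np.pi * j / T
    # Periodogram of the demeaned series at the Fourier frequencies
    dft = np.fft.fft(y - y.mean())
    I = np.abs(dft[1:m + 1]) ** 2 / (2 * np.pi * T)
    x = -np.log(4 * np.sin(lam / 2) ** 2)
    slope = np.polyfit(x, np.log(I), 1)[0]
    return slope

# White noise has d = 0, so the estimate should be near zero
rng = np.random.default_rng(2)
d_hat = gph_estimate(rng.standard_normal(5000))
```

    An unmodelled additive outlier inflates the periodogram at high frequencies, which is precisely the mechanism behind the downward bias in d that the paper documents.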
  5. By: Gabriel Rodriguez (Departamento de Economía - Pontificia Universidad Católica del Perú); Dionisio Ramirez (Universidad Castilla La Mancha)
    Abstract: This note analyzes the empirical size of the augmented Dickey-Fuller (ADF) statistic proposed by Perron and Rodríguez (2003) when the errors are fractional. This ADF statistic is based on a procedure, named Tau-d, that searches for additive outliers in the first differences of the data. Simulations show that the empirical size of the ADF statistic is not affected by fractional errors, confirming the claim of Perron and Rodríguez (2003) that the procedure Tau-d is robust to departures from the unit root framework. In particular, the results show low sensitivity of the size of the ADF statistic with respect to the fractional parameter (d). However, as expected, when there is strong negative moving-average or negative autoregressive autocorrelation, the ADF statistic is oversized. These difficulties are fixed as the sample increases (from T = 100 to T = 200). An empirical application to eight quarterly Latin-American inflation series is also provided, showing the importance of including dummy variables for the detected additive outliers.
    Keywords: Additive Outliers, ARFIMA Errors, ADF test
    JEL: C2 C3 C5
    Date: 2013
    URL: http://d.repec.org/n?u=RePEc:pcp:pucwps:wp00357&r=ets
  6. By: Grech, Aaron George
    Abstract: The Hodrick-Prescott (HP) filter is a commonly used method, particularly in potential output studies. However, its suitability depends on a number of conditions. Very small open economies do not satisfy these, as their macroeconomic series exhibit pronounced trends, large fluctuations and recurrent breaks. Consequently, the use of the filter results in random changes in the output gap that are out of line with the concept of equilibrium. Two suggestions are put forward. The first involves defining the upper and lower bounds of a series and determining equilibrium as a weighted average of the filter applied separately to these bounds. The second involves an integration of structural features into the standard filter to allow researchers to set limits on the impact of structural/temporary shocks and to allow for lengthy periods of disequilibrium. This paper shows that these methods can result in a smoother output gap series for the smallest Euro Area economies.
    Keywords: Potential output, output gap, Hodrick-Prescott filter, detrending, business cycles, small open economies
    JEL: B41 C1 E32 F41
    Date: 2013–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:48803&r=ets
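    The baseline filter the paper modifies is available in statsmodels; a minimal sketch (the simulated series and its parameters are illustrative):

```python
import numpy as np
from statsmodels.tsa.filters.hp_filter import hpfilter

# Quarterly "log output"-like series: linear trend + cycle + noise
rng = np.random.default_rng(5)
t = np.arange(120)
y = 0.005 * t + 0.01 * np.sin(2 * np.pi * t / 24) \
    + 0.002 * rng.standard_normal(120)

# lamb=1600 is the conventional smoothing parameter for quarterly data
cycle, trend = hpfilter(y, lamb=1600)
```

    The decomposition is exact (cycle + trend reconstructs the series). The paper's first suggestion would instead apply the filter separately to the upper and lower bounds of the series and take a weighted average of the two resulting trends.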

This nep-ets issue is ©2013 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.