nep-ets New Economics Papers
on Econometric Time Series
Issue of 2013–07–28
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. A Nonparametric Study of Real Exchange Rate Persistence over a Century By Hyeongwoo Kim; Deockhyun Ryu
  2. Mixed-correlated ARFIMA processes for power-law cross-correlations By Ladislav Kristoufek
  3. Time series models with an EGB2 conditional distribution By M. Caivano; A. Harvey
  4. We propose a technique to avoid spurious detections of jumps in high-frequency data via an explicit thresholding on available test statistics. By Pierre BAJGROWICZ; Olivier SCAILLET
  5. Selection Criteria in Regime Switching Conditional Volatility Models By Thomas Chuffart
  6. Estimation and Inference for Varying-coefficient Models with Nonstationary Regressors using Penalized Splines By Haiqiang Chen; Ying Fang; Yingxing Li
  7. Robust Estimation and Inference for Threshold Models with Integrated Regressors By Haiqiang Chen
  8. A Test for the Null of Multiple Cointegrating Vectors By Javier Fernandez-Macho
  9. A wavelet approach to multiple cointegration testing By Javier Fernandez-Macho

  1. By: Hyeongwoo Kim; Deockhyun Ryu
    Abstract: This paper estimates the degree of persistence of 16 long-horizon real exchange rates relative to the US dollar. We use nonparametric operational algorithms by El-Gamal and Ryu (2006) for general nonlinear models based on two statistical notions: short memory in mean (SMM) and short memory in distribution (SMD). We find substantially shorter maximum half-life (MHL) estimates than their counterparts from linear models, a result that is robust to the choice of bandwidth, with the exceptions of Canada and Japan.
    Keywords: Real Exchange Rate; Purchasing Power Parity; Short Memory in Mean; Short-Memory in Distribution; mixing; Max Half-Life; Max Quarter-Life
    JEL: C14 C15 C22 F31 F41
    Date: 2013–07
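The half-life notion above has a simple parametric benchmark: for an AR(1) deviation with coefficient rho, the half-life is ln(0.5)/ln(rho). A minimal simulated sketch (generic, not the authors' nonparametric SMM/SMD algorithms; all numbers are illustrative):

```python
import numpy as np

# Illustrative sketch (not the paper's nonparametric method): the classic
# parametric half-life of a persistent deviation comes from an AR(1) fit,
# half_life = ln(0.5) / ln(rho). Data below are simulated.
rng = np.random.default_rng(0)

T, rho = 5000, 0.9
q = np.zeros(T)
for t in range(1, T):                      # persistent real-exchange-rate deviation
    q[t] = rho * q[t - 1] + rng.normal()

y, x = q[1:], q[:-1]
rho_hat = (x @ y) / (x @ x)                # OLS AR(1) coefficient
half_life = np.log(0.5) / np.log(rho_hat)  # periods for a shock to decay by half
```

With rho = 0.9 the true half-life is ln(0.5)/ln(0.9), roughly 6.6 periods; the paper's point is that nonparametric methods often find much shorter maximum half-lives than such linear fits.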
  2. By: Ladislav Kristoufek
    Abstract: We introduce a general framework of Mixed-correlated ARFIMA (MC-ARFIMA) processes which allows for various specifications of univariate and bivariate long-term memory. Apart from the standard case when $H_{xy}=\frac{1}{2}(H_x+H_y)$, MC-ARFIMA also allows for processes with $H_{xy}<\frac{1}{2}(H_x+H_y)$, as well as for long-range correlated processes which are either short-range cross-correlated or simply correlated. The major contribution of MC-ARFIMA lies in the fact that the processes have well-defined asymptotic properties for $H_x$, $H_y$ and $H_{xy}$, which are derived in the paper, so that the processes can be used in simulation studies comparing various estimators of the bivariate Hurst exponent $H_{xy}$. Moreover, the framework allows for modeling of processes which are found to have $H_{xy}<\frac{1}{2}(H_x+H_y)$.
    Date: 2013–07
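A toy version of the building block involved (an assumption-laden sketch, not the MC-ARFIMA construction itself): an ARFIMA(0,d,0) process can be simulated from its truncated MA(∞) form, and feeding two such processes the same shocks induces long-range cross-correlation:

```python
import numpy as np

# Hedged sketch: a plain ARFIMA(0,d,0) simulator via its truncated MA(inf)
# representation, x_t = sum_k psi_k * eps_{t-k}, with psi_0 = 1 and the
# recursion psi_k = psi_{k-1} * (k - 1 + d) / k. The paper's MC-ARFIMA
# framework couples two such processes through correlated innovations;
# here we simply give both series identical shocks.
rng = np.random.default_rng(1)

def arfima0d0(eps, d, n_lags=500):
    psi = np.empty(n_lags)
    psi[0] = 1.0
    for k in range(1, n_lags):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return np.convolve(eps, psi)[: len(eps)]   # truncated MA filter of the shocks

T = 4000
eps = rng.normal(size=T)
x = arfima0d0(eps, d=0.3)            # long memory: H_x = d + 1/2 = 0.8
y = arfima0d0(eps, d=0.1)            # H_y = 0.6; shared shocks cross-correlate them
r = np.corrcoef(x, y)[0, 1]
acf1 = np.corrcoef(x[1:], x[:-1])[0, 1]
```

The lag-1 autocorrelation of an ARFIMA(0,d,0) process is d/(1-d), so for d = 0.3 we expect roughly 0.43 in large samples.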
  3. By: M. Caivano; A. Harvey
    Abstract: A time series model in which the signal is buried in noise that is non-Gaussian may throw up observations that, when judged by the Gaussian yardstick, are outliers. We describe an observation driven model, based on an exponential generalized beta distribution of the second kind (EGB2), in which the signal is a linear function of past values of the score of the conditional distribution. This specification produces a model that is not only easy to implement, but which also facilitates the development of a comprehensive and relatively straightforward theory for the asymptotic distribution of the maximum likelihood estimator. The model is fitted to US macroeconomic time series and compared with Gaussian and Student-t models. A theory is then developed for an EGARCH model based on the EGB2 distribution and the model is fitted to exchange rate data. Finally, dynamic location and scale models are combined and applied to data on the UK rate of inflation.
    Keywords: Beta distribution; EGARCH; fat tails; score; robustness; Student's t; Winsorizing
    JEL: C22 G17
    Date: 2013–07–17
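The score-driven mechanism the abstract describes can be sketched in a few lines. Assumption: the paper uses the EGB2 distribution; for brevity this toy filter uses a Student-t score instead, which shares the key property that the update is a bounded function of the prediction error, so outliers barely move the filtered location:

```python
import numpy as np

# Hedged analogue (NOT the paper's EGB2 model): a score-driven location
# filter with a Student-t conditional distribution. The location mu_t is
# updated by a term proportional to the score of the t density,
#   mu_{t+1} = mu_t + kappa * u_t,  u_t = e_t / (1 + e_t^2 / (nu * s2)),
# where e_t = y_t - mu_t. The bounded score downweights outliers.
rng = np.random.default_rng(2)

nu, s2, kappa = 5.0, 1.0, 0.1
y = 3.0 + rng.standard_t(df=nu, size=2000)   # noisy data around a level of 3
y[500] += 50.0                               # one extreme outlier

mu = 0.0
for obs in y:
    e = obs - mu
    u = e / (1.0 + e * e / (nu * s2))        # bounded score: the outlier barely moves mu
    mu += kappa * u
```

After the sample is processed, the filtered location sits near the true level of 3 despite the injected outlier, which is exactly the robustness property the abstract attributes to heavy-tailed score-driven models.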
  4. By: Pierre BAJGROWICZ (University of Geneva); Olivier SCAILLET (University of Geneva and Swiss Finance Institute)
    Abstract: We propose a technique to avoid spurious detections of jumps in high-frequency data via an explicit thresholding on available test statistics. We prove that it eliminates asymptotically all spurious detections. Monte Carlo results show that it also performs well in finite samples. In Dow Jones stocks, spurious detections represent up to 50% of the jumps detected initially between 2006 and 2008. For the majority of stocks, jumps do not cluster in time and no cojump affects all stocks simultaneously, suggesting jump risk is diversifiable. We relate the remaining jumps to macroeconomic news, prescheduled company-specific announcements, and stories from news agencies which include a variety of unscheduled and uncategorized events. The majority of news items do not cause jumps. One exception is share buyback announcements. Fed rate news has an important impact but rarely causes jumps. Another finding is that 60% of jumps occur without any news event. For one third of the jumps with no news we observe unusual behavior in the volume of transactions. Hence, liquidity pressures are probably another important factor driving jumps.
    Keywords: jumps, high-frequency data, spurious detections, jumps dynamics, news releases, cojumps
    JEL: C58 G12 G14
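A stylized illustration of the multiple-testing issue behind the paper (an assumption-laden sketch, not the authors' procedure): standardize each return by a local volatility proxy, then compare the statistic to a cutoff that grows with the number of tests, so that false detections vanish:

```python
import numpy as np
from scipy.stats import norm

# Stylized sketch (not the authors' thresholding rule): flag a jump when a
# locally standardized return exceeds a Bonferroni-style cutoff. With n tests,
# a fixed normal quantile would generate many spurious detections; scaling the
# cutoff with n suppresses them.
rng = np.random.default_rng(3)

n = 5000
r = 0.01 * rng.normal(size=n)            # diffusive intraday returns
r[2500] += 0.15                          # one genuine jump

K = 100                                  # local window for the volatility proxy
sigma = np.array([r[max(0, i - K):i].std() if i > 1 else r[:K].std()
                  for i in range(n)])
stat = np.abs(r) / sigma                 # standardized jump statistic
cutoff = norm.ppf(1 - 0.01 / n)          # conservative, n-dependent threshold
jumps = np.flatnonzero(stat > cutoff)
```

With the n-dependent cutoff, only the injected jump survives; a fixed cutoff such as 3 would typically flag several purely diffusive returns as well.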
  5. By: Thomas Chuffart (AMSE - Aix-Marseille School of Economics - Aix-Marseille Univ. - Centre national de la recherche scientifique (CNRS) - École des Hautes Études en Sciences Sociales [EHESS] - Ecole Centrale Marseille (ECM))
    Abstract: A large number of nonlinear conditional heteroskedastic models have been proposed in the literature, and practitioners do not always have the tools to choose the correct specification. In this article, our main interest is whether the usual choice criteria lead them to the correct specification in a regime-switching framework. We focus on two types of models: the Logistic Smooth Transition GARCH model and Markov-Switching GARCH models. Through simulation experiments, we show that information criteria and loss functions can lead practitioners to a misspecified model. Indeed, depending on the Data Generating Process used in the experiment, choosing a criterion to select a model is a difficult issue. We argue that when selection criteria lead to the wrong model, this is largely due to the difficulty of estimating such models by quasi-maximum likelihood (QMLE).
    Keywords: conditional volatility; model selection; GARCH; regime switching
    Date: 2013–07
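The selection mechanics being stress-tested above can be sketched in miniature (a generic illustration, not the paper's regime-switching experiment): fit competing volatility models by QMLE and let an information criterion pick between them:

```python
import numpy as np
from scipy.optimize import minimize

# Hedged sketch (not the paper's experiment): simulate a GARCH(1,1), then let
# BIC choose between a constant-variance model and a GARCH(1,1) fitted by QMLE.
rng = np.random.default_rng(4)

T, omega, alpha, beta = 3000, 0.1, 0.1, 0.8
h, y = np.empty(T), np.empty(T)
h[0] = omega / (1 - alpha - beta)               # unconditional variance
y[0] = np.sqrt(h[0]) * rng.normal()
for t in range(1, T):
    h[t] = omega + alpha * y[t - 1] ** 2 + beta * h[t - 1]
    y[t] = np.sqrt(h[t]) * rng.normal()

def neg_loglik(params):
    w, a, b = params
    ht = np.empty(T)
    ht[0] = y.var()
    for t in range(1, T):
        ht[t] = w + a * y[t - 1] ** 2 + b * ht[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * ht) + y ** 2 / ht)

res = minimize(neg_loglik, x0=[0.05, 0.05, 0.9], method="L-BFGS-B",
               bounds=[(1e-6, None), (1e-6, 1.0), (1e-6, 1.0)])
bic_garch = 2 * res.fun + 3 * np.log(T)          # 3 estimated parameters
ll_const = -0.5 * np.sum(np.log(2 * np.pi * y.var()) + y ** 2 / y.var())
bic_const = -2 * ll_const + 1 * np.log(T)        # 1 estimated parameter
```

In this clean two-model comparison BIC reliably picks the GARCH specification; the paper's message is that the same criteria become unreliable once the candidate set contains smooth-transition and Markov-switching GARCH models that are hard to estimate by QMLE.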
  6. By: Haiqiang Chen; Ying Fang; Yingxing Li
    Abstract: This paper considers estimation and inference for varying-coefficient models with nonstationary regressors. We propose a nonparametric estimation method using penalized splines, which achieves the same optimal convergence rate as kernel-based methods, but enjoys computation advantages. Utilizing the mixed model representation of penalized splines, we develop a likelihood ratio test statistic for checking the stability of the regression coefficients. We derive both the exact and the asymptotic null distributions of this test statistic. We also demonstrate its optimality by examining its local power performance. These theoretical findings are well supported by simulation studies.
    Keywords: Nonstationary Time Series; Varying-coefficient Model; Likelihood Ratio Test; Penalized Splines
    JEL: C12 C14 C22
    Date: 2013–07
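The computational appeal of penalized splines mentioned above reduces to ridge regression on a spline basis. A generic sketch (ordinary smoothing with a truncated-power basis, not the paper's nonstationary-regressor setting; the penalty weight is an illustrative choice):

```python
import numpy as np

# Hedged sketch of the penalized-spline idea (generic, not the paper's model):
# regress on a truncated-power spline basis with a ridge penalty on the knot
# coefficients. The "mixed model" representation the paper exploits treats
# exactly those penalized coefficients as random effects.
rng = np.random.default_rng(5)

n = 400
x = np.sort(rng.uniform(0, 1, n))
f = np.sin(2 * np.pi * x)                     # true smooth coefficient curve
y = f + 0.3 * rng.normal(size=n)

knots = np.linspace(0, 1, 22)[1:-1]           # 20 interior knots
X = np.column_stack([np.ones(n), x] +         # unpenalized intercept and slope
                    [np.maximum(x - k, 0.0) for k in knots])
lam = 1.0                                     # illustrative penalty weight
D = np.diag([0.0, 0.0] + [1.0] * len(knots))  # penalize only knot coefficients
coef = np.linalg.solve(X.T @ X + lam * D, X.T @ y)
fit = X @ coef
rmse = np.sqrt(np.mean((fit - f) ** 2))
```

The single linear solve replaces the local fits a kernel method would need at every evaluation point, which is the computational advantage the abstract refers to.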
  7. By: Haiqiang Chen
    Abstract: This paper studies the robust estimation and inference of threshold models with integrated regressors. We derive the asymptotic distribution of the profiled least squares (LS) estimator under the diminishing threshold effect assumption that the size of the threshold effect converges to zero. Depending on how rapidly this sequence converges, the model may be identified or only weakly identified and asymptotic theorems are developed for both cases. As the convergence rate is unknown in practice, a model-selection procedure is applied to determine the model identification strength and to construct robust confidence intervals, which have the correct asymptotic size irrespective of the magnitude of the threshold effect. The model is then generalized to incorporate endogeneity and serial correlation in error terms, under which we design a Cochrane-Orcutt feasible generalized least squares (FGLS) estimator which enjoys efficiency gains and robustness against different error specifications, including both I(0) and I(1) errors. Based on this FGLS estimator, we further develop a sup-Wald statistic to test for the existence of the threshold effect. Monte Carlo simulations show that our estimators and test statistics perform well.
    Keywords: Threshold effects; Integrated processes; Nonlinear cointegration; Weak identification.
    JEL: C12 C22 C52
    Date: 2013–07
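Profiled least squares for a threshold model has a compact generic form (an illustrative sketch with a stationary regressor; the paper's integrated-regressor asymptotics are what make its setting hard): for each candidate threshold, fit the two regimes by OLS and keep the candidate with the smallest sum of squared residuals:

```python
import numpy as np

# Hedged sketch (assumption: stationary regressor, unlike the paper's
# integrated case): profiled least squares for a threshold regression.
rng = np.random.default_rng(6)

n, gamma0 = 1000, 0.5
q = rng.uniform(0, 1, n)                 # threshold variable
x = rng.normal(size=n)
y = np.where(q <= gamma0, 1.0 * x, 3.0 * x) + 0.5 * rng.normal(size=n)

def ssr(gamma):
    """Sum of squared residuals after regime-wise OLS at threshold gamma."""
    out = 0.0
    for mask in (q <= gamma, q > gamma):
        b = (x[mask] @ y[mask]) / (x[mask] @ x[mask])
        out += np.sum((y[mask] - b * x[mask]) ** 2)
    return out

grid = np.quantile(q, np.linspace(0.15, 0.85, 200))   # trimmed candidate set
gamma_hat = grid[np.argmin([ssr(g) for g in grid])]
```

Trimming the candidate set away from the extreme quantiles keeps both regimes nonempty, a standard device in threshold estimation.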
  8. By: Javier Fernandez-Macho
    Abstract: This paper examines a test for the null of cointegration in a multivariate system based on the discrepancy between the OLS estimator of the full set of n cointegrating relationships in the n + k system and the OLS estimator of the corresponding relationships among first differences without making specific assumptions about the short-run dynamics of the multivariate data generating process.  It is shown that the proposed test statistics are asymptotically distributed as standard chi-square with n + k degrees of freedom and are not affected by the inclusion of deterministic terms or dynamic regressors, thus offering a simple way of testing for cointegration under the null without the need of special tables.  Small sample critical values for these statistics are tabulated using Monte Carlo simulation and it is shown that these non residual-based tests exhibit appropriate size and good power even for quite general error dynamics.  In fact, simulation results suggest that they perform quite reasonably when compared to other tests of the null of cointegration.
    Keywords: Brownian motion, cointegration, econometric methods, integrated process, multivariate analysis, time series models, unit root
    JEL: C22 C12
    Date: 2013–06–01
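The discrepancy idea behind the test can be illustrated on simulated data (a toy version, not the paper's statistic or its chi-square calibration): under cointegration, OLS on levels and OLS on first differences estimate the same coefficient, so their gap should be small:

```python
import numpy as np

# Hedged illustration (not the paper's test statistic): compare the levels
# and first-differences OLS estimates of a cointegrating coefficient. Under
# the null of cointegration both are consistent for the same beta.
rng = np.random.default_rng(7)

T, beta = 2000, 2.0
x = np.cumsum(rng.normal(size=T))        # I(1) regressor
u = rng.normal(size=T)                   # stationary equilibrium error
y = beta * x + u                         # cointegrated pair

b_lev = (x @ y) / (x @ x)                # superconsistent levels estimate
dx, dy = np.diff(x), np.diff(y)
b_dif = (dx @ dy) / (dx @ dx)            # differences estimate
gap = abs(b_lev - b_dif)
```

Under a spurious (non-cointegrated) regression the levels estimate does not converge and the gap stays large, which is the discrepancy the paper's statistics formalize.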
  9. By: Javier Fernandez-Macho
    Abstract: This paper introduces a class of cointegration tests based on estimated low-pass and high-pass regression coefficients from the same wavelet transform of the original time series data.  The procedure can be applied to test the null of cointegration in a n + k multivariate system with n cointegrating relationships without the need for either detrending or differencing.  The proposed non residual-based wavelet statistics are asymptotically distributed as standard chi-square with nk degrees of freedom regardless of deterministic terms or dynamic regressors, thus offering a simple way of testing for cointegration under the null without the need of special tables.  Small sample quantiles for these wavelet statistics are obtained using Monte Carlo simulation in different situations including I(1) and higher order cointegration cases and it is shown that these wavelet tests exhibit appropriate size and good power when compared to other tests of the null of cointegration.
    Keywords: Brownian motion, cointegration, econometric methods, integrated process, multivariate analysis, spectral analysis, time series models, unit roots, wavelet analysis
    JEL: C22 C12
    Date: 2013–07–11
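The low-pass/high-pass split the abstract relies on is just a wavelet decomposition. A generic one-level Haar transform shows the mechanics (this is the standard transform, not the paper's test):

```python
import numpy as np

# Hedged sketch: one level of the Haar wavelet transform splits a series into
# low-pass (smooth) and high-pass (detail) coefficients; the paper contrasts
# regressions on the two bands. For an I(1) series, almost all energy sits in
# the low-pass band.
rng = np.random.default_rng(8)

z = np.cumsum(rng.normal(size=1024))             # an I(1) series
low = (z[0::2] + z[1::2]) / np.sqrt(2)           # scaling (low-pass) coefficients
high = (z[0::2] - z[1::2]) / np.sqrt(2)          # wavelet (high-pass) coefficients

# Perfect reconstruction from the two bands
rec = np.empty_like(z)
rec[0::2] = (low + high) / np.sqrt(2)
rec[1::2] = (low - high) / np.sqrt(2)
```

Because the transform is orthogonal, no information is lost, so level-based and difference-based behavior of the series can be examined jointly from one decomposition.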

This nep-ets issue is ©2013 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.