nep-ets New Economics Papers
on Econometric Time Series
Issue of 2014‒12‒24
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. A Nonparametric Study of Real Exchange Rate Persistence over a Century By Hyeongwoo Kim; Deockhyun Ryu
  2. Testing against Changing Correlation By Andrew Harvey; Stephen Thiele
  3. On Maximum Likelihood estimation of dynamic panel data models By Maurice J.G. Bun; Martin A. Carree; Arturas Juodis
  4. Empirical Likelihood Confidence Intervals for Nonparametric Nonlinear Nonstationary Regression Models By YABE, Ryota
  5. Asymptotic Distribution of the Conditional Sum of Squares Estimator Under Moderate Deviation From a Unit Root in MA(1) By YABE, Ryota
  6. On the Invertibility of EGARCH By Guillaume Gaetan Martinet; Michael McAleer
  7. Window Selection for Out-of-Sample Forecasting with Time-Varying Parameters By Inoue, Atsushi; Jin, Lu; Rossi, Barbara
  8. Panel Data Analysis with Heterogeneous Dynamics By Ryo Okui; Takahide Yanagi
  9. Unit Root Tests, Size Distortions, and Cointegrated Data By W. Robert Reed
  10. Specific Markov-switching behaviour for ARMA parameters By CARPANTIER, Jean-François; DUFAYS, Arnaud
  11. Forecasting Long Memory Series Subject to Structural Change: A Two-Stage Approach By Gustavo Fruet Dias; Fotis Papailias
  12. On non-standard limits of Brownian semi-stationary processes By Kerstin Gärtner; Mark Podolskij

  1. By: Hyeongwoo Kim; Deockhyun Ryu
    Abstract: This paper estimates the degree of persistence of 16 long-horizon real exchange rates relative to the US dollar. We use the nonparametric operational algorithms of El-Gamal and Ryu (2006) for general nonlinear models, based on two statistical notions: short memory in mean (SMM) and short memory in distribution (SMD). We find substantially shorter maximum half-life (MHL) estimates than their counterparts from linear models. Our results are robust to the choice of bandwidth, with a few exceptions.
    Keywords: Real Exchange Rate; Purchasing Power Parity; Short Memory in Mean; Short Memory in Distribution; Mixing; Max Half-Life; Max Quarter-Life
    JEL: C14 C15 C22 F31 F41
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2014-15&r=ets
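For readers wanting the linear benchmark that the paper's nonparametric MHL estimates are compared against, the half-life of deviations under a fitted AR(1) is ln(0.5)/ln(rho). A minimal numpy sketch (this is only the standard linear benchmark, not the El-Gamal and Ryu algorithm):

```python
import numpy as np

def ar1_half_life(x):
    """Half-life of deviations under a fitted AR(1): ln(0.5) / ln(rho).

    This is the conventional linear persistence measure; the paper's
    nonparametric maximum half-life (MHL) is computed differently.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    # OLS slope of x_t on x_{t-1}
    rho = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    if not 0.0 < rho < 1.0:
        return np.inf  # no mean reversion in the linear fit
    return np.log(0.5) / np.log(rho)
```

For a series generated with rho = 0.5 the true half-life is exactly 1 period.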
  2. By: Andrew Harvey; Stephen Thiele
    Abstract: A test for time-varying correlation is developed within the framework of a dynamic conditional score (DCS) model for both Gaussian and Student t-distributions. The test may be interpreted as a Lagrange multiplier test and modified to allow for the estimation of models for time-varying volatility in the individual series. Unlike standard moment-based tests, the score-based test statistic includes information on the level of correlation under the null hypothesis and local power arguments indicate the benefits of doing so. A simulation study shows that the performance of the score-based test is strong relative to existing tests across a range of data generating processes. An application to the Hong Kong and South Korean equity markets shows that the new test reveals changes in correlation that are not detected by the standard moment-based test.
    Keywords: Dynamic conditional score, EGARCH, Lagrange multiplier test, Portmanteau test, Time-varying covariance matrices.
    JEL: C14 C22 F36
    Date: 2014–11–28
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1439&r=ets
  3. By: Maurice J.G. Bun; Martin A. Carree; Arturas Juodis
    Abstract: We analyze the finite sample properties of maximum likelihood estimators for dynamic panel data models. In particular, we consider Transformed Maximum Likelihood (TML) and Random effects Maximum Likelihood (RML) estimation. We show that TML and RML estimators are solutions to a cubic first-order condition in the autoregressive parameter. Furthermore, in finite samples both likelihood estimators might lead to a negative estimate of the variance of the individual specific effects. We consider different approaches taking into account the non-negativity restriction for the variance. We show that these approaches may lead to a boundary solution different from the unique global unconstrained maximum. In an extensive Monte Carlo study we find that this boundary solution issue is non-negligible for small values of T and that different approaches might lead to substantially different finite sample properties. Furthermore, we find that the Likelihood Ratio statistic provides size control in small samples, albeit with low power due to the flatness of the log-likelihood function. We illustrate these issues modeling U.S. state level unemployment dynamics.
    Date: 2014–12–16
    URL: http://d.repec.org/n?u=RePEc:ame:wpaper:1404&r=ets
  4. By: YABE, Ryota
    Abstract: By using the empirical likelihood (EL), we consider the construction of pointwise confidence intervals (CIs) for nonparametric nonlinear nonstationary regression models with nonlinear nonstationary heterogeneous errors. It is well known that the EL-based CI has attractive properties such as data dependency and automatic studentization in cross-sectional and weak-dependence models. We extend EL theory to the nonparametric nonlinear nonstationary regression model and show that the log-EL ratio converges to a chi-squared random variable with one degree of freedom. This means that Wilks' theorem holds even if the covariate follows a nonstationary process. We also conduct empirical analysis of Japan's inverse money demand to demonstrate the data-dependency property of the EL-based CI.
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2014-20&r=ets
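The mechanics behind an EL-based CI are easiest to see in the classical iid-mean case (Owen's setting, not the paper's nonstationary extension): minus twice the log-EL ratio is asymptotically chi-squared with one degree of freedom, so the 95% CI is the set of mu with statistic below 3.84. A numpy-only sketch:

```python
import numpy as np

def el_log_ratio(x, mu):
    """-2 * log empirical likelihood ratio for the mean of an iid sample.

    Solves sum_i z_i / (1 + lam * z_i) = 0 for the Lagrange multiplier
    lam by bisection (z_i = x_i - mu), then returns
    2 * sum_i log(1 + lam * z_i).  Classical iid theory (Owen), shown
    here only to illustrate the Wilks-type result the paper extends.
    """
    z = np.asarray(x, dtype=float) - mu
    if z.max() <= 0 or z.min() >= 0:
        return np.inf  # mu lies outside the convex hull of the data
    g = lambda lam: np.sum(z / (1.0 + lam * z))
    lo = -1.0 / z.max() + 1e-10   # weights must stay positive
    hi = -1.0 / z.min() - 1e-10
    for _ in range(200):          # g is strictly decreasing in lam
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * np.sum(np.log1p(lam * z))
```

The statistic is zero at the sample mean and grows as mu moves away, which is the data-dependency property: the CI's shape comes from the sample itself rather than a symmetric plug-in standard error.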
  5. By: YABE, Ryota
    Abstract: This paper considers the conditional sum of squares estimator (CSSE) for the moderate-deviation MA(1) process, in which the distance between the MA(1) parameter and unity is of an order larger than O(T^-1). We show that the asymptotic distribution of the CSSE is normal, even though the process belongs to the local-to-unity class. The convergence rate changes continuously from the invertible order to the noninvertible one. In this sense, the moderate-deviation MA(1) process has a continuous bridge property like the AR process.
    Keywords: Moving average, Noninvertible moving average, Unit root, local to unity, Moderate Deviations, Conditional sum of squares estimation
    Date: 2014–12
    URL: http://d.repec.org/n?u=RePEc:hit:econdp:2014-19&r=ets
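The CSS estimator itself is simple to state: for y_t = e_t + theta * e_{t-1}, condition on e_0 = 0, build residuals recursively, and minimise the sum of squares. A minimal numpy sketch using grid search (the paper's asymptotics, not its computations, are the contribution; the grid search here is just one way to do the minimisation):

```python
import numpy as np

def css_ma1(y, grid=np.linspace(-0.99, 0.99, 199)):
    """Conditional sum of squares (CSS) estimate of theta in the MA(1)
    model y_t = e_t + theta * e_{t-1}, conditioning on e_0 = 0.

    Minimises S(theta) = sum_t e_t(theta)^2, where the residuals follow
    the recursion e_t(theta) = y_t - theta * e_{t-1}(theta).
    """
    def css(theta):
        e_prev, s = 0.0, 0.0
        for yt in y:
            e = yt - theta * e_prev   # residual recursion
            s += e * e
            e_prev = e
        return s
    vals = [css(th) for th in grid]
    return grid[int(np.argmin(vals))]
```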
  6. By: Guillaume Gaetan Martinet (ENSAE Paris Tech, France, and Columbia University, United States.); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands, Department of Quantitative Economics, Complutense University of Madrid, and Institute of Economic Research, Kyoto University.)
    Abstract: Of the two most widely estimated univariate asymmetric conditional volatility models, the exponential GARCH (or EGARCH) specification can capture asymmetry, which refers to the different effects on conditional volatility of positive and negative shocks of equal magnitude, and leverage, which refers to the negative correlation between the returns shocks and subsequent shocks to volatility. However, the statistical properties of the (quasi-) maximum likelihood estimator (QMLE) of the EGARCH parameters are not available under general conditions, but only for special cases under highly restrictive and unverifiable conditions. A limitation in the development of asymptotic properties of the QMLE for EGARCH is the lack of an invertibility condition for the returns shocks underlying the model. It is shown in this paper that the EGARCH model can be derived from a stochastic process, for which the invertibility conditions can be stated simply and explicitly. This will be useful in re-interpreting the existing properties of the QMLE of the EGARCH parameters.
    Keywords: Leverage, Asymmetry, Existence, Stochastic process, Asymptotic properties, Invertibility.
    JEL: C22 C52 C58 G32
    Date: 2014
    URL: http://d.repec.org/n?u=RePEc:ucm:doicae:1428&r=ets
  7. By: Inoue, Atsushi; Jin, Lu; Rossi, Barbara
    Abstract: While forecasting is a common practice in academia, government and business alike, practitioners are often left wondering how to choose the sample for estimating forecasting models. When we forecast inflation in 2014, for example, should we use the last 30 years of data or the last 10 years of data? There is strong evidence of structural changes in economic time series, and the forecasting performance is often quite sensitive to the choice of such window size. In this paper, we develop a novel method for selecting the estimation window size for forecasting. Specifically, we propose to choose the optimal window size that minimizes the forecaster's quadratic loss function, and we prove the asymptotic validity of our approach. Our Monte Carlo experiments show that our method performs quite well under various types of structural changes. When applied to forecasting US real output growth and inflation, the proposed method tends to improve upon conventional methods.
    Keywords: forecasting; GDP growth; inflation; instabilities; structural change
    JEL: C22 C52 C53
    Date: 2014–09
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:10168&r=ets
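The window-selection idea can be illustrated with a deliberately naive stand-in for the paper's optimal rule: score each candidate window size by pseudo-out-of-sample squared forecast error and keep the minimiser. The rolling-mean forecast and the evaluation-sample size below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

def select_window(y, candidates, n_eval=50):
    """Pick an estimation window size by pseudo-out-of-sample evaluation.

    For each candidate window m, forecast each of the last n_eval points
    with the mean of the preceding m observations, record the squared
    errors, and return the m with the smallest average loss.  A toy
    stand-in for the quadratic-loss-minimising rule of the paper.
    """
    y = np.asarray(y, dtype=float)
    T = len(y)
    losses = {}
    for m in candidates:
        errs = [(y[t] - y[t - m:t].mean()) ** 2
                for t in range(T - n_eval, T)]
        losses[m] = np.mean(errs)
    return min(losses, key=losses.get)
```

After a structural break in the mean, short windows exclude stale pre-break data, so this rule correctly prefers them.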
  8. By: Ryo Okui (Institute of Economic Research Kyoto University); Takahide Yanagi (Graduate School of Economics Kyoto University)
    Abstract: This paper proposes the analysis of panel data whose dynamic structure is heterogeneous across individuals. Our aim is to estimate the cross-sectional distributions and/or some distributional features of the heterogeneous mean and autocovariances. We do not assume any specific model for the dynamics. Our proposed method is easy to implement. We first compute the sample mean and autocovariances for each individual and then estimate the parameter of interest based on the empirical distributions of the estimated mean and autocovariances. The asymptotic properties of the proposed estimators are investigated using double asymptotics under which both the cross-sectional sample size (N) and the length of the time series (T) tend to infinity. We prove the functional central limit theorem for the empirical process of the proposed distribution estimator. By using the functional delta method, we also derive the asymptotic distributions of the estimators for various parameters of interest. We show that the distribution estimator exhibits a bias whose order is proportional to 1/√T. In contrast, when the parameter of interest can be written as the expectation of a smooth function of the heterogeneous mean and/or autocovariances, the bias is of order 1/T and can be corrected by the jackknife method. The results of Monte Carlo simulations show that our asymptotic results are informative regarding the finite-sample properties of the estimators. They also demonstrate that the proposed jackknife bias correction is successful.
    Keywords: Panel data; heterogeneity; functional central limit theorem; autocovariance; jackknife; long panel.
    JEL: C13 C14 C23
    Date: 2014–11
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:906&r=ets
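The first step of the two-step procedure described in the abstract is mechanical. A minimal numpy sketch (the second step, the functional asymptotics, and the jackknife correction are of course not shown here):

```python
import numpy as np

def individual_moments(Y, max_lag=1):
    """First step of the approach described above: for an N x T panel Y,
    compute each individual's sample mean and autocovariances up to
    max_lag.  The second step treats the empirical distribution of
    these N vectors as an estimate of the cross-sectional distribution
    of the heterogeneous mean and autocovariances.
    """
    Y = np.asarray(Y, dtype=float)
    N, T = Y.shape
    mu = Y.mean(axis=1)                       # individual sample means
    Z = Y - mu[:, None]
    # gamma[i, k] = (1/T) * sum_t z_{i,t} z_{i,t-k}
    gamma = np.stack([(Z[:, k:] * Z[:, :T - k]).sum(axis=1) / T
                      for k in range(max_lag + 1)], axis=1)
    return mu, gamma                          # shapes (N,), (N, max_lag+1)
```

For instance, `np.mean(gamma[:, 1] / gamma[:, 0])` would estimate the cross-sectional average of the individual lag-1 autocorrelations.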
  9. By: W. Robert Reed (University of Canterbury)
    Abstract: This paper demonstrates that unit root tests can suffer from inflated Type I error rates when data are cointegrated. Results from Monte Carlo simulations show that three commonly used unit root tests – the ADF, Phillips-Perron, and DF-GLS tests – frequently overreject the true null of a unit root for at least one of the cointegrated variables. The findings extend previous research which reports size distortions for unit root tests when the associated error terms are serially correlated (Schwert, 1989; DeJong et al., 1992; Harris, 1992). While the addition to the Dickey-Fuller-type specification of the correct number of lagged differenced (LD) terms can eliminate the size distortion, I demonstrate that determining the correct number of LD terms is unachievable in practice. Standard diagnostics such as testing for serial correlation in the residuals, and using information criteria to compare different lag specifications, are unable to identify the required number of lags. A unique feature of this study is that it includes programs (an Excel spreadsheet and Stata .do files) that allow readers to simulate their own cointegrated data -- using parameters of their own choosing -- to confirm the findings reported in this paper.
    Keywords: Unit root testing, cointegration, DF-GLS test, Augmented Dickey-Fuller test, Phillips-Perron test, simulation
    JEL: C32 C22 C18
    Date: 2014–12–14
    URL: http://d.repec.org/n?u=RePEc:cbt:econwp:14/28&r=ets
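The kind of Monte Carlo exercise described here is easy to reproduce in miniature. The sketch below uses an unaugmented Dickey-Fuller regression and a toy cointegrated system of my own choosing (x a random walk, y = x + AR(1) error), not the paper's VECM design or its Stata/Excel programs:

```python
import numpy as np

def df_tstat(y):
    """Dickey-Fuller t-statistic (constant, no lag augmentation):
    regress dy_t on [1, y_{t-1}] and return the t-ratio on y_{t-1}.
    The 5% critical value with a constant is roughly -2.86.
    """
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

def rejection_rate(n_rep=200, T=100, seed=0):
    """Share of replications in which the DF test rejects the (true)
    unit root null for y in a toy cointegrated pair: x_t a random walk,
    y_t = x_t + u_t with u_t a stationary AR(1).  Because dy then has a
    serially correlated component, the unaugmented test is misspecified
    -- the Schwert-type distortion the paper builds on.
    """
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_rep):
        x = np.cumsum(rng.standard_normal(T))
        u = np.zeros(T)
        for t in range(1, T):
            u[t] = 0.7 * u[t - 1] + rng.standard_normal()
        if df_tstat(x + u) < -2.86:
            rejections += 1
    return rejections / n_rep
```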
  10. By: CARPANTIER, Jean-François (CREA, Université du Luxembourg); DUFAYS, Arnaud (ENSAE-CREST, Paris)
    Abstract: We propose an estimation method that circumvents the path dependence problem existing in Change-Point (CP) and Markov Switching (MS) ARMA models. Our model embeds a sticky infinite hidden Markov-switching structure (sticky IHMM), which makes possible a self-determination of the number of regimes as well as of the specification: CP or MS. Furthermore, CP and MS frameworks usually assume that all the model parameters vary from one regime to another. We relax this restrictive assumption. As illustrated by simulations on moderate samples (300 observations), the sticky IHMM-ARMA algorithm detects which model parameters change over time. Applications to the U.S. GDP growth and the DJIA realized volatility highlight the relevance of estimating different structural breaks for the mean and variance parameters.
    Keywords: Bayesian inference, Markov-switching model, ARMA model, infinite hidden Markov model, Dirichlet Process
    JEL: C11 C15 C22 C58
    Date: 2014–06–11
    URL: http://d.repec.org/n?u=RePEc:cor:louvco:2014014&r=ets
  11. By: Gustavo Fruet Dias (Aarhus University and CREATES); Fotis Papailias (Queen's University Belfast and quantf Research)
    Abstract: A two-stage forecasting approach for long memory time series is introduced. In the first step we estimate the fractional exponent and, applying the fractional differencing operator, we obtain the underlying weakly dependent series. In the second step, we perform the multi-step ahead forecasts for the weakly dependent series and obtain their long memory counterparts by applying the fractional cumulation operator. The methodology applies to stationary and nonstationary cases. Simulations and an application to seven time series provide evidence that the new methodology is more robust to structural change and yields good forecasting results.
    Keywords: Forecasting, Spurious Long Memory, Structural Change, Local Whittle
    JEL: C22 C53
    Date: 2014–12–15
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-55&r=ets
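The fractional differencing and cumulation operators at the heart of the two-stage approach are short to implement via the binomial expansion of (1-L)^d. A numpy sketch (the paper's first stage estimates d, e.g. by local Whittle, which is not shown; cumulation is differencing with -d):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Binomial expansion weights of (1 - L)^d:
    pi_0 = 1,  pi_k = pi_{k-1} * (k - 1 - d) / k.
    """
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply (1 - L)^d to x, treating pre-sample values as zero.
    Use a negative d to apply the fractional cumulation operator.
    """
    x = np.asarray(x, dtype=float)
    w = frac_diff_weights(d, len(x))
    return np.array([np.dot(w[:t + 1], x[t::-1]) for t in range(len(x))])
```

Differencing by d and then cumulating by -d recovers the original series exactly (the two truncated weight sequences convolve to the identity), which is what lets the second-stage forecasts of the weakly dependent series be mapped back to long memory levels.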
  12. By: Kerstin Gärtner (Vienna University); Mark Podolskij (Aarhus University and CREATES)
    Abstract: In this paper we present some new asymptotic results for high frequency statistics of Brownian semi-stationary (BSS) processes. More precisely, we will show that singularities in the weight function, which is one of the ingredients of a BSS process, may lead to non-standard limits of the realised quadratic variation. In this case the limiting process is a convex combination of shifted integrals of the intermittency function. Furthermore, we will demonstrate the corresponding stable central limit theorem. Finally, we apply the probabilistic theory to study the asymptotic properties of the realized ratio statistics, which estimates the smoothness parameter of a BSS process.
    Keywords: Brownian semi-stationary processes, high frequency data, limit theorems, stable convergence
    JEL: C10 C13 C14
    Date: 2014–12–10
    URL: http://d.repec.org/n?u=RePEc:aah:create:2014-50&r=ets

This nep-ets issue is ©2014 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.