nep-ets New Economics Papers
on Econometric Time Series
Issue of 2015‒07‒04
eleven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Level Shifts and Long Memory: a State Space Approach By Davide Delle Monache; Stefano Grassi; Paolo Santucci
  2. Impact of non-stationarity on estimating and modeling empirical copulas of daily stock returns By Marcel Wollschläger; Rudi Schäfer
  3. Detrended fluctuation analysis made flexible to detect range of cross-correlated fluctuations By Jaroslaw Kwapien; Pawel Oswiecimka; Stanislaw Drozdz
  4. Uniform Convergence Rates over Maximal Domains in Structural Nonparametric Cointegrating Regression By James A. Duffy
  5. Robust Forecast Comparison By Sainan Jin; Valentina Corradi; Norman Swanson
  6. A Comparative Study of Volatility Breaks By Grote, Claudia; Bertram, Philip
  7. New Fractional Dickey and Fuller Test By Bensalma, Ahmed
  8. Testing Mean Stability of Heteroskedastic Time Series By Violetta Dalla; Liudas Giraitis; Peter C. B. Phillips
  9. Business Cycles, Trend Elimination, and the HP Filter By Peter C. B. Phillips; Sainan Jin
  10. Pitfalls and Possibilities in Predictive Regression By Peter C. B. Phillips
  11. Revisiting the transitional dynamics of business-cycle phases with mixed frequency data By Bessec, Marie

  1. By: Davide Delle Monache; Stefano Grassi; Paolo Santucci
    Abstract: Short memory models contaminated by level shifts have long-memory features similar to those of fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows us to disentangle the level shifts from the fractionally integrated component. Estimation is carried out within a state-space framework and yields a robust estimate of the fractional integration parameter even in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. Monte Carlo simulations show that this approach produces unbiased estimates of the memory parameter when shifts in the mean, or other slowly varying trends, are present in the data. As a result, the subsequent robust version of the KPSS test for the presence of level shifts has proper size and by far the highest power among existing tests. Finally, we illustrate the usefulness of the proposed approach on financial data, such as daily bipower variation and turnover.
    Keywords: Long Memory; Fractional Integration; Level Shifts; State-Space methods; KPSS test
    JEL: C10 C11 C22 C80
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:ukc:ukcedp:1511&r=ets
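    As a rough illustration of the problem motivating this paper, the following Python sketch simulates a short-memory series contaminated by occasional level shifts and estimates the memory parameter with a simple log-periodogram (GPH) regression; the shift probability, shift size, and bandwidth are arbitrary choices for the example, and the snippet does not implement the authors' state-space estimator or their robust KPSS procedure.
```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Short-memory noise contaminated by rare random level shifts
# (an illustrative DGP, not the paper's design).
eps = rng.standard_normal(n)
level = np.cumsum(rng.binomial(1, 0.005, n) * rng.normal(0.0, 2.0, n))
x = eps + level

def gph_d(x, m=None):
    """Log-periodogram (GPH) estimate of the memory parameter d."""
    n = len(x)
    m = m or int(n ** 0.5)
    freqs = 2 * np.pi * np.arange(1, m + 1) / n
    dft = np.fft.fft(x - x.mean())[1:m + 1]
    I = np.abs(dft) ** 2 / (2 * np.pi * n)          # periodogram ordinates
    reg = -np.log(4 * np.sin(freqs / 2) ** 2)       # GPH regressor
    return np.polyfit(reg, np.log(I), 1)[0]         # slope = estimated d

# Although the true d is zero, the estimate is typically well above zero,
# i.e. the level shifts mimic long memory.
print("estimated d:", gph_d(x))
```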
  2. By: Marcel Wollschläger; Rudi Schäfer
    Abstract: All too often, measuring statistical dependencies between financial time series is reduced to a linear correlation coefficient. However, this may not capture all facets of reality. We study empirical dependencies of daily stock returns by their pairwise copulas. In particular, we investigate to what extent the non-stationarity of financial time series affects both the estimation and the modeling of empirical copulas. We estimate empirical copulas both from the original, non-stationary return time series and from stationary, locally normalized ones. Thereby we are able to explore the empirical dependence structure on two different scales: a global and a local one. Additionally, the asymmetry of the empirical copulas is emphasized as a fundamental characteristic. We compare our empirical findings with a single Gaussian copula, with a correlation-weighted average of Gaussian copulas, with the K-copula directly addressing the non-stationarity of dependencies as a model parameter, and with the skewed Student's t-copula. The K-copula covers the empirical dependence structure on the local scale most adequately, whereas the skewed Student's t-copula best captures the asymmetry of the empirical copula on the global scale.
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1506.08054&r=ets
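    A minimal sketch of the two ingredients used here: local normalization of returns (to obtain an approximately stationary series) and the rank transform that yields pseudo-observations from the empirical copula. The window length, the simulated data standing in for real stock returns, and the 10% tail threshold are assumptions made for the example only.
```python
import numpy as np

def local_normalize(r, w=25):
    """Subtract a local mean and divide by a local standard deviation,
    one simple way of removing non-stationarity (window length w is an
    assumption of this example)."""
    r = np.asarray(r, float)
    out = np.empty(len(r) - w)
    for t in range(w, len(r)):
        win = r[t - w:t]
        out[t - w] = (r[t] - win.mean()) / win.std(ddof=1)
    return out

def copula_pseudo_obs(x, y):
    """Normalized ranks (pseudo-observations) of two series, i.e. a sample
    from their empirical copula."""
    n = len(x)
    u = np.argsort(np.argsort(x)) / (n - 1.0)
    v = np.argsort(np.argsort(y)) / (n - 1.0)
    return u, v

# Simulated correlated 'returns' stand in for real stock data.
rng = np.random.default_rng(1)
z = rng.multivariate_normal([0, 0], [[1.0, 0.5], [0.5, 1.0]], size=1500)
u, v = copula_pseudo_obs(local_normalize(z[:, 0]), local_normalize(z[:, 1]))
print("joint lower-tail frequency:", np.mean((u < 0.1) & (v < 0.1)))
```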
  3. By: Jaroslaw Kwapien; Pawel Oswiecimka; Stanislaw Drozdz
    Abstract: The detrended cross-correlation coefficient $\rho_{\rm DCCA}$ has recently been proposed to quantify the strength of cross-correlations on different temporal scales in bivariate, non-stationary time series. It is based on the detrended cross-correlation and detrended fluctuation analyses (DCCA and DFA, respectively) and can be viewed as an analogue of the Pearson coefficient in the case of the fluctuation analysis. The coefficient $\rho_{\rm DCCA}$ works well in many practical situations, but by construction its applicability is limited to detecting whether two signals are generally cross-correlated, without the possibility of obtaining information on the amplitude of the fluctuations that are responsible for those cross-correlations. In order to introduce such flexibility, here we propose an extension of $\rho_{\rm DCCA}$ that exploits the multifractal versions of DFA and DCCA: MFDFA and MFCCA, respectively. The resulting coefficient $\rho_q$ is able not only to quantify the strength of correlations, but also to identify the range of detrended fluctuation amplitudes that are correlated in the two signals under study. We show how the coefficient $\rho_q$ works in practical situations by applying it to stochastic time series representing processes with long memory: autoregressive and multiplicative ones. Such processes are often used to model signals recorded from complex systems and complex physical phenomena like turbulence, so we are convinced that this new measure can successfully be applied in time series analysis. In particular, we present an example of such an application to highly complex empirical data from financial markets. The present formulation can straightforwardly be extended to multivariate data in terms of the $q$-dependent counterpart of the correlation matrices and then to the network representation.
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1506.08692&r=ets
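    The following sketch computes the standard coefficient $\rho_{\rm DCCA}$ (effectively the q = 2 special case) at a single scale; the paper's $\rho_q$ generalizes this through MFDFA/MFCCA and is not implemented here. The scale, detrending order, and toy data are assumptions for the example.
```python
import numpy as np

def rho_dcca(x, y, s, order=1):
    """Detrended cross-correlation coefficient at scale s (polynomial
    detrending of the integrated profiles within non-overlapping boxes)."""
    X = np.cumsum(x - np.mean(x))
    Y = np.cumsum(y - np.mean(y))
    t = np.arange(s)
    f_xy, f_xx, f_yy = [], [], []
    for i in range(len(X) // s):
        seg = slice(i * s, (i + 1) * s)
        rx = X[seg] - np.polyval(np.polyfit(t, X[seg], order), t)
        ry = Y[seg] - np.polyval(np.polyfit(t, Y[seg], order), t)
        f_xy.append(np.mean(rx * ry))
        f_xx.append(np.mean(rx ** 2))
        f_yy.append(np.mean(ry ** 2))
    return np.mean(f_xy) / np.sqrt(np.mean(f_xx) * np.mean(f_yy))

# Two toy signals sharing a common component (their correlation is about 0.5).
rng = np.random.default_rng(2)
common = rng.standard_normal(4096)
x = common + rng.standard_normal(4096)
y = common + rng.standard_normal(4096)
print(rho_dcca(x, y, s=64))
```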
  4. By: James A. Duffy (Institute for New Economic Thinking, Oxford Martin School, and Economics Department, University of Oxford)
    Abstract: This paper presents uniform convergence rates for kernel regression estimators, in the setting of a structural nonlinear cointegrating regression model. We generalise the existing literature in three ways. First, the domain to which these rates apply is much wider than has been previously considered, and can be chosen so as to contain as large a fraction of the sample as desired in the limit. Second, our results allow the regression disturbance to be serially correlated, and cross-correlated with the regressor; previous work on this problem (of obtaining uniform rates) has been confined entirely to the setting of an exogenous regressor. Third, we permit the bandwidth to be data-dependent, requiring only that it satisfy certain weak asymptotic shrinkage conditions. Our assumptions on the regressor process are consistent with a very broad range of departures from the standard unit root autoregressive model, allowing the regressor to be fractionally integrated, and to have an infinite variance (and even infinite lower-order moments).
    Date: 2015–05–05
    URL: http://d.repec.org/n?u=RePEc:nuf:econwp:1503&r=ets
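    To fix ideas, the sketch below fits a Nadaraya-Watson kernel regression to a toy nonlinear cointegrating model with a random-walk regressor and evaluates it on a grid covering most of the sample range; the regression function, fixed bandwidth, and exogenous disturbance are illustrative choices, and the snippet says nothing about the paper's uniform convergence theory.
```python
import numpy as np

# Toy structural nonparametric cointegrating regression y_t = f(x_t) + u_t
# with an I(1) regressor; the disturbance is exogenous here for simplicity.
rng = np.random.default_rng(3)
n = 5000
x = np.cumsum(rng.standard_normal(n))      # random-walk (I(1)) regressor
f = lambda v: np.sin(v / 5.0)
y = f(x) + rng.standard_normal(n)

def nw(x0, x, y, h):
    """Nadaraya-Watson estimate of f(x0) with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

# Evaluate on a grid covering most of the sample range of the regressor.
grid = np.linspace(np.quantile(x, 0.05), np.quantile(x, 0.95), 9)
h = 1.0                                     # illustrative fixed bandwidth
est = np.array([nw(g, x, y, h) for g in grid])
print("max error on the grid:", np.max(np.abs(est - f(grid))))
```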
  5. By: Sainan Jin (Singapore Management University); Valentina Corradi (University of Surrey); Norman Swanson (Rutgers University)
    Abstract: Forecast accuracy is typically measured in terms of a given loss function. However, as a consequence of the use of misspecified models in multiple model comparisons, relative forecast rankings are loss function dependent. This paper addresses this issue by using a novel criterion for forecast evaluation which is based on the entire distribution of forecast errors. We introduce the concepts of general-loss (GL) forecast superiority and convex-loss (CL) forecast superiority, and we establish a mapping between GL (CL) superiority and first (second) order stochastic dominance. This allows us to develop a forecast evaluation procedure based on an out-of-sample generalization of the tests introduced by Linton, Maasoumi and Whang (2005). The asymptotic null distributions of our test statistics are nonstandard, and resampling procedures are used to obtain the critical values. Additionally, the tests are consistent and have nontrivial local power under a sequence of local alternatives. In addition to the stationary case, we outline theory extending our tests to the case of heterogeneity induced by distributional change over time. Monte Carlo simulations suggest that the tests perform reasonably well in finite samples; and an application to exchange rate data indicates that our tests can help identify superior forecasting models, regardless of loss function.
    Keywords: convex loss function, empirical processes, forecast superiority, general loss function
    JEL: C12 C22
    Date: 2015–05–13
    URL: http://d.repec.org/n?u=RePEc:rut:rutres:201502&r=ets
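    The core idea of mapping forecast superiority into stochastic dominance of forecast-error distributions can be illustrated with a naive sample analogue: check whether one model's absolute errors are stochastically smaller than the other's at every threshold. The sketch below does only this raw comparison; the paper's tests additionally handle estimation effects and obtain critical values by resampling.
```python
import numpy as np

def dominance_gap(e1, e2, grid_size=200):
    """Largest sample violation of first-order dominance of |e1| over |e2|:
    sup_t [F_{|e2|}(t) - F_{|e1|}(t)].  Values near zero are consistent with
    model 1's absolute errors being stochastically smaller everywhere."""
    a1, a2 = np.abs(e1), np.abs(e2)
    grid = np.linspace(0.0, max(a1.max(), a2.max()), grid_size)
    F1 = np.array([np.mean(a1 <= t) for t in grid])
    F2 = np.array([np.mean(a2 <= t) for t in grid])
    return np.max(F2 - F1)

rng = np.random.default_rng(4)
e_good = rng.normal(0.0, 1.0, 1000)   # forecast errors of the better model
e_bad = rng.normal(0.0, 1.5, 1000)
print(dominance_gap(e_good, e_bad))   # close to zero
print(dominance_gap(e_bad, e_good))   # clearly positive
```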
  6. By: Grote, Claudia; Bertram, Philip
    Abstract: In this paper we evaluate the performance of several structural break tests under various DGPs. Specifically, we examine the size and power properties of CUSUM-based, LM and Wald volatility break tests. In a simulation study we derive the properties of the tests under shifts in the unconditional and conditional variance as well as for smooth shifts in the volatility process. Our results indicate that Wald tests have more power to detect a change in volatility than CUSUM and LM tests. This, however, goes along with the disadvantage of being slightly oversized. We further show that, with large outliers in the data, the tests may exhibit non-monotonic power functions because the long-run variance of the squared return process is no longer finite. In an empirical example we determine the number and timing of volatility breaks in four equity and three exchange rate series. We find that in some situations the outcomes of the tests may vary substantially. Further, we find fewer volatility breaks in the currency series than in the equity series.
    Keywords: Structural Breaks, Variance Shifts, Non-Monotonic Power
    JEL: C22 C52 C53
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:han:dpaper:dp-558&r=ets
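    A minimal example of a CUSUM-of-squares volatility break statistic in the spirit of Inclán and Tiao (1994), applied to simulated returns with a variance shift halfway through the sample; the paper's comparison also covers LM and Wald tests and long-run variance corrections, none of which are implemented here.
```python
import numpy as np

def cusum_of_squares(r):
    """CUSUM-of-squares statistic sqrt(n/2) * max_k |C_k/C_n - k/n|,
    where C_k is the cumulative sum of squared returns."""
    e2 = np.asarray(r, float) ** 2
    n = len(e2)
    C = np.cumsum(e2)
    D = C / C[-1] - np.arange(1, n + 1) / n
    return np.sqrt(n / 2.0) * np.max(np.abs(D))

rng = np.random.default_rng(5)
r = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.0, 500)])
# With a variance shift at mid-sample the statistic is far above the
# asymptotic 5% critical value of roughly 1.36.
print(cusum_of_squares(r))
```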
  7. By: Bensalma, Ahmed
    Abstract: The aim of this paper is motivated by the following question: “If a series were best characterized by a fractional process, would a researcher be able to detect that fact by using the conventional Dickey-Fuller (1979) test?” To answer this question, within a simple framework, we propose a new fractional Dickey-Fuller (F-DF) test that differs from the test of Dolado, Gonzalo and Mayoral (2002).
    Keywords: Fractional unit root, Dickey-Fuller Test, Fractional integration parameter.
    JEL: C1 C22 C4 C51 C58 E2 E5
    Date: 2015–05–27
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:65282&r=ets
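    The motivating question can be made concrete with a small simulation: generate a stationary fractionally integrated series (here d = 0.3) and run the simple Dickey-Fuller regression on it. The unit root is typically rejected, but the outcome says nothing about the fractional nature of the process. This is only an illustration of the question; it is not the F-DF test proposed in the paper.
```python
import numpy as np

def frac_integrate(eps, d):
    """Truncated fractional cumulation: x_t = sum_k psi_k * eps_{t-k},
    with the usual (1-L)^{-d} expansion weights psi_k."""
    n = len(eps)
    psi = np.empty(n)
    psi[0] = 1.0
    for k in range(1, n):
        psi[k] = psi[k - 1] * (k - 1 + d) / k
    return np.array([np.dot(psi[:t + 1], eps[t::-1]) for t in range(n)])

def df_tstat(x):
    """t-statistic of rho in the simple Dickey-Fuller regression
    delta x_t = rho * x_{t-1} + e_t (no constant, no lags)."""
    dx, lx = np.diff(x), x[:-1]
    rho = np.dot(lx, dx) / np.dot(lx, lx)
    resid = dx - rho * lx
    se = np.sqrt(resid.var(ddof=1) / np.dot(lx, lx))
    return rho / se

rng = np.random.default_rng(6)
x = frac_integrate(rng.standard_normal(500), d=0.3)
# The unit root is usually rejected (compare with the ~ -1.95 critical value),
# yet nothing in the outcome points to the fractional value d = 0.3.
print(df_tstat(x))
```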
  8. By: Violetta Dalla (National and Kapodistrian University of Athens); Liudas Giraitis (Queen Mary, London University); Peter C. B. Phillips (Cowles Foundation, Yale University)
    Abstract: Time series models are often fitted to the data without preliminary checks for stability of the mean and variance, conditions that may not hold in much economic and financial data, particularly over long periods. Ignoring such shifts may result in fitting models with spurious dynamics that lead to unsupported and controversial conclusions about time dependence, causality, and the effects of unanticipated shocks. In spite of what may seem like obvious differences between a time series of independent variates with changing variance and a stationary conditionally heteroskedastic (GARCH) process, such processes may be hard to distinguish in applied work using basic time series diagnostic tools. We develop and study some practical and easily implemented statistical procedures to test the mean and variance stability of uncorrelated and serially dependent time series. Application of the new methods to analyze the volatility properties of stock market returns leads to some surprising findings concerning the advantages of modeling time-varying changes in unconditional variance.
    Keywords: Heteroskedasticity, KPSS test, Mean stability, Variance stability, VS test
    JEL: C22 C23
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2006&r=ets
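    A rough sketch of a KPSS-type stability check: the statistic below is the standard KPSS statistic with a Bartlett long-run variance estimate, applied first to a simulated return series (mean stability) and then to its squares (variance stability). The bandwidth rule and the simulated variance shift are assumptions for the example; the paper's VS test and its treatment of serial dependence are not reproduced here.
```python
import numpy as np

def kpss_stat(x, bandwidth=None):
    """KPSS statistic for level stationarity with a Bartlett-kernel
    long-run variance estimate (5% critical value roughly 0.463)."""
    x = np.asarray(x, float)
    n = len(x)
    e = x - x.mean()
    S = np.cumsum(e)
    q = bandwidth if bandwidth is not None else int(4 * (n / 100.0) ** 0.25)
    lrv = np.dot(e, e) / n
    for l in range(1, q + 1):
        lrv += 2 * (1 - l / (q + 1.0)) * np.dot(e[l:], e[:-l]) / n
    return np.sum(S ** 2) / (n ** 2 * lrv)

rng = np.random.default_rng(7)
r = np.concatenate([rng.normal(0, 1.0, 500), rng.normal(0, 2.0, 500)])
print("level of the series:  ", kpss_stat(r))        # mean looks stable
print("squares of the series:", kpss_stat(r ** 2))   # variance shift is flagged
```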
  9. By: Peter C. B. Phillips (Cowles Foundation, Yale University); Sainan Jin (Singapore Management University)
    Abstract: We analyze trend elimination methods and business cycle estimation by data filtering of the type introduced by Whittaker (1923) and popularized in economics in a particular form by Hodrick and Prescott (1980/1997; HP). A limit theory is developed for the HP filter for various classes of stochastic trend, trend break, and trend stationary data. Properties of the filtered series are shown to depend closely on the choice of the smoothing parameter (lambda). For instance, when lambda = O(n^4), where n is the sample size, and the HP filter is applied to an I(1) process, the filter does not remove the stochastic trend in the limit as n approaches infinity. Instead, the filter produces a smoothed Gaussian limit process that is differentiable to the fourth order. The residual 'cyclical' process has the random wandering non-differentiable characteristics of Brownian motion, thereby explaining the frequently observed 'spurious cycle' effect of the HP filter. On the other hand, when lambda = o(n), the filter reproduces the limit Brownian motion and eliminates the stochastic trend, giving a zero 'cyclical' process. Simulations reveal that the lambda = O(n^4) limit theory provides a good approximation to the actual HP filter for sample sizes common in practical work. When it is used as a trend removal device, the HP filter therefore typically fails to eliminate stochastic trends, contrary to what is now standard belief in applied macroeconomics. The findings are related to recent public debates about the long run effects of the global financial crisis.
    Keywords: Detrending, Graduation, Hodrick Prescott filter, Integrated process, Limit theory, Smoothing, Trend break, Whittaker filter
    JEL: C32
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2005&r=ets
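    The spurious-cycle effect is easy to see directly. The sketch below implements the HP filter as the penalized least-squares solve (I + lambda D'D) tau = y and applies it, with the conventional quarterly lambda = 1600, to a pure random walk that contains no cycle by construction; the residual 'cycle' nevertheless comes out large and highly autocorrelated. The dense solve and the sample size are choices made only for illustration.
```python
import numpy as np

def hp_filter(y, lam):
    """HP trend as the solution of (I + lam * D'D) tau = y, where D is the
    second-difference operator (a dense solve, for illustration only)."""
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    tau = np.linalg.solve(np.eye(n) + lam * D.T @ D, y)
    return tau, y - tau

rng = np.random.default_rng(8)
y = np.cumsum(rng.standard_normal(400))      # a pure I(1) series, no cycle
trend, cycle = hp_filter(y, lam=1600.0)      # conventional quarterly lambda
# The residual 'cycle' is sizeable and strongly autocorrelated even though
# the data contain no cyclical component.
print(np.std(cycle), np.corrcoef(cycle[1:], cycle[:-1])[0, 1])
```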
  10. By: Peter C. B. Phillips (Cowles Foundation, Yale University)
    Abstract: Financial theory and econometric methodology both struggle in formulating models that are logically sound in reconciling short run martingale behavior for financial assets with predictable long run behavior, leaving much of the research to be empirically driven. The present paper overviews recent contributions to this subject, focusing on the main pitfalls in conducting predictive regression and on some of the possibilities offered by modern econometric methods. The latter options include indirect inference and techniques of endogenous instrumentation that use convenient temporal transforms of persistent regressors. Some additional suggestions are made for bias elimination, quantile crossing amelioration, and control of predictive model misspecification.
    Keywords: Bias, Endogenous instrumentation, Indirect inference, IVX estimation, Local unit roots, Mild integration, Prediction, Quantile crossing, Unit roots, Zero coverage probability
    JEL: C22 C23
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:2003&r=ets
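    One of the central pitfalls, the small-sample bias that arises when the predictor is highly persistent and its innovations are correlated with the return shocks, can be reproduced with a short Monte Carlo like the one below; the persistence, correlation, and sample size are arbitrary illustrative values, and the remedies the paper surveys (indirect inference, IVX-type instrumentation) are not implemented here.
```python
import numpy as np

# Predictive regression y_t = beta * x_{t-1} + u_t with beta = 0, a highly
# persistent predictor, and innovations negatively correlated with u_t.
rng = np.random.default_rng(9)
n, reps, rho, corr = 200, 2000, 0.98, -0.9
betas = []
for _ in range(reps):
    shocks = rng.multivariate_normal([0, 0], [[1, corr], [corr, 1]], size=n)
    u, e = shocks[:, 0], shocks[:, 1]
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + e[t]
    y = u[1:]                                 # true predictive coefficient is zero
    xl = x[:-1] - x[:-1].mean()
    betas.append(np.dot(xl, y - y.mean()) / np.dot(xl, xl))
# OLS is noticeably biased away from zero in samples of this size.
print("mean OLS estimate of beta (true value 0):", np.mean(betas))
```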
  11. By: Bessec, Marie
    Abstract: This paper introduces a Markov-switching model where transition probabilities depend on higher frequency indicators and their lags, through polynomial weighting schemes. The MSV-MIDAS model is estimated via maximum likelihood methods. The estimation relies on a slightly modified version of Hamilton’s recursive filter. We use Monte Carlo simulations to assess the robustness of the estimation procedure and related test statistics. The results show that ML provides accurate estimates, but they suggest some caution in the tests on the parameters involved in the transition probabilities. We apply this new model to the detection and forecast of business cycle turning points. We properly detect recessions in the United States and the United Kingdom by exploiting the link between GDP growth and higher frequency variables from financial and energy markets. The term spread is a particularly useful indicator to predict recessions in the United States, while stock returns have the strongest explanatory power around British turning points.
    Keywords: Markov-Switching; mixed frequency data; business cycles
    JEL: C22 E32 E37
    Date: 2015–06
    URL: http://d.repec.org/n?u=RePEc:dau:papers:123456789/15246&r=ets
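    The filtering step behind such models can be sketched with a two-state Hamilton filter whose transition probabilities vary with an exogenous indicator through a logistic link; in the sketch below the scalar z_t stands in for a MIDAS-weighted high-frequency variable, and all parameter values and functional forms are assumptions for the example rather than the paper's MSV-MIDAS specification.
```python
import numpy as np

def hamilton_filter_tvp(y, z, mu, sigma, a, b):
    """Two-state Hamilton filter in which the staying probabilities are
    p00_t = logistic(a[0] + b[0] * z_t) and p11_t = logistic(a[1] + b[1] * z_t).
    Returns the filtered probability of state 1 and the log-likelihood."""
    logistic = lambda v: 1.0 / (1.0 + np.exp(-v))
    xi = np.array([0.5, 0.5])                # initial state probabilities
    loglik, probs = 0.0, []
    for t in range(len(y)):
        p00 = logistic(a[0] + b[0] * z[t])
        p11 = logistic(a[1] + b[1] * z[t])
        P = np.array([[p00, 1 - p00],        # row i: distribution of s_t given s_{t-1} = i
                      [1 - p11, p11]])
        pred = P.T @ xi                      # one-step-ahead state probabilities
        dens = np.exp(-0.5 * ((y[t] - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
        joint = pred * dens
        lik = joint.sum()
        loglik += np.log(lik)
        xi = joint / lik                     # filtered state probabilities
        probs.append(xi[1])
    return np.array(probs), loglik

# Toy run with made-up parameters; z stands in for a MIDAS-weighted indicator.
rng = np.random.default_rng(10)
z = rng.standard_normal(300)
y = rng.normal(0.5, 1.0, 300)
probs, ll = hamilton_filter_tvp(y, z, mu=np.array([0.8, -0.5]),
                                sigma=np.array([0.5, 1.0]),
                                a=np.array([2.0, 1.0]), b=np.array([0.5, -0.5]))
print(ll, probs[-3:])
```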

This nep-ets issue is ©2015 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.