nep-ets New Economics Papers
on Econometric Time Series
Issue of 2013‒04‒06
nine papers chosen by
Yong Yin
SUNY at Buffalo

  1. Nonlinear Dynamics and Recurrence Plots for Detecting Financial Crisis By Peter Martey Addo; Monica Billio; Dominique Guegan
  2. Forecasting with Non-spurious Factors in U.S. Macroeconomic Time Series By Yohei Yamamoto
  3. Modelling for the Wavelet Coefficients of ARFIMA Processes By Kei Nanamiya
  4. Let's get LADE: robust estimation of semiparametric multiplicative volatility models By Bonsoo Koo; Oliver Linton
  5. Testing for Cointegration in the Presence of Moving Average Errors By Mallory, M.; Lence, Sergio H.
  6. Ten Things You Should Know About DCC By Massimiliano Caporin; Michael McAleer
  7. Granger-Causal analysis of conditional mean and volatility models. By WOŹNIAK, Tomasz
  8. On smoothing macroeconomic time series using HP and modified HP filter By Choudhary, Ali; Hanif, Nadim; Iqbal, Javed
  9. On Size and Power of Heteroscedasticity and Autocorrelation Robust Tests By Preinerstorfer, David; Pötscher, Benedikt M.

  1. By: Peter Martey Addo (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne, Università Ca' Foscari of Venice - Department of Economics, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris); Monica Billio (Università Ca' Foscari of Venice - Department of Economics); Dominique Guegan (CES - Centre d'économie de la Sorbonne - CNRS : UMR8174 - Université Paris I - Panthéon-Sorbonne, EEP-PSE - Ecole d'Économie de Paris - Paris School of Economics - Ecole d'Économie de Paris)
    Abstract: Identification of financial bubbles and crises is a topic of major concern, since it is important to prevent collapses that can severely impact nations and economies. Our analysis deals with the use of the recently proposed "delay vector variance" (DVV) method, which examines the local predictability of a signal in phase space to detect the presence of determinism and nonlinearity in a time series. Optimal embedding parameters used in the DVV analysis are obtained via a differential-entropy-based method using wavelet-based surrogates. We exploit the concept of recurrence plots to study the stock market, to locate hidden patterns and non-stationarity, and to examine the nature of these plots during episodes of financial crisis. In particular, the recurrence plots are employed to detect and characterize financial cycles. A comprehensive analysis of the feasibility of this approach is provided. We show that our methodology is useful in the diagnosis and detection of financial bubbles, which have significantly contributed to the economic upheavals of the past few decades.
    Keywords: Nonlinearity analysis; surrogates; Delay vector variance (DVV) method; wavelets; financial bubbles; embedding parameters; recurrence plots
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:hal:cesptp:halshs-00803450&r=ets
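    As a rough illustration of the recurrence-plot construction summarized above (a minimal NumPy sketch, not the authors' code): the embedding dimension, delay and threshold eps below are placeholder choices, whereas the paper obtains the embedding parameters from a differential-entropy criterion with wavelet-based surrogates.

        import numpy as np

        def embed(x, dim=3, tau=1):
            # Time-delay embedding: row t is (x_t, x_{t+tau}, ..., x_{t+(dim-1)tau}).
            n = len(x) - (dim - 1) * tau
            return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

        def recurrence_plot(x, dim=3, tau=1, eps=None):
            # Binary recurrence matrix: R[i, j] = 1 iff ||v_i - v_j|| <= eps.
            v = embed(np.asarray(x, dtype=float), dim, tau)
            d = np.linalg.norm(v[:, None, :] - v[None, :, :], axis=-1)
            if eps is None:
                eps = 0.1 * d.max()  # ad-hoc threshold; applications tune this
            return (d <= eps).astype(int)

    In a crisis application, one would compute R for a returns series and look for disruptions of the diagonal-line structure around bubble episodes.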
  2. By: Yohei Yamamoto
    Abstract: Time instability in factor loadings can induce an overfitting problem in forecasting analyses, since structural change in factor loadings inflates the number of principal components and thus produces spurious factors. This paper proposes an algorithm to estimate non-spurious factors by identifying the set of observations with stable factor loadings, based on the recursive procedure suggested by Inoue and Rossi (2011). I find that 51 out of the 132 U.S. macroeconomic time series of Stock and Watson (2005) have stable factor loadings. Although crude principal components provide eight or more factors, there are only one or two non-spurious factors. Forecasts using the non-spurious factors show significantly better out-of-sample performance.
    Keywords: dynamic factor model, principal components, structural change, spurious factors, out-of-sample forecasts, overfitting
    JEL: C12 C38 E17
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd12-280&r=ets
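    For readers unfamiliar with the principal-components step underlying the paper, here is a minimal sketch of static PCA factor extraction from a T x N panel; the recursive stable-loading identification of Inoue and Rossi (2011) used in the paper is not reproduced.

        import numpy as np

        def pca_factors(X, k):
            # Standardise each series, then take the first k principal components.
            Z = (X - X.mean(axis=0)) / X.std(axis=0)
            U, s, Vt = np.linalg.svd(Z, full_matrices=False)
            T = Z.shape[0]
            F = np.sqrt(T) * U[:, :k]   # T x k factor estimates (F'F / T = I)
            L = Z.T @ F / T             # N x k loading estimates
            return F, L

    Under time-varying loadings, applying this to the full sample inflates the apparent number of factors (the spurious-factor problem); the paper's remedy is to restrict estimation to the subsample with stable loadings.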
  3. By: Kei Nanamiya
    Abstract: We consider a model for the discrete nonboundary wavelet coefficients of ARFIMA processes. Although many authors have demonstrated the utility of the wavelet transform for long-range dependent processes in the semiparametric literature, there have been few studies in a parametric setting. In this paper, we restrict attention to Daubechies wavelet filters in order to make the form of the (general) spectral density function of these coefficients explicit.
    Keywords: discrete wavelet transform, long memory process, spectral density function
    JEL: C22
    Date: 2013–02
    URL: http://d.repec.org/n?u=RePEc:hst:ghsdps:gd12-281&r=ets
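    A small sketch of the objects involved, assuming the PyWavelets package (pywt) is available: simulate an ARFIMA(0, d, 0) series from its truncated MA(infinity) representation and inspect how the variance of the Daubechies detail coefficients grows with the level, roughly like 2^(2d*level) under long memory.

        import numpy as np
        from scipy.special import gammaln
        import pywt  # PyWavelets, assumed installed

        def arfima0d0(n, d, rng):
            # MA(inf) weights psi_j = Gamma(j + d) / (Gamma(d) Gamma(j + 1)), truncated at n.
            j = np.arange(n)
            psi = np.exp(gammaln(j + d) - gammaln(d) - gammaln(j + 1))
            e = rng.standard_normal(2 * n)
            return np.convolve(e, psi)[n:2 * n]

        rng = np.random.default_rng(0)
        x = arfima0d0(2048, d=0.3, rng=rng)
        coeffs = pywt.wavedec(x, "db4", level=5)  # Daubechies-4 DWT
        for level, c in enumerate(coeffs[:0:-1], start=1):  # finest to coarsest detail
            print(level, c.var())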
  4. By: Bonsoo Koo; Oliver Linton (Institute for Fiscal Studies and Cambridge University)
    Abstract: We investigate a model that connects slowly time-varying unconditional long-run volatility with short-run conditional volatility, the latter represented as a semi-strong GARCH(1,1) process with heavy-tailed errors. We focus on robust estimation of both long-run and short-run volatilities. Our estimation is semiparametric, since the long-run volatility is left totally unspecified whereas the short-run conditional volatility is a parametric semi-strong GARCH(1,1) process. We propose robust estimation methods for both nonstationary and strictly stationary GARCH parameters in the presence of a nonparametric long-run volatility function. Our estimation is based on a two-step LAD procedure. We establish the relevant asymptotic theory of the proposed estimators. Numerical results lend support to our theoretical results.
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:11/13&r=ets
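    A heavily simplified sketch in the spirit of the abstract's two-step idea (not the authors' estimator): the long-run scale is proxied by a rolling median of |x|, and the GARCH(1,1) parameters of the rescaled series are then estimated by a least-absolute-deviations criterion on log x_t^2. The window length and starting values are arbitrary choices of this sketch.

        import numpy as np
        from scipy.optimize import minimize

        def lad_garch11(x, w=250):
            # Step 1: crude nonparametric long-run scale (rolling median of |x|).
            pad = np.pad(np.abs(x), w // 2, mode="edge")
            scale = np.array([np.median(pad[t:t + w]) for t in range(len(x))])
            u = x / scale

            # Step 2: LAD-type objective for GARCH(1,1) on the rescaled series.
            def lad(theta):
                omega, alpha, beta = theta
                if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
                    return np.inf
                s2 = np.full(len(u), u.var())
                for t in range(1, len(u)):
                    s2[t] = omega + alpha * u[t - 1] ** 2 + beta * s2[t - 1]
                return np.abs(np.log(u ** 2 + 1e-12) - np.log(s2)).sum()

            res = minimize(lad, x0=[0.05, 0.05, 0.90], method="Nelder-Mead")
            return scale, res.x

    Minimising absolute rather than squared deviations of log x_t^2 is what lends robustness to heavy-tailed errors, compared with Gaussian quasi-likelihood.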
  5. By: Mallory, M.; Lence, Sergio H.
    Abstract: This study explores the performance of the Johansen cointegration statistics on data containing negative moving average (NMA) errors. Monte Carlo experiments demonstrate that the asymptotic distributions of the statistics are sensitive to NMA parameters, and that using the standard 5% asymptotic critical values results in severe underestimation of the actual test sizes. We demonstrate that the problems associated with NMA errors do not decrease as the sample size increases; instead, they become more severe. Further, we examine evidence that many U.S. commodity prices are characterized by NMA errors. Pretesting the data is recommended before using standard asymptotic critical values for Johansen's cointegration tests.
    Keywords: cointegration; Johansen cointegration test; moving average
    JEL: C22
    Date: 2012–12–31
    URL: http://d.repec.org/n?u=RePEc:isu:genres:36076&r=ets
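    A Monte Carlo design of this kind can be mimicked in a few lines; the sketch below assumes statsmodels' coint_johansen and uses illustrative settings (T, lag order, replication count). Two independent I(1) series with MA(1) innovations are generated, so the trace test of r = 0 has a true null, and its empirical rejection rate at the nominal 5% level measures size.

        import numpy as np
        from statsmodels.tsa.vector_ar.vecm import coint_johansen

        def empirical_size(theta, T=200, reps=500, seed=0):
            rng = np.random.default_rng(seed)
            rejections = 0
            for _ in range(reps):
                e = rng.standard_normal((T + 1, 2))
                u = e[1:] + theta * e[:-1]   # MA(1) innovations with coefficient theta
                y = np.cumsum(u, axis=0)     # two independent I(1) series, no cointegration
                res = coint_johansen(y, det_order=0, k_ar_diff=2)
                rejections += res.lr1[0] > res.cvt[0, 1]  # trace stat vs 5% critical value
            return rejections / reps

        # Negative MA coefficients near -1 should push the size well above 0.05.
        print(empirical_size(-0.8))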
  6. By: Massimiliano Caporin (Department of Economics and Management “Marco Fanno”, University of Padova, Italy); Michael McAleer (Econometric Institute, Erasmus School of Economics, Erasmus University Rotterdam and Tinbergen Institute, The Netherlands; Department of Quantitative Economics, Complutense University of Madrid; and Institute of Economic Research, Kyoto University)
    Abstract: The purpose of the paper is to discuss ten things potential users should know about the limits of the Dynamic Conditional Correlation (DCC) representation for estimating and forecasting time-varying conditional correlations. The reasons given for caution about the use of DCC include the following: DCC represents the dynamic conditional covariances of the standardized residuals, and hence does not yield dynamic conditional correlations; DCC is stated rather than derived; DCC has no moments; DCC does not have testable regularity conditions; DCC yields inconsistent two-step estimators; DCC has no asymptotic properties; DCC is not a special case of GARCC, which has testable regularity conditions and standard asymptotic properties; DCC is not dynamic empirically, as the effect of news is typically extremely small; DCC cannot be distinguished empirically from diagonal BEKK in small systems; and DCC may be a useful filter or a diagnostic check, but it is not a model.
    Keywords: DCC, BEKK, GARCC, Stated representation, Derived model, Conditional covariances, Conditional correlations, Regularity conditions, Moments, Two step estimators, Assumed properties, Asymptotic properties, Filter, Diagnostic check.
    JEL: C18 C32 C58 G17
    Date: 2013–03
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:854&r=ets
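    The DCC recursion itself is only a few lines; the sketch below (with illustrative, not fitted, parameter values a and b) makes the paper's first point concrete: Q_t is a recursion on outer products of standardized residuals, and the correlation matrix R_t is obtained only by rescaling Q_t afterwards.

        import numpy as np

        def dcc_correlations(z, a=0.05, b=0.90):
            # z: T x N matrix of standardized residuals from univariate GARCH fits.
            T, N = z.shape
            S = np.corrcoef(z, rowvar=False)  # unconditional correlation target
            Q = S.copy()
            R = np.empty((T, N, N))
            for t in range(T):
                d = 1.0 / np.sqrt(np.diag(Q))
                R[t] = Q * np.outer(d, d)     # rescale Q_t into a correlation matrix
                Q = (1 - a - b) * S + a * np.outer(z[t], z[t]) + b * Q
            return R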
  7. By: WOŹNIAK, Tomasz
    Abstract: Recent economic developments have shown the importance of spillover and contagion effects in financial markets as well as in macroeconomic reality. Such effects are not limited to relations between the levels of variables but also affect their volatilities and distributions. Granger causality in the conditional means and conditional variances of time series is investigated in the framework of several popular multivariate econometric models. Bayesian inference is proposed as a method of assessing hypotheses of Granger noncausality. First, the family of ECCC-GARCH models is used in order to perform inference about Granger-causal relations in second conditional moments. The restrictions for second-order Granger noncausality between two vectors of variables are derived. Further, in order to investigate Granger causality in the conditional means and conditional variances of time series, VARMA-GARCH models are employed. Parametric restrictions for the hypothesis of noncausality in conditional variances between two groups of variables, when there are other variables in the system as well, are derived. These novel conditions are convenient for the analysis of potentially large systems of economic variables. The Bayesian testing procedures applied to these two problems, Bayes factors and a Lindley-type test, make testing possible regardless of the form of the restrictions on the parameters of the model. This approach also enables the assumptions about the existence of higher-order moments of the processes, required by classical tests, to be relaxed. Finally, a method of testing restrictions for Granger noncausality in mean, variance and distribution in the framework of Markov-switching VAR models is proposed. Due to the nonlinearity of the restrictions derived by Warne (2000), classical tests have limited use. Bayesian inference consists of a novel block Metropolis-Hastings sampling algorithm for the estimation of the restricted models, and of standard methods of computing posterior odds ratios. The analysis may be applied to financial and macroeconomic time series with changes of parameter values over time and heteroskedasticity.
    Keywords: GARCH model; Bayesian statistical decision theory; finance – econometric models
    Date: 2012
    URL: http://d.repec.org/n?u=RePEc:ner:euiflo:urn:hdl:1814/25136&r=ets
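    As a very loose illustration of a posterior restriction check in this spirit (not the thesis' Block Metropolis-Hastings machinery): given posterior draws of the parameters that must vanish for second-order noncausality, for instance the off-diagonal ARCH and GARCH coefficients of a bivariate ECCC-GARCH, one can ask whether the zero vector lies in a normal-approximation highest-posterior-density region. The chi-square calibration below is an assumption of this sketch, not the thesis' Lindley-type test.

        import numpy as np
        from scipy.stats import chi2

        def zero_in_hpd(draws, level=0.95):
            # draws: M x k matrix (k >= 2) of posterior draws of the restricted parameters.
            m = draws.mean(axis=0)
            V = np.cov(draws, rowvar=False)
            d2 = m @ np.linalg.solve(V, m)   # Mahalanobis distance of 0 from the posterior mean
            return d2 <= chi2.ppf(level, df=draws.shape[1])
            # True -> the noncausality restriction is compatible with the posterior.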
  8. By: Choudhary, Ali; Hanif, Nadim; Iqbal, Javed
    Abstract: In business cycle research, smoothing the data is an essential step, in that it can influence the extent to which model-generated moments stand up to their empirical counterparts. To demonstrate this idea, we compare the results of McDermott's (1997) modified HP filter with those of the conventional HP filter on the properties of simulated and actual macroeconomic series. Our simulations suggest that the modified HP filter better approximates the true cyclical series. This is true for temporally aggregated data as well. Furthermore, we find that although the autoregressive properties of the smoothed observed series are immune to the smoothing procedure, the multivariate analysis is not. As a result, we recommend, and hence provide, series-, country- and frequency-specific smoothing parameters.
    Keywords: Business Cycles; Cross Country Comparisons; Smoothing Parameter; Time Aggregation
    JEL: C32 C43 E32
    Date: 2013–03–28
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:45630&r=ets
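    The conventional HP filter has a closed form that makes such comparisons easy to replicate: the trend solves tau = (I + lambda D'D)^{-1} y, with D the second-difference operator. A minimal sparse-matrix sketch follows; McDermott's (1997) modification replaces the fixed lambda with a data-driven choice, which is not reproduced here.

        import numpy as np
        from scipy import sparse
        from scipy.sparse.linalg import spsolve

        def hp_filter(y, lam=1600.0):
            # Minimises sum (y_t - tau_t)^2 + lam * sum (second difference of tau_t)^2.
            n = len(y)
            I = sparse.eye(n, format="csc")
            D = sparse.diags([1.0, -2.0, 1.0], [0, 1, 2], shape=(n - 2, n)).tocsc()
            trend = spsolve(I + lam * (D.T @ D), np.asarray(y, dtype=float))
            return trend, y - trend  # (trend, cycle)

    lam = 1600 is the conventional choice for quarterly data; the paper's point is precisely that such a one-size-fits-all value is inadequate across series, countries and frequencies.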
  9. By: Preinerstorfer, David; Pötscher, Benedikt M.
    Abstract: Testing restrictions on regression coefficients in linear models often requires correcting the conventional F-test for potential heteroskedasticity or autocorrelation amongst the disturbances, leading to so-called heteroskedasticity and autocorrelation robust test procedures. These procedures have been developed with the purpose of attenuating the size distortions and power deficiencies present for the uncorrected F-test. We develop a general theory to establish positive as well as negative finite-sample results concerning the size and power properties of a large class of heteroskedasticity and autocorrelation robust tests. Using these results we show that nonparametrically as well as parametrically corrected F-type tests in time series regression models with stationary disturbances have either size equal to one or nuisance-infimal power equal to zero under very weak assumptions on the covariance model and under generic conditions on the design matrix. In addition, we suggest an adjustment procedure based on artificial regressors. This adjustment resolves the problem in many cases, in that the so-adjusted tests do not suffer from size distortions, while at the same time their power function is bounded away from zero. As a second application we discuss the case of heteroskedastic disturbances.
    Keywords: Size distortion, power deficiency, invariance, robustness, autocorrelation, heteroscedasticity, HAC, fixed-bandwidth, long-run-variance, feasible GLS
    JEL: C12 C20
    Date: 2013–01
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:45675&r=ets
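    To fix ideas, here is a minimal sketch of the kind of statistic the paper studies: a fixed-bandwidth Bartlett (Newey-West) long-run variance estimator and the resulting HAC t-statistic for a slope coefficient. The bandwidth and regression design are illustrative choices of this sketch.

        import numpy as np

        def bartlett_lrv(v, bandwidth):
            # Bartlett-kernel long-run variance of a scalar series v.
            v = v - v.mean()
            T = len(v)
            gamma = lambda j: (v[j:] * v[:T - j]).sum() / T  # autocovariance at lag j
            lrv = gamma(0)
            for j in range(1, bandwidth + 1):
                lrv += 2.0 * (1.0 - j / (bandwidth + 1)) * gamma(j)
            return lrv

        def hac_tstat(y, x, bandwidth=4):
            # t-statistic for the slope in y = a + b*x with a HAC standard error.
            X = np.column_stack([np.ones_like(x), x])
            b = np.linalg.lstsq(X, y, rcond=None)[0]
            score = (y - X @ b) * (x - x.mean())
            se = np.sqrt(bartlett_lrv(score, bandwidth) / len(y)) / np.var(x)
            return b[1] / se

    The paper's size-one/power-zero results concern exactly such statistics under generic design matrices and stationary covariance models, which motivates its artificial-regressor adjustment.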

This nep-ets issue is ©2013 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.