nep-ets New Economics Papers
on Econometric Time Series
Issue of 2017‒10‒29
seven papers chosen by
Yong Yin
SUNY at Buffalo

  1. Estimation of Structural Impulse Responses: Short-Run versus Long-run Identifying Restrictions By Lütkepohl, Helmut; Staszewska-Bystrova, Anna; Winker, Peter
  2. Robust Maximum Likelihood Estimation of Sparse Vector Error Correction Model By Ziping Zhao; Daniel P. Palomar
  3. Trends and Cycles in Macro Series: The Case of US Real GDP By Guglielmo Maria Caporale; Luis A. Gil-Alana
  4. The Local Power of the IPS Test with Both Initial Conditions and Incidental Trends By Kajal Lahiri; Zhongwen Liang; Huaming Peng
  5. Use of unit root methods in early warning of financial crises By Timo Virtanen; Eero Tölö; Matti Virén; Katja Taipalus
  6. State Space Approach to Adaptive Fuzzy Modeling: Application to Financial Investment By Masafumi Nakano; Akihiko Takahashi; Soichiro Takahashi
  7. The Asymptotic Validity of "Standard" Fully Modified OLS Estimation and Inference in Cointegrating Polynomial Regressions By Stypka, Oliver; Wagner, Martin; Grabarczyk, Peter; Kawka, Rafael

  1. By: Lütkepohl, Helmut; Staszewska-Bystrova, Anna; Winker, Peter
    Abstract: There is evidence that estimates of long-run impulse responses of structural vector autoregressive (VAR) models based on long-run identifying restrictions may not be very accurate. We compare structural VAR impulse response estimates based on long-run and short-run identifying restrictions and find that long-run identifying restrictions can result in much more precise estimates for the structural impulse responses than restrictions on the impact effects of the shocks.
    JEL: C32
    Date: 2017
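As an illustration of what short-run (impact) identifying restrictions amount to in practice, here is a minimal sketch assuming a simulated bivariate VAR(1) and a recursive (Cholesky) identification; the long-run restrictions the paper compares against would instead constrain the accumulated responses. All numbers are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a bivariate VAR(1), y_t = A y_{t-1} + u_t (coefficients illustrative)
A_true = np.array([[0.5, 0.1], [0.2, 0.4]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(size=2)

# OLS estimation of the VAR coefficient matrix and residual covariance
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T
U = Y - X @ A_hat.T
Sigma = U.T @ U / (len(U) - 2)

# Short-run identification: lower-triangular impact matrix from a Cholesky
# factorization, i.e. the second shock has no impact effect on the first variable
P = np.linalg.cholesky(Sigma)

# Structural impulse responses Theta_h = A^h P for horizons 0..10
Theta = [np.linalg.matrix_power(A_hat, h) @ P for h in range(11)]
print(np.round(Theta[0], 2))  # impact effects obey the zero restriction
```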
  2. By: Ziping Zhao; Daniel P. Palomar
    Abstract: In econometrics and finance, the vector error correction model (VECM) is an important time series model for cointegration analysis, which is used to estimate the long-run equilibrium variable relationships. The traditional analysis and estimation methodologies assume the underlying Gaussian distribution but, in practice, heavy-tailed data and outliers can lead to the inapplicability of these methods. In this paper, we propose a robust model estimation method based on the Cauchy distribution to tackle this issue. In addition, sparse cointegration relations are considered to realize feature selection and dimension reduction. An efficient algorithm based on the majorization-minimization (MM) method is applied to solve the proposed nonconvex problem. The performance of this algorithm is shown through numerical simulations.
    Date: 2017–10
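A scalar sketch of the Cauchy-based MM idea, assuming a plain regression with hypothetical data rather than the paper's full sparse VECM: each MM step solves a weighted least-squares problem whose weights downweight outlying residuals, which majorizes the Cauchy negative log-likelihood.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative regression with heavy-tailed noise: y = x @ beta + Cauchy errors
n = 400
x = rng.normal(size=(n, 2))
beta_true = np.array([1.0, -2.0])
y = x @ beta_true + rng.standard_cauchy(size=n)

# MM iteration for the Cauchy negative log-likelihood sum_i log(gamma^2 + r_i^2):
# each step is weighted least squares with weights w_i = 1 / (gamma^2 + r_i^2),
# so large residuals get small weight (gamma held fixed here)
gamma = 1.0
beta = np.zeros(2)
for _ in range(100):
    r = y - x @ beta
    w = 1.0 / (gamma**2 + r**2)
    beta = np.linalg.solve(x.T @ (x * w[:, None]), x.T @ (y * w))
print(np.round(beta, 2))
```

Ordinary least squares has no finite-variance guarantee under Cauchy errors; the reweighting is what restores robustness.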
  3. By: Guglielmo Maria Caporale; Luis A. Gil-Alana
    Abstract: In this paper we propose a new modelling framework for the analysis of macro series that includes both stochastic trends and stochastic cycles in addition to deterministic terms such as linear and non-linear trends. We examine four US macro series, namely annual and quarterly real GDP and GDP per capita. The results indicate that the behaviour of US GDP can be captured accurately by a model incorporating both stochastic trends and stochastic cycles that allows for some degree of persistence in the data. Both appear to be mean-reverting, although the stochastic trend is nonstationary whilst the cyclical component is stationary, with cycles repeating themselves every 6–10 years.
    Keywords: GDP, GDP per capita, trends, cycles, long memory, fractional integration
    JEL: C22 E32
    Date: 2017
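The fractional integration machinery behind such models rests on the binomial expansion of the fractional difference operator (1 - L)^d. A minimal sketch of the filter (illustrative, not the authors' estimation code):

```python
import numpy as np

def frac_diff_weights(d, n):
    """Binomial-expansion weights of (1 - L)^d: w_0 = 1, w_k = w_{k-1}*(k-1-d)/k."""
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - d) / k
    return w

def frac_diff(x, d):
    """Apply the fractional difference (1 - L)^d to a series, truncated at t."""
    w = frac_diff_weights(d, len(x))
    return np.array([w[: t + 1] @ x[t::-1] for t in range(len(x))])

# Sanity check: d = 1 reproduces the ordinary first difference
x = np.cumsum(np.ones(5))    # 1, 2, 3, 4, 5
print(frac_diff(x, 1.0))     # -> [1. 1. 1. 1. 1.]
```

For 0 < d < 1 the weights decay hyperbolically, which is what generates the long-memory, mean-reverting behaviour the abstract refers to.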
  4. By: Kajal Lahiri; Zhongwen Liang; Huaming Peng
    Abstract: This paper investigates the asymptotic local power of the averaged t-test of Im, Pesaran and Shin (2003, IPS hereafter) in the presence of both explosive initial conditions and incidental trends. Utilizing least squares detrending, it is found that the initial condition plays no role in determining the asymptotic local power of the IPS test, a result strikingly different from the finding in Harris et al. (2010), who examined the impact of initial conditions on the local power of the IPS test without incidental trends. The paper also presents, via an application of the Fredholm method discussed in Nabeya and Tanaka (1990a, 1990b), the exact asymptotic local power of the IPS test, thereby providing theoretical justification for its lack of asymptotic local power in neighborhoods of unity shrinking at the rate N^{-1/2}T^{-1}, while it attains nontrivial power in neighborhoods of unity shrinking at the rate N^{-1/4}T^{-1}. This latter finding is consistent with Moon et al. (2007) and extends their results to the IPS test. It is also of practical significance to empirical researchers, as the presence of incidental trends is ubiquitous in panel unit root testing.
    Keywords: panel data, unit root test, individual heterogeneity
    JEL: C13 C22 C23
    Date: 2017
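The averaged statistic at the heart of the IPS test can be sketched as follows, assuming simulated unit-root panels with incidental linear trends and no lag augmentation; the standardization that gives the statistic a normal limit is omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

def adf_t(y):
    """t-statistic on rho in: Delta y_t = a + b*t + rho*y_{t-1} + e_t
    (no lag augmentation, which suffices for i.i.d. errors)."""
    dy = np.diff(y)
    n = len(dy)
    X = np.column_stack([np.ones(n), np.arange(n), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ coef
    s2 = resid @ resid / (n - 3)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[2, 2])
    return coef[2] / se

# A panel of N independent unit-root series with incidental linear trends
N, T = 20, 200
t_stats = [adf_t(0.5 * np.arange(T) + np.cumsum(rng.normal(size=T)))
           for _ in range(N)]
t_bar = np.mean(t_stats)   # the IPS averaged statistic (before standardization)
print(round(t_bar, 2))
```

Under the unit-root null with a trend term, each individual t-statistic is centred around roughly -2.2, so the averaged statistic is clearly negative even before standardization.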
  5. By: Timo Virtanen; Eero Tölö; Matti Virén; Katja Taipalus
    Abstract: In several recent studies unit root methods have been used to detect financial bubbles in asset prices. The basic idea is that fundamental changes in the autocorrelation structure of the relevant time series imply the presence of a rational price bubble. We provide cross-country evidence on the performance of unit-root-based early warning systems in ex-ante prediction of financial crises in 15 EU countries over the past three decades. We find especially high performance for time series that are explicitly related to debt, which issue signals a few years in advance of a crisis. Combining signals from multiple time series further improves the predictions. Our results suggest that an early warning tool based on unit root methods provides a valuable accessory in financial stability supervision.
    Keywords: financial crises, unit root, combination of forecasts
    JEL: G01 G14 G21
    Date: 2017–06
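One common unit-root-based warning indicator of this kind is a recursive right-tailed ADF statistic (a simplified supremum-ADF sketch in the spirit of Phillips-type bubble tests, not the authors' specific method): explosive episodes in a series push the statistic far above the values a random walk produces.

```python
import numpy as np

rng = np.random.default_rng(3)

def adf_t(y):
    """t-statistic on rho in: Delta y_t = a + rho*y_{t-1} + e_t."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    coef, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ coef
    s2 = resid @ resid / (len(dy) - 2)
    se = np.sqrt(s2 * np.linalg.inv(X.T @ X)[1, 1])
    return coef[1] / se

def sup_adf(y, min_window=40):
    """Right-tailed supremum ADF over expanding windows."""
    return max(adf_t(y[:t]) for t in range(min_window, len(y) + 1))

# A pure random walk versus the same walk with an explosive final segment
random_walk = np.cumsum(rng.normal(size=200))
bubble = np.concatenate([random_walk[:150],
                         random_walk[149] + 1.1 ** np.arange(1, 51)])
print(round(sup_adf(random_walk), 2), round(sup_adf(bubble), 2))
```

An early warning rule would flag periods in which the recursive statistic crosses a simulated critical value.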
  6. By: Masafumi Nakano (Graduate School of Economics, the University of Tokyo); Akihiko Takahashi (Graduate School of Economics, the University of Tokyo); Soichiro Takahashi (Graduate School of Economics, the University of Tokyo)
    Abstract: This paper proposes a new state space approach to adaptive fuzzy modeling in a dynamic environment, where Bayesian filtering sequentially learns the model parameters, including the model structures themselves, as state variables. In particular, our approach specifies the state transitions as mean-reversion processes, which is intended to incorporate and extend established state-of-the-art learning techniques as follows: First, the mean-reversion levels of the model parameters are determined by applying an existing learning method to a training period. Next, filtering over the test data enables on-line estimation of the parameters, where the estimates are adaptively tuned on each new data arrival based on the reliable learning result obtained beforehand. In this work, we concretely design a Takagi-Sugeno-Kang fuzzy model for financial investment, whose parameters follow autoregressive processes with mean-reversion levels decided by particle swarm optimization. Since Monte Carlo simulation-based algorithms known as particle filters are available, our methodology is applicable to quite general settings, including the non-linearity that actually arises in our investment problem. An out-of-sample numerical experiment with security price data successfully demonstrates its effectiveness.
    Date: 2017–10
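A scalar sketch of the filtering idea, assuming a hypothetical one-parameter model (the mean-reversion level, persistence and noise scales below are made up, standing in for values a training-period method would supply) and a bootstrap particle filter with multinomial resampling:

```python
import numpy as np

rng = np.random.default_rng(4)

# Assumed level, persistence and noise scales (illustrative)
beta_bar, phi, q, r = 1.0, 0.9, 0.1, 0.5
T, n_particles = 300, 1000

# Simulate a "true" mean-reverting parameter path and observations
# y_t = beta_t * x_t + noise
beta = np.empty(T)
beta[0] = beta_bar
for t in range(1, T):
    beta[t] = beta_bar + phi * (beta[t - 1] - beta_bar) + q * rng.normal()
x = rng.normal(size=T)
y = beta * x + r * rng.normal(size=T)

# Bootstrap particle filter: propagate through the mean-reverting transition,
# weight by the Gaussian likelihood, then resample
particles = beta_bar + 0.5 * rng.normal(size=n_particles)
est = np.empty(T)
for t in range(T):
    particles = beta_bar + phi * (particles - beta_bar) + q * rng.normal(size=n_particles)
    logw = -0.5 * ((y[t] - particles * x[t]) / r) ** 2
    w = np.exp(logw - logw.max())   # max-subtraction avoids underflow
    w /= w.sum()
    est[t] = w @ particles          # posterior-mean estimate of beta_t
    particles = rng.choice(particles, size=n_particles, p=w)  # multinomial resampling

rmse = np.sqrt(np.mean((est - beta) ** 2))
print(round(rmse, 3))
```

Because the filter only needs forward simulation and pointwise likelihood evaluation, the same scheme carries over to non-linear models such as the fuzzy rule systems the abstract describes.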
  7. By: Stypka, Oliver (Faculty of Statistics, Technical University Dortmund); Wagner, Martin (Faculty of Statistics, Technical University Dortmund, Institute for Advanced Studies, Vienna and Bank of Slovenia, Ljubljana); Grabarczyk, Peter (Faculty of Statistics, Technical University Dortmund); Kawka, Rafael (Faculty of Statistics, Technical University Dortmund)
    Abstract: The paper considers estimation and inference in cointegrating polynomial regressions, i.e., regressions that include deterministic variables, integrated processes and their powers as explanatory variables. The stationary errors are allowed to be serially correlated and the regressors are allowed to be endogenous. The main result shows that estimating such relationships using the Phillips and Hansen (1990) fully modified OLS approach developed for linear cointegrating relationships, incorrectly treating all integrated regressors and their powers as integrated regressors, leads to the same limiting distribution as the Wagner and Hong (2016) fully modified type estimator developed for cointegrating polynomial regressions. A key ingredient for the main result is a set of novel limit results for kernel-weighted sums of properly scaled nonstationary processes involving scaled powers of integrated processes. Even though the simulation results indicate performance advantages of the Wagner and Hong (2016) estimator that are partly present even in large samples, the results of the paper drastically enlarge the usability of the Phillips and Hansen (1990) estimator as implemented in many software packages.
    Keywords: Cointegrating Polynomial Regression, Cointegration Test, Environmental Kuznets Curve, Fully Modified OLS Estimation, Integrated Process, Nonlinearity
    JEL: C13 C32
    Date: 2017–10
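A sketch of the cointegrating polynomial regression setup, with illustrative coefficients and exogenous i.i.d. errors so that plain OLS is already (super-)consistent; the fully modified corrections the paper studies matter under endogeneity and serially correlated errors and are omitted here.

```python
import numpy as np

rng = np.random.default_rng(5)

# Cointegrating polynomial regression: y_t = c + b1*x_t + b2*x_t^2 + u_t,
# with x_t an integrated (random walk) regressor; coefficients illustrative
T = 1000
x = np.cumsum(rng.normal(size=T))
y = 2.0 + 0.5 * x - 0.1 * x**2 + rng.normal(size=T)

# OLS on the levels and the squared integrated regressor
X = np.column_stack([np.ones(T), x, x**2])
coef = np.linalg.lstsq(X, y, rcond=None)[0]
print(np.round(coef, 2))
```

The slope coefficients converge at faster-than-root-T rates (the power term fastest), which is why even moderate samples recover them sharply; valid inference under general errors is where the fully modified machinery comes in.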

This nep-ets issue is ©2017 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at its homepage. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.