nep-ets New Economics Papers
on Econometric Time Series
Issue of 2016‒01‒18
twelve papers chosen by
Yong Yin
SUNY at Buffalo

  1. Maximum Likelihood Estimation of Time-Varying Loadings in High-Dimensional Factor Models By Jakob Guldbæk Mikkelsen; Eric Hillebrand; Giovanni Urga
  2. Fixed-b Inference in the Presence of Time-Varying Volatility By Matei Demetrescu; Christoph Hanck; Robinson Kruse
  3. Long memory and multifractality: A joint test By John Goddard; Enrico Onali
  4. Irreversibility of financial time series: a graph-theoretical approach By Lucas Lacasa; Ryan Flanagan
  5. The varying coefficient Bayesian panel VAR model By Wieladek, Tomasz
  6. A Fixed-bandwidth View of the Pre-asymptotic Inference for Kernel Smoothing with Time Series Data By Kim, Min Seong; Sun, Yixiao; Yang, Jingjing
  7. Simple, Robust, and Accurate F and t Tests in Cointegrated Systems By Hwang, Jungbin; Sun, Yixiao
  8. Forecasting hierarchical and grouped time series through trace minimization By Shanika L Wickramasuriya; George Athanasopoulos; Rob J Hyndman
  9. ZD-GARCH model: a new way to study heteroscedasticity By Li, Dong; Ling, Shiqing; Zhu, Ke
  10. Alternative GMM Estimators for First-order Autoregressive Panel Model: An Improving Efficiency Approach By Youssef, Ahmed; Abonazel, Mohamed R.
  11. Improving the Efficiency of GMM Estimators for Dynamic Panel Models By Youssef, Ahmed H.; El-Sheikh, Ahmed A.; Abonazel, Mohamed R.
  12. New GMM Estimators for Dynamic Panel Data Models By Youssef, Ahmed H.; El-Sheikh, Ahmed A.; Abonazel, Mohamed R.

  1. By: Jakob Guldbæk Mikkelsen (Aarhus University and CREATES); Eric Hillebrand (Aarhus University and CREATES); Giovanni Urga (Cass Business School)
    Abstract: In this paper, we develop a maximum likelihood estimator of time-varying loadings in high-dimensional factor models. We specify the loadings to evolve as stationary vector autoregressions (VAR) and show that consistent estimates of the loadings parameters can be obtained by a two-step maximum likelihood estimation procedure. In the first step, principal components are extracted from the data to form factor estimates. In the second step, the parameters of the loadings VARs are estimated as a set of univariate regression models with time-varying coefficients. We document the finite-sample properties of the maximum likelihood estimator through an extensive simulation study and illustrate the empirical relevance of the time-varying loadings structure using a large quarterly dataset for the US economy.
    Keywords: High-dimensional factor models, dynamic factor loadings, maximum likelihood, principal components
    JEL: C33 C55 C13
    Date: 2015–12–15
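The two-step procedure lends itself to a short numerical sketch. The toy below simulates a static-loadings factor model as a stand-in for the data and runs both steps, with the second-step time-varying-coefficient regressions reduced to plain OLS; all sizes, seeds, and names are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N, r = 200, 50, 2  # periods, series, factors (toy sizes)

# Simulate a static-loadings factor model as a stand-in for the data
F = rng.standard_normal((T, r))
L = rng.standard_normal((N, r))
X = F @ L.T + 0.1 * rng.standard_normal((T, N))

# Step 1: principal-components factor estimates (normalisation F'F/T = I)
eigval, eigvec = np.linalg.eigh(X @ X.T)
F_hat = np.sqrt(T) * eigvec[:, -r:]    # top-r eigenvectors, rescaled

# Step 2: per-series regressions of x_it on the estimated factors give
# loading estimates; the paper lets these evolve as stationary VARs
L_hat = np.linalg.lstsq(F_hat, X, rcond=None)[0].T
```

The residual X - F_hat @ L_hat.T should be close to the idiosyncratic noise when the factor structure is strong.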
  2. By: Matei Demetrescu (Christian-Albrechts-University of Kiel); Christoph Hanck (University of Duisburg-Essen); Robinson Kruse (Rijksuniversiteit Groningen and CREATES)
    Abstract: The fixed-b asymptotic framework provides refinements in the use of heteroskedasticity and autocorrelation consistent variance estimators. The resulting limiting distributions of t-statistics are, however, not pivotal when the unconditional variance changes over time. Such time-varying volatility is an important issue for many financial and macroeconomic time series. To regain pivotal fixed-b inference under time-varying volatility, we discuss three alternative approaches. We (i) employ the wild bootstrap (Cavaliere and Taylor, 2008, ET), (ii) resort to time transformations (Cavaliere and Taylor, 2008, JTSA), and (iii) consider selecting test statistics and asymptotics according to the outcome of a heteroscedasticity test, since small-b asymptotics deliver standard limiting distributions irrespective of the so-called variance profile of the series. We quantify the degree of size distortion from using the standard fixed-b approach while assuming homoskedasticity, and compare the effectiveness of the corrections via simulations. It turns out that the wild bootstrap approach is highly recommendable in terms of size and power. An application to testing for equal predictive ability using the Survey of Professional Forecasters illustrates the usefulness of the proposed corrections.
    Keywords: Hypothesis testing, HAC estimation, HAR testing, Bandwidth, Robustness
    JEL: C12 C32
    Date: 2016–01–05
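Approach (i) can be illustrated in a few lines. The sketch below applies a Rademacher wild bootstrap to a simple mean test on a series with a volatility break; it is a minimal stand-in for the idea, not the paper's fixed-b setting, and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
T = 300
# Series with a mid-sample volatility break (time-varying unconditional variance)
sigma = np.where(np.arange(T) < T // 2, 1.0, 3.0)
y = sigma * rng.standard_normal(T)

# Observed t-statistic for H0: E[y] = 0
t_obs = y.mean() / (y.std(ddof=1) / np.sqrt(T))

# Wild bootstrap: multiply centred data by Rademacher draws, which keeps
# the variance profile intact while imposing the null
B = 999
t_boot = np.empty(B)
centred = y - y.mean()
for b in range(B):
    y_star = centred * rng.choice([-1.0, 1.0], size=T)
    t_boot[b] = y_star.mean() / (y_star.std(ddof=1) / np.sqrt(T))
p_value = np.mean(np.abs(t_boot) >= abs(t_obs))
```

Because each bootstrap draw rescales the original observations pointwise, the heteroskedasticity pattern of the series is reproduced in every resample.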
  3. By: John Goddard; Enrico Onali
    Abstract: The properties of statistical tests for hypotheses concerning the parameters of the multifractal model of asset returns (MMAR) are investigated, using Monte Carlo techniques. We show that, in the presence of multifractality, conventional tests of long memory tend to over-reject the null hypothesis of no long memory. Our test addresses this issue by jointly estimating long memory and multifractality. The estimation and test procedures are applied to exchange rate data for 12 currencies. In 11 cases, the exchange rate returns are accurately described by compounding a NIID series with a multifractal time-deformation process. There is no evidence of long memory.
    Date: 2016–01
  4. By: Lucas Lacasa; Ryan Flanagan
    Abstract: The relation between time series irreversibility and entropy production has been recently investigated in thermodynamic systems operating away from equilibrium. In this work we explore this concept in the context of financial time series. We make use of visibility algorithms to quantify in graph-theoretical terms time irreversibility of 35 financial indices evolving over the period 1998-2012. We show that this metric is complementary to standard measures based on volatility and exploit it to both classify periods of financial stress and to rank companies accordingly. We then validate this approach by finding that a projection in principal components space of financial years based on time irreversibility features clusters together periods of financial stress from stable periods. Relations between irreversibility, efficiency and predictability are briefly discussed.
    Date: 2016–01
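The directed horizontal visibility graph that underlies this line of work is simple to construct: node i links forward to node j > i when every intermediate value lies below both x[i] and x[j], and irreversibility is then assessed by comparing the in- and out-degree distributions. The sketch below builds only the degree sequences; the divergence measure and the financial data are beyond this toy.

```python
import numpy as np

def hvg_degrees(x):
    """Directed horizontal visibility graph of series x: node i links
    forward to node j > i iff x[k] < min(x[i], x[j]) for all i < k < j.
    Returns (out-degrees, in-degrees)."""
    n = len(x)
    k_out = np.zeros(n, dtype=int)
    k_in = np.zeros(n, dtype=int)
    for i in range(n):
        blocker = -np.inf              # running max of intermediate values
        for j in range(i + 1, n):
            if blocker < x[i] and blocker < x[j]:
                k_out[i] += 1
                k_in[j] += 1
            blocker = max(blocker, x[j])
            if blocker >= x[i]:        # no later node can see i any more
                break
    return k_out, k_in

rng = np.random.default_rng(5)
k_out, k_in = hvg_degrees(rng.standard_normal(500))
```

For a statistically reversible series the two degree distributions coincide asymptotically, so a divergence between them signals time irreversibility.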
  5. By: Wieladek, Tomasz (Bank of England)
    Abstract: Interacted panel VAR (IPVAR) models allow coefficients to vary as a deterministic function of observable country characteristics. The varying coefficient Bayesian panel VAR generalises this to the stochastic case. As an application of this framework, I examine whether the impact of commodity price shocks on consumption and the CPI varies with the degree of exchange rate, financial, product and labour market liberalisation, using data from 1976 Q1–2006 Q4 for 18 OECD countries. The confidence bands are smaller in the deterministic case, and as a result most of the characteristics affect the transmission mechanism in a statistically significant way. But only financial liberalisation is an important determinant of the transmission of commodity price shocks in the stochastic case. This suggests that results from IPVAR models should be interpreted with caution.
    Keywords: Bayesian panel VAR; commodity price shocks
    JEL: C33 E30
    Date: 2016–01–08
  6. By: Kim, Min Seong; Sun, Yixiao; Yang, Jingjing
    Abstract: This paper develops robust testing procedures for nonparametric kernel methods in the presence of temporal dependence of unknown form. Based on the fixed-bandwidth asymptotic variance and the pre-asymptotic variance, we propose a heteroskedasticity and autocorrelation robust (HAR) variance estimator that achieves double robustness: it is asymptotically valid regardless of whether the temporal dependence is present or not, and regardless of whether the kernel smoothing bandwidth is held constant or allowed to decay with the sample size. Using the HAR variance estimator, we construct the studentized test statistic and examine its asymptotic properties under both the fixed-smoothing and increasing-smoothing asymptotics. The fixed-smoothing approximation and the associated convenient t-approximation achieve extra robustness: they are asymptotically valid regardless of whether the truncation lag parameter governing the covariance weighting grows at the same rate as, or a slower rate than, the sample size. Finally, we suggest a simulation-based calibration approach to choose smoothing parameters that optimize testing-oriented criteria. Simulations show that the proposed procedures work very well in finite samples.
    Keywords: heteroskedasticity and autocorrelation robust variance, calibration, fixed-smoothing asymptotics, fixed-bandwidth asymptotics, kernel density estimator, local polynomial estimator, t-approximation, testing-optimal smoothing-parameter choice, temporal dependence
    Date: 2016–01–04
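As background, a generic Bartlett-kernel HAR long-run variance estimator (a common building block for this kind of inference, not the paper's fixed-bandwidth estimator) can be sketched as follows; the truncation lag M is an illustrative choice.

```python
import numpy as np

def bartlett_lrv(v, M):
    """Bartlett-kernel long-run variance of a series v with truncation
    lag M: gamma_0 + 2 * sum_{j=1}^{M} (1 - j/(M+1)) * gamma_j."""
    T = len(v)
    v = v - v.mean()
    lrv = v @ v / T                       # gamma_0
    for j in range(1, M + 1):
        gamma_j = v[j:] @ v[:-j] / T      # lag-j sample autocovariance
        lrv += 2.0 * (1.0 - j / (M + 1)) * gamma_j
    return lrv

rng = np.random.default_rng(6)
lrv = bartlett_lrv(rng.standard_normal(5000), M=10)
```

For an i.i.d. unit-variance series the estimate should be close to 1; the Bartlett weights also guarantee a non-negative estimate.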
  7. By: Hwang, Jungbin; Sun, Yixiao
    Abstract: This paper proposes new, simple, and more accurate statistical tests in a cointegrated system that allows for endogenous regressors and serially dependent errors. The approach involves first transforming the time series using some orthonormal basis functions in L²[0,1], which has energy concentrated at low frequencies, and then running an augmented regression based on the transformed data. The tests are extremely simple to implement as they can be carried out in exactly the same way as if the transformed regression is a classical linear normal regression. In particular, critical values are from the standard F or t distribution. The proposed F and t tests are robust in that they are asymptotically valid regardless of whether the number of basis functions is held fixed or allowed to grow with the sample size. The F and t tests have more accurate size in finite samples than existing tests such as the asymptotic chi-squared and normal tests based on the fully-modified OLS estimator of Phillips and Hansen (1990) and the trend IV estimator of Phillips (2014) and can be made as powerful as the latter tests.
    Keywords: Cointegration, F test, Alternative Asymptotics, Nonparametric Series Method, t test, Transformed and Augmented OLS
    Date: 2016–01–04
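The orthonormal-basis transformation at the heart of the approach can be illustrated with the Type-II cosine basis, whose low-order functions have energy concentrated at low frequencies. The sketch below verifies orthonormality and runs a transformed toy regression; K and the regression itself are illustrative choices, not the paper's cointegration setup.

```python
import numpy as np

T, K = 200, 12                      # sample size; number of basis functions
s = (np.arange(T) + 0.5) / T        # sample points in (0, 1)
# phi_k(s) = sqrt(2) * cos(pi * k * s), k = 1..K: orthonormal in L^2[0,1]
Phi = np.sqrt(2.0) * np.cos(np.pi * np.outer(np.arange(1, K + 1), s))

rng = np.random.default_rng(7)
x = rng.standard_normal(T)
y = 0.5 * x + rng.standard_normal(T)   # toy regression, coefficient 0.5

# Transform both sides by the basis and run OLS on the K transformed
# points; inference can then use t critical values whose degrees of
# freedom depend on K rather than on T
wy, wx = Phi @ y / T, Phi @ x / T
beta_hat = (wx @ wy) / (wx @ wx)
```

The key property is that the transformed observations behave like K approximately independent normal draws, which is what makes standard F and t critical values applicable.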
  8. By: Shanika L Wickramasuriya; George Athanasopoulos; Rob J Hyndman
    Abstract: Large collections of time series often have aggregation constraints due to product or geographical hierarchies. The forecasts for the disaggregated series are usually required to add up exactly to the forecasts of the aggregated series, a constraint known as “aggregate consistency”. The combination forecasts proposed by Hyndman et al. (2011) are based on a Generalized Least Squares (GLS) estimator and require an estimate of the covariance matrix of the reconciliation errors (i.e., the errors that arise due to aggregate inconsistency). We show that this is impossible to estimate in practice due to identifiability conditions.
    Keywords: Hierarchical time series, forecasting, reconciliation, contemporaneous error correlation, trace minimization
    JEL: C32 C53
    Date: 2015
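The GLS reconciliation that this work builds on can be shown on a two-level toy hierarchy. With identity weighting (the OLS special case of the trace-minimisation estimator), the reconciled forecasts are S(S'S)^{-1}S' applied to the base forecasts and are aggregate-consistent by construction; the hierarchy and numbers below are illustrative.

```python
import numpy as np

# Toy hierarchy: Total = A + B.  S maps the bottom level to every series.
S = np.array([[1.0, 1.0],   # Total
              [1.0, 0.0],   # A
              [0.0, 1.0]])  # B

# Incoherent base forecasts: the Total forecast (10) != A + B (4 + 5)
y_hat = np.array([10.0, 4.0, 5.0])

# Reconcile via y_tilde = S (S' W^{-1} S)^{-1} S' W^{-1} y_hat with W = I
W_inv = np.eye(3)
P = np.linalg.solve(S.T @ W_inv @ S, S.T @ W_inv)
y_tilde = S @ P @ y_hat
```

Replacing W with an estimate of the forecast-error covariance gives the GLS variant; the identifiability problem the abstract refers to concerns estimating that covariance.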
  9. By: Li, Dong; Ling, Shiqing; Zhu, Ke
    Abstract: This paper proposes a first-order zero-drift GARCH (ZD-GARCH(1,1)) model to study conditional heteroscedasticity and heteroscedasticity together. Unlike the classical GARCH model, the ZD-GARCH(1,1) model is always non-stationary regardless of the sign of the Lyapunov exponent $\gamma_{0}$, but interestingly, when $\gamma_{0} = 0$, it is stable, with its sample path oscillating randomly between zero and infinity over time. Furthermore, this paper studies the generalized quasi-maximum likelihood estimator (GQMLE) of the ZD-GARCH(1,1) model and establishes its strong consistency and asymptotic normality. Based on the GQMLE, an estimator for $\gamma_{0}$, a test for stability, and a portmanteau test for model checking are all constructed. Simulation studies are carried out to assess the finite-sample performance of the proposed estimators and tests. Applications demonstrate that a stable ZD-GARCH(1,1) model is more appropriate for capturing heteroscedasticity than a non-stationary GARCH(1,1) model, which suffers from an inconsistent QMLE of the drift term.
    Keywords: Conditional heteroscedasticity; GARCH model; Generalized quasi-maximum likelihood estimator; Heteroscedasticity; Portmanteau test; Stability test; Top Lyapunov exponent; Zero-drift GARCH model.
    JEL: C0 C01 C5 C51
    Date: 2016–01–01
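A ZD-GARCH(1,1) path is easy to simulate, and the top Lyapunov exponent gamma_0 = E[log(alpha * eta^2 + beta)] can be estimated from it. The parameter values below are illustrative (they imply a negative exponent, under which the volatility path decays), not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, beta = 0.1, 0.8       # illustrative values, not from the paper
T = 1000

# ZD-GARCH(1,1): h_t = alpha * y_{t-1}^2 + beta * h_{t-1}, no drift term
h = np.empty(T)
y = np.empty(T)
h[0] = 1.0
for t in range(T):
    y[t] = np.sqrt(h[t]) * rng.standard_normal()
    if t + 1 < T:
        h[t + 1] = alpha * y[t] ** 2 + beta * h[t]

# Top Lyapunov exponent estimate: gamma_0 = E[log(alpha * eta^2 + beta)]
eta2 = y ** 2 / h
gamma0_hat = np.mean(np.log(alpha * eta2 + beta))
```

The sign of gamma_0 separates the regimes: negative here, so h_t shrinks towards zero; the stable oscillating case in the abstract corresponds to gamma_0 = 0.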
  10. By: Youssef, Ahmed; Abonazel, Mohamed R.
    Abstract: This paper considers the first-order autoregressive panel model, a simple model for dynamic panel data (DPD). The generalized method of moments (GMM) gives efficient estimators for these models, but this efficiency is affected by the choice of the weighting matrix used in GMM estimation. Conventional GMM estimators rely on non-optimal weighting matrices, which leads to a loss of efficiency. We therefore present new GMM estimators based on optimal or suboptimal weighting matrices. A Monte Carlo study indicates that the bias and efficiency of the new estimators are more reliable than those of the conventional estimators.
    Keywords: Dynamic panel data, Generalized method of moments, Kantorovich inequality upper bound, Monte Carlo simulation, Optimal and suboptimal weighting matrices
    JEL: C4 C5 M21
    Date: 2015–09–28
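The role of the weighting matrix is easiest to see in a generic two-step linear GMM. The sketch below uses a toy cross-sectional moment model, not the paper's dynamic panel setting: a first step with the conventional (Z'Z)^{-1} weight, then a second step re-weighting by the inverse moment covariance estimated from first-step residuals; all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
z = rng.standard_normal((n, 2))                          # instruments
x = z @ np.array([1.0, 0.5]) + rng.standard_normal(n)    # regressor
u = 0.5 * np.sqrt(1.0 + z[:, 0] ** 2) * rng.standard_normal(n)  # heteroskedastic error
y = 2.0 * x + u                                          # true coefficient = 2

X = x[:, None]
A = X.T @ z

# One-step GMM with the conventional weight W1 = (Z'Z)^{-1}
W1 = np.linalg.inv(z.T @ z)
b1 = np.linalg.solve(A @ W1 @ A.T, A @ W1 @ z.T @ y)

# Two-step GMM: re-weight by the inverse moment covariance estimated from
# first-step residuals (the efficient weight under heteroskedasticity)
u1 = y - X @ b1
W2 = np.linalg.inv(z.T @ (z * u1[:, None] ** 2))
b2 = np.linalg.solve(A @ W2 @ A.T, A @ W2 @ z.T @ y)
```

Both estimators are consistent; the gain from the second step is in efficiency, which is the margin the paper targets with its optimal and suboptimal weighting matrices.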
  11. By: Youssef, Ahmed H.; El-Sheikh, Ahmed A.; Abonazel, Mohamed R.
    Abstract: In dynamic panel models, the generalized method of moments (GMM) has been used in many applications because it gives efficient estimators. This efficiency is affected by the choice of the initial weighting matrix. It is common practice to use the inverse of the moment matrix of the instruments as an initial weighting matrix. However, an initial optimal weighting matrix is not known, especially in the system GMM estimation procedure. We therefore present the optimal weighting matrix for the level GMM estimator and suboptimal weighting matrices for the system GMM estimator, and use these matrices to increase the efficiency of the GMM estimator. Using the Kantorovich inequality (KI), we find that the potential efficiency gain becomes large when the variance of the individual effects increases relative to the variance of the errors.
    Keywords: dynamic panel data, generalized method of moments, KI upper bound, optimal and suboptimal weighting matrices.
    JEL: C5 C6 C61
    Date: 2014–06–11
  12. By: Youssef, Ahmed H.; El-Sheikh, Ahmed A.; Abonazel, Mohamed R.
    Abstract: In dynamic panel data (DPD) models, generalized method of moments (GMM) estimation gives efficient estimators. However, this efficiency is affected by the choice of the initial weighting matrix. In practice, the inverse of the moment matrix of the instruments has been used as an initial weighting matrix, which leads to a loss of efficiency. We therefore present new GMM estimators based on optimal or suboptimal weighting matrices in GMM estimation. A Monte Carlo study indicates the potential efficiency gain from using these matrices; moreover, the bias and efficiency of the new GMM estimators are more reliable than those of the conventional GMM estimators.
    Keywords: Dynamic panel data, Generalized method of moments, Monte Carlo simulation, Optimal and suboptimal weighting matrices.
    JEL: C1 C15 C4 C5 C58
    Date: 2014–10

This nep-ets issue is ©2016 by Yong Yin. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.