nep-ets New Economics Papers
on Econometric Time Series
Issue of 2021‒04‒05
nine papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. Consistent Inference for Predictive Regressions in Persistent Economic Systems By Torben G. Andersen; Rasmus T. Varneskov
  2. SVARs With Occasionally-Binding Constraints By S. Borağan Aruoba; Marko Mlikota; Frank Schorfheide; Sergio Villalvazo
  3. A Powerful Subvector Anderson Rubin Test in Linear Instrumental Variables Regression with Conditional Heteroskedasticity By Patrik Guggenberger; Frank Kleibergen; Sophocles Mavroeidis
  4. Testing for the cointegration rank between Periodically Integrated processes By del Barrio Castro, Tomás
  5. Testing for threshold effects in the TARMA framework By Greta Goracci; Simone Giannerini; Kung-Sik Chan; Howell Tong
  6. Divide-and-Conquer: A Distributed Hierarchical Factor Approach to Modeling Large-Scale Time Series Data By Zhaoxing Gao; Ruey S. Tsay
  7. Local Projections vs. VARs: Lessons From Thousands of DGPs By Dake Li; Mikkel Plagborg-Møller; Christian K. Wolf
  8. Qualitative versus Quantitative External Information for Proxy Vector Autoregressive Analysis By Lukas Boer; Helmut Lütkepohl
  9. On Random Extended Intervals and their ARMA Processes By Babel Raïssa Guemdjo Kamdem; Jules Sadefo Kamdem; Carlos Ougouyandjou

  1. By: Torben G. Andersen; Rasmus T. Varneskov
    Abstract: We study standard predictive regressions in economic systems governed by persistent vector autoregressive dynamics for the state variables. In particular, all – or a subset – of the variables may be fractionally integrated, which induces a spurious regression problem. We propose a new inference and testing procedure – the Local speCtruM (LCM) approach – for joint significance of the regressors, which is robust against the variables having different integration orders and remains valid regardless of whether the predictors are significant and whether they induce cointegration. Specifically, the LCM procedure is based on fractional filtering and band-spectrum regression using a suitably selected set of frequency ordinates. Contrary to existing procedures, we establish a uniform Gaussian limit theory and a standard χ2-distributed test statistic. Using LCM inference and testing techniques, we explore predictive regressions for the realized return variation. Standard least squares inference indicates that popular financial and macroeconomic variables convey valuable information about future return volatility. In contrast, we find no significant evidence using our robust LCM procedure. If anything, our tests support a reverse chain of causality: rising financial volatility predates adverse innovations to macroeconomic variables. Simulations illustrate the relevance of the theoretical arguments for finite-sample inference.
    JEL: G12 G17
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:28568&r=all
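    A stylized numerical sketch of the band-spectrum idea behind the LCM procedure (illustrative only: the bandwidth rule below is hypothetical, and the paper's fractional filtering and frequency-ordinate selection are considerably more involved):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 512
x = rng.standard_normal(n).cumsum()      # persistent (I(1)-like) predictor
y = 0.5 * x + rng.standard_normal(n)     # outcome linked to x plus noise

# Band-spectrum regression: regress the DFT ordinates of y on those of x
# over a narrow low-frequency band. The band size n**0.65 is an arbitrary
# illustration, not the paper's selection rule.
wy = np.fft.fft(y)
wx = np.fft.fft(x)
band = np.arange(1, int(n ** 0.65))      # low frequencies, excluding zero
beta_hat = np.real(np.vdot(wx[band], wy[band]) / np.vdot(wx[band], wx[band]))
```

    With a persistent predictor the low-frequency ordinates dominate, so the band regression recovers the slope (0.5 here) despite the persistence.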
  2. By: S. Borağan Aruoba; Marko Mlikota; Frank Schorfheide; Sergio Villalvazo
    Abstract: We develop a structural VAR in which an occasionally-binding constraint generates censoring of one of the dependent variables. Once the censoring mechanism is triggered, we allow some of the coefficients for the remaining variables to change. We show that a necessary condition for a unique reduced form is that regression functions for the non-censored variables are continuous at the censoring point and that parameters satisfy some mild restrictions. In our application the censored variable is a nominal interest rate constrained by an effective lower bound (ELB). According to our estimates based on U.S. data, once the ELB becomes binding, the coefficients in the inflation equation change significantly, which translates into a change of the inflation responses to (unconventional) monetary policy and demand shocks. Our results suggest that the presence of the ELB is indeed empirically relevant for the propagation of shocks. We also obtain a shadow interest rate that shows a significant accommodation in the early parts of the Great Recession, followed by a mild and steady accommodation until liftoff in 2016.
    JEL: C11 C22 C34 E32 E52
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:28571&r=all
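    The censoring mechanism at the heart of the model can be illustrated with a small simulation (the coefficient values and the regime shift below are hypothetical, chosen only to mimic the paper's setup of an ELB-censored rate with regime-dependent inflation dynamics):

```python
import numpy as np

rng = np.random.default_rng(1)
T, elb = 400, 0.0
pi = np.zeros(T)        # inflation
i_star = np.zeros(T)    # shadow (uncensored) interest rate
i_obs = np.zeros(T)     # observed rate, censored at the ELB
for t in range(1, T):
    i_star[t] = 0.8 * i_star[t - 1] + 0.1 * pi[t - 1] + rng.normal(0, 0.5)
    i_obs[t] = max(i_star[t], elb)        # occasionally-binding constraint
    # Illustrative regime-dependent inflation coefficient (the paper
    # estimates this shift, subject to continuity at the censoring point):
    a = 0.5 if i_obs[t] > elb else 0.7
    pi[t] = a * pi[t - 1] - 0.1 * i_obs[t] + rng.normal(0, 0.3)

share_at_elb = np.mean(i_obs == elb)      # fraction of periods at the bound
```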
  3. By: Patrik Guggenberger; Frank Kleibergen; Sophocles Mavroeidis
    Abstract: We introduce a new test for a two-sided hypothesis involving a subset of the structural parameter vector in the linear instrumental variables (IVs) model. Guggenberger et al. (2019), GKM19 from now on, introduce a subvector Anderson-Rubin (AR) test with data-dependent critical values that has asymptotic size equal to nominal size for a parameter space that allows for arbitrary strength or weakness of the IVs and has uniformly nonsmaller power than the projected AR test studied in Guggenberger et al. (2012). However, GKM19 imposes the restrictive assumption of conditional homoskedasticity. The main contribution here is to robustify the procedure in GKM19 to arbitrary forms of conditional heteroskedasticity. We first adapt the method in GKM19 to a setup where a certain covariance matrix has an approximate Kronecker product (AKP) structure, which nests conditional homoskedasticity. The new test equals this adaptation when the data is consistent with AKP structure as decided by a model selection procedure. Otherwise, the test equals the AR/AR test in Andrews (2017) that is fully robust to conditional heteroskedasticity but less powerful than the adapted method. We show theoretically that the new test has asymptotic size bounded by the nominal size and document improved power relative to the AR/AR test in a wide array of Monte Carlo simulations when the covariance matrix is not too far from AKP.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.11371&r=all
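    As background, a minimal heteroskedasticity-robust Anderson-Rubin statistic for the full parameter vector can be computed as follows (a simplified illustration with made-up data; the paper's subvector test with AKP-based model selection is considerably more elaborate):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 500, 3
z = rng.standard_normal((n, k))                   # instruments
x = z @ np.array([0.6, 0.4, 0.2]) + rng.standard_normal(n)
beta0 = 1.0                                       # hypothesized value
# Conditionally heteroskedastic structural errors:
y = beta0 * x + rng.standard_normal(n) * (1 + 0.5 * np.abs(z[:, 0]))

# Robust AR statistic for H0: beta = beta0; asymptotically chi2(k) under H0
# regardless of instrument strength.
u = y - x * beta0
g = z.T @ u / n                                   # sample moments E[z_i u_i]
V = (z * u[:, None] ** 2).T @ z / n               # robust variance of the moments
ar_stat = n * g @ np.linalg.solve(V, g)
```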
  4. By: del Barrio Castro, Tomás
    Abstract: Cointegration between Periodically Integrated (PI) processes has been analyzed by, among others, Birchenhall, Bladen-Hovell, Chui, Osborn, and Smith (1989), Boswijk and Franses (1995), Franses and Paap (2004), Kleibergen and Franses (1999), and del Barrio Castro and Osborn (2008). However, no method published in an academic journal so far allows the cointegration rank between PI processes to be determined. This paper fills that gap: we propose a method to determine the cointegration rank among a set of PI processes, based on the pseudo-demodulation idea introduced in the context of seasonal cointegration by del Barrio Castro, Cubadda and Osborn (2020). Once a pseudo-demodulated time series is obtained, the Johansen (1995) procedure can be applied to determine the cointegration rank. A Monte Carlo experiment shows that the proposed approach works satisfactorily in small samples.
    Keywords: Reduced Rank Regression, Periodic Cointegration, Periodically Integrated Processes.
    JEL: C32
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:106603&r=all
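    The pseudo-demodulation idea can be illustrated for a single quarterly PI(1) process: rescaling each season by the accumulated periodic coefficients turns the series into an ordinary unit-root process (the coefficients below are arbitrary illustrative values satisfying the PI condition that they multiply to one; the exact transformation follows del Barrio Castro, Cubadda and Osborn, 2020):

```python
import numpy as np

rng = np.random.default_rng(3)
S, years = 4, 200                        # quarterly data
phi = np.array([1.25, 0.8, 2.0, 0.5])    # periodic AR coefficients, prod = 1
T = S * years
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi[t % S] * y[t - 1] + rng.standard_normal()

# Pseudo-demodulation: divide each observation by the cumulated periodic
# coefficients of its season; the result is a random walk (with periodically
# varying innovation variance).
weights = np.cumprod(phi)
y_demod = y / weights[np.arange(T) % S]
```

    In a multivariate setting, the Johansen procedure (e.g. statsmodels' coint_johansen) can then be applied to a vector of such demodulated series to read off the cointegration rank.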
  5. By: Greta Goracci; Simone Giannerini; Kung-Sik Chan; Howell Tong
    Abstract: We present supremum Lagrange Multiplier tests to compare a linear ARMA specification against its threshold ARMA extension. We derive the asymptotic distribution of the test statistics both under the null hypothesis and under contiguous local alternatives. Moreover, we prove the consistency of the tests. The Monte Carlo study shows that the tests enjoy good finite-sample properties, are robust against model mis-specification, and their performance is not affected if the order of the model is unknown. The tests carry a low computational burden and do not suffer from some of the drawbacks that affect the quasi-likelihood ratio setting. Lastly, we apply our tests to a time series of standardized tree-ring growth indexes, which may open new avenues of research in climate studies.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.13977&r=all
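    A pared-down AR(1)-versus-TAR(1) version of the supremum Lagrange Multiplier scan conveys the idea (the trimming fraction and threshold grid are illustrative; the paper treats the full ARMA-versus-TARMA case with a proper LM statistic):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 400
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + rng.standard_normal()   # linear AR(1) null

# For each candidate threshold r, compare the one-regime fit against a fit
# that lets the AR coefficient differ across {y_{t-1} <= r} and {y_{t-1} > r},
# then take the supremum of the LM-type statistics over a trimmed grid.
x, yy = y[:-1], y[1:]
ssr0 = np.sum((yy - x * (x @ yy / (x @ x))) ** 2)   # restricted (linear) fit
stats = []
for r in np.quantile(x, np.linspace(0.15, 0.85, 29)):
    lo, hi = x <= r, x > r
    ssr1 = sum(
        np.sum((yy[m] - x[m] * (x[m] @ yy[m] / (x[m] @ x[m]))) ** 2)
        for m in (lo, hi)
    )
    stats.append(len(x) * (ssr0 - ssr1) / ssr1)     # LM-type statistic
sup_lm = max(stats)
```

    Under the null the supremum has a nonstandard distribution, which is why the paper derives the asymptotics explicitly rather than relying on pointwise chi-squared critical values.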
  6. By: Zhaoxing Gao; Ruey S. Tsay
    Abstract: This paper proposes a hierarchical approximate-factor approach to analyzing high-dimensional, large-scale heterogeneous time series data using distributed computing. The new method employs a multiple-fold dimension reduction procedure using Principal Component Analysis (PCA) and shows great promise for modeling large-scale data that cannot be stored or analyzed by a single machine. Each computer at the basic level performs a PCA to extract common factors among the time series assigned to it and transfers those factors to one and only one node of the second level. Each 2nd-level computer collects the common factors from its subordinates and performs another PCA to select the 2nd-level common factors. This process is repeated until the central server is reached, which collects common factors from its direct subordinates and performs a final PCA to select the global common factors. The noise terms of the 2nd-level approximate factor model are the unique common factors of the 1st-level clusters. We focus on the case of two levels in our theoretical derivations, but the idea can easily be generalized to any finite number of hierarchies. We discuss some clustering methods when the group memberships are unknown and introduce a new diffusion index approach to forecasting. We further extend the analysis to unit-root nonstationary time series. Asymptotic properties of the proposed method are derived as the dimension of the data in each computing unit and the sample size $T$ diverge. We use both simulated data and real examples to assess the performance of the proposed method in finite samples, and compare our method with commonly used alternatives in terms of the forecastability of the extracted factors.
    Date: 2021–03
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2103.14626&r=all
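    A two-level version of the divide-and-conquer scheme is easy to sketch (group sizes, factor counts, and loadings below are arbitrary illustrations, not the paper's design):

```python
import numpy as np

rng = np.random.default_rng(5)
T, n_groups, n_per = 300, 4, 50
global_f = rng.standard_normal(T)                 # one global common factor
panels = []
for g in range(n_groups):
    group_f = rng.standard_normal(T)              # group-specific factor
    load_g = rng.standard_normal(n_per)
    load_f = rng.standard_normal(n_per)
    noise = 0.5 * rng.standard_normal((T, n_per))
    panels.append(np.outer(global_f, load_f) + np.outer(group_f, load_g) + noise)

def pca_factors(x, k):
    """Top-k principal-component factors of a T x N panel (via SVD)."""
    x = x - x.mean(axis=0)
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :k] * s[:k]

# Level 1: each "machine" extracts its local factors and ships only those.
level1 = np.hstack([pca_factors(p, 2) for p in panels])
# Level 2: the server runs PCA on the collected factors for the global ones.
global_hat = pca_factors(level1, 1)[:, 0]

corr = abs(np.corrcoef(global_hat, global_f)[0, 1])
```

    Only the T x 2 factor matrices travel between machines, not the raw panels, which is what makes the scheme attractive when the data cannot sit on a single node.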
  7. By: Dake Li; Mikkel Plagborg-Møller; Christian K. Wolf
    Abstract: We conduct a simulation study of Local Projection (LP) and Vector Autoregression (VAR) estimators of structural impulse responses across thousands of data generating processes (DGPs), designed to mimic the properties of the universe of U.S. macroeconomic data. Our analysis considers various structural identification schemes and several variants of LP and VAR estimators, and we pay particular attention to the role of the researcher's loss function. A clear bias-variance trade-off emerges: Because our DGPs are not exactly finite-order VAR models, LPs have lower bias than VAR estimators; however, the variance of LPs is substantially higher than that of VARs at intermediate or long horizons. Unless researchers are overwhelmingly concerned with bias, shrinkage via Bayesian VARs or penalized LPs is attractive.
    Date: 2021–04
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2104.00655&r=all
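    A minimal local-projection estimator for a known AR(1) DGP, where the true impulse response at horizon h is 0.7**h (a stylized stand-in for the estimators compared in the paper; here the shock is observed directly rather than identified):

```python
import numpy as np

rng = np.random.default_rng(6)
T, H = 600, 8
eps = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.7 * y[t - 1] + eps[t]        # AR(1): true IRF is 0.7**h

# Local projections: one OLS regression per horizon h of y_{t+h} on the
# shock eps_t, controlling for the lag y_{t-1} and a constant.
irf = []
for h in range(H):
    lhs = y[1 + h:]
    X = np.column_stack([eps[1:T - h], y[:T - 1 - h], np.ones(T - 1 - h)])
    b = np.linalg.lstsq(X, lhs, rcond=None)[0]
    irf.append(b[0])                      # coefficient on the shock
irf = np.array(irf)
```

    Because each horizon gets its own regression, the LP estimate is not constrained by a finite-order VAR structure, which is the source of the lower bias but higher variance documented in the paper.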
  8. By: Lukas Boer; Helmut Lütkepohl
    Abstract: A major challenge for proxy vector autoregressive analysis is the construction of a suitable external instrumental variable, or proxy, for identifying a shock of interest. Some authors construct sophisticated proxies that account for the dating and size of the shock while other authors consider simpler versions that use only the dating and signs of particular shocks. It is shown that such qualitative (sign-)proxies can lead to impulse response estimates of the impact effects of the shock of interest that are nearly as efficient as or even more efficient than estimators based on more sophisticated quantitative proxies that also reflect the size of the shock. Moreover, the sign-proxies tend to provide more precise impulse response estimates than an approach based merely on the higher volatility of the shocks of interest on event dates.
    Keywords: GMM, heteroskedastic VAR, instrumental variable estimation, proxy VAR, structural vector autoregression
    JEL: C32
    Date: 2021
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1940&r=all
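    The contrast between quantitative and qualitative (sign) proxies can be sketched in a stripped-down IV setting (the event-selection rule, noise scales, and impact effect below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)
T, theta = 1000, 2.0
s = rng.standard_normal(T)                       # structural shock of interest
y1 = s + 0.5 * rng.standard_normal(T)            # reduced-form residual 1
y2 = theta * s + 0.5 * rng.standard_normal(T)    # reduced-form residual 2

event = np.abs(s) > 1.0                          # hypothetical "event dates"
quant_proxy = np.where(event, s + 0.3 * rng.standard_normal(T), 0.0)
sign_proxy = np.where(event, np.sign(s), 0.0)    # dating and sign only

# IV estimates of the relative impact effect theta with each proxy:
est_quant = (quant_proxy @ y2) / (quant_proxy @ y1)
est_sign = (sign_proxy @ y2) / (sign_proxy @ y1)
```

    Both proxies are valid instruments and deliver consistent estimates of the relative impact effect; the sign proxy discards the size information yet loses little precision, which is the paper's point.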
  9. By: Babel Raïssa Guemdjo Kamdem (IMSP - Institut de Mathématiques et de Sciences Physiques [Bénin] (Université d’Abomey-Calavi (UAC))); Jules Sadefo Kamdem (MRE - Montpellier Recherche en Economie - UM - Université de Montpellier); Carlos Ougouyandjou (IMSP - Institut de Mathématiques et de Sciences Physiques [Bénin] (Université d’Abomey-Calavi (UAC)))
    Abstract: This work introduces and characterizes so-called "random extended intervals": random intervals whose left bound may exceed the right one. To carry out this study, we endow the set of random extended intervals with a metric-space structure suited to studying extended interval-valued ARMA time series. This is done by extending the Hausdorff metric to extended intervals and defining a family of metrics dγ that is appropriate for random extended intervals and avoids some disadvantages of the Hausdorff metric. We show that there exists a unique metric dγ for which γ(t)dt is what we call an "adapted measure", and we use this metric to define the variability of random extended intervals.
    Keywords: Uncertainty Modeling, Stochastic Processes, Random Extended Interval, Time Series, ARMA, Hausdorff Metric
    Date: 2020–07
    URL: http://d.repec.org/n?u=RePEc:hal:wpaper:hal-03169516&r=all
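    One illustrative construction in the spirit of the dγ family (not the paper's exact definition): an L2 distance between the interval parametrizations I(t) = a + t(b - a), weighted by a density gamma on [0, 1]. Unlike the Hausdorff distance, it distinguishes [a, b] from its reversal [b, a]:

```python
import numpy as np

def d_gamma(i1, i2, gamma=lambda t: np.ones_like(t), grid=2001):
    """Weighted L2 distance between extended intervals (a, b), where a > b
    is allowed. A hypothetical member of the d_gamma family, for illustration."""
    (a1, b1), (a2, b2) = i1, i2
    t = np.linspace(0.0, 1.0, grid)
    diff = (a1 + t * (b1 - a1)) - (a2 + t * (b2 - a2))
    w = gamma(t)
    # Discretized weighted mean-square distance between parametrizations:
    return float(np.sqrt(np.mean(diff ** 2 * w) / np.mean(w)))

d_fwd = d_gamma((0.0, 1.0), (1.0, 0.0))   # [0,1] vs its reversal: positive
d_same = d_gamma((0.0, 1.0), (0.0, 1.0))  # identical intervals: zero
```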

This nep-ets issue is ©2021 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.