nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒06‒10
ten papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Inducing Sparsity and Shrinkage in Time-Varying Parameter Models By Huber, Florian; Koop, Gary; Onorante, Luca
  2. Score-Driven Models for Realized Volatility By Harvey, A.; Palumbo, D.
  3. Co-integration and common trends analysis with score-driven models: an application to US macroeconomic data By Licht, Adrian; Escribano Sáez, Álvaro; Blazsek, Szabolcs Istvan
  4. Forecasting Conditional Covariance Matrices in High-Dimensional Time Series: a General Dynamic Factor Approach By Marc Hallin; Luis K. Hotta; João H. G Mazzeu; Carlos Cesar Trucios-Maza; Pedro L. Valls Pereira; Mauricio Zevallos
  5. Long Memory Conditional Heteroscedasticity in Count Data By Mawuli Segnon; Manuel Stapper
  6. Local Asymptotic Equivalence of the Bai and Ng (2004) and Moon and Perron (2004) Frameworks for Panel Unit Root Testing By Oliver Wichert; I. Gaia Becheri; Feike C. Drost; Ramon van den Akker
  7. On Policy Evaluation with Aggregate Time-Series Shocks By Dmitry Arkhangelsky; Vasily Korovkin
  8. A computational algorithm to analyze unobserved sequential reactions of the central banks: Inference on complex lead-lag relationship in evolution of policy stances By Chakrabarti, Anindya S.; Kumar, Sudarshan
  9. Mildly Explosive Dynamics in U.S. Fixed Income Markets By Contessi, Silvio; De Pace, Pierangelo; Guidolin, Massimo
  10. Efficient Dynamic Yield Curve Estimation in Emerging Financial Markets By Makram El-Shagi; Lunan Jiang

  1. By: Huber, Florian (University of Salzburg); Koop, Gary (University of Strathclyde); Onorante, Luca (European Central Bank)
    Abstract: Time-varying parameter (TVP) models have the potential to be over-parameterized, particularly when the number of variables in the model is large. Global-local priors are increasingly used to induce shrinkage in such models. But the estimates produced by these priors can still have appreciable uncertainty. Sparsification has the potential to remove this uncertainty and improve forecasts. In this paper, we develop computationally simple methods which both shrink and sparsify TVP models. In a simulated data exercise we show the benefits of our shrink-then-sparsify approach in a variety of sparse and dense TVP regressions. In a macroeconomic forecasting exercise, we find that our approach substantially improves forecast performance relative to shrinkage alone.
    Keywords: Sparsity; shrinkage; hierarchical priors; time-varying parameter regression
    JEL: C11 C30 D31 E30
    Date: 2019–05–26
    URL: http://d.repec.org/n?u=RePEc:ris:sbgwpe:2019_002&r=all
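    A toy illustration of the shrink-then-sparsify idea on a static sparse regression (a hedged sketch, not the paper's code: ridge shrinkage and an adaptive soft-threshold stand in for the hierarchical priors and the sparsification step; every name and number is illustrative):
      import numpy as np

      rng = np.random.default_rng(0)
      n, k = 200, 20
      X = rng.standard_normal((n, k))
      beta_true = np.zeros(k)
      beta_true[:3] = [1.5, -1.0, 0.5]                   # sparse truth
      y = X @ beta_true + rng.standard_normal(n)

      # step 1: shrink -- a ridge estimate stands in for the posterior mean under a
      # global-local shrinkage prior
      lam = 5.0
      beta_shrunk = np.linalg.solve(X.T @ X + lam * np.eye(k), X.T @ y)

      # step 2: sparsify -- adaptive soft-threshold: small shrunk coefficients receive
      # a large penalty and are set exactly to zero, large ones are barely touched
      penalty = 0.05 / np.abs(beta_shrunk)
      beta_sparse = np.sign(beta_shrunk) * np.maximum(np.abs(beta_shrunk) - penalty, 0.0)

      print("nonzero coefficients:", np.flatnonzero(beta_sparse))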
  2. By: Harvey, A.; Palumbo, D.
    Abstract: This paper sets up a statistical framework for modeling realised volatility (RV) using a Dynamic Conditional Score (DCS) model. It first shows how a preliminary analysis of RV, based on fitting a linear Gaussian model to its logarithm, confirms the presence of long memory effects and suggests a two-component dynamic specification. It also indicates a weekly pattern in the data, and an analysis of squared residuals suggests the presence of heteroscedasticity. Furthermore, working with a Gaussian model in logarithms facilitates a comparison with the popular Heterogeneous Autoregression (HAR), which is a simple way of accounting for long memory in RV. Fitting the two-component specification with leverage and a day-of-the-week effect is then carried out directly on RV with a Generalised Beta of the second kind (GB2) conditional distribution. Estimating log RV with an Exponential Generalised Beta of the second kind (EGB2) distribution gives the same result. The EGB2 model is then fitted with heteroscedasticity and its forecasting performance compared with that of HAR. There is a small gain from using the DCS model. However, its main attraction is that it gives a comprehensive description of the properties of the data and yields multi-step forecasts of the conditional distribution of RV.
    Keywords: EGARCH, GB2 distribution, HAR model, heteroscedasticity, long memory, weekly volatility pattern
    Date: 2019–05–30
    URL: http://d.repec.org/n?u=RePEc:cam:camdae:1950&r=all
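    For readers unfamiliar with the HAR benchmark used for comparison, a minimal OLS sketch (illustrative only; rv stands in for a daily realized-volatility series, and the 5- and 22-day averages are the usual weekly and monthly HAR regressors):
      import numpy as np

      def har_design(rv):
          """Daily, weekly (5-day) and monthly (22-day) averages of realized volatility."""
          rv = np.asarray(rv, dtype=float)
          t = np.arange(22, len(rv) - 1)                 # need 22 lags and one lead
          daily = rv[t]
          weekly = np.array([rv[i - 4:i + 1].mean() for i in t])
          monthly = np.array([rv[i - 21:i + 1].mean() for i in t])
          X = np.column_stack([np.ones_like(daily), daily, weekly, monthly])
          return X, rv[t + 1]                            # one-step-ahead target

      rng = np.random.default_rng(1)
      rv = np.exp(np.cumsum(0.05 * rng.standard_normal(1000)))   # toy persistent positive series
      X, y = har_design(rv)
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("HAR coefficients (const, daily, weekly, monthly):", np.round(beta, 3))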
  3. By: Licht, Adrian; Escribano Sáez, Álvaro; Blazsek, Szabolcs Istvan
    Abstract: We study co-integration and common trends for time series variables by introducing a new nonlinear multivariate dynamic conditional score (DCS) model that is robust to outliers. The new model is named the t-QVARMA (quasi-vector autoregressive moving average) model, which is a score-driven location model for the multivariate t-distribution. Classical VAR models of co-integrated time series are estimated by using the outlier-sensitive vector error correction model (VECM) representation. In t-QVARMA, the I(0) and I(1) components of the variables are separated in a way that is similar to the Granger representation of VAR models. We show that a limiting special case of t-QVARMA, named Gaussian-QVARMA, is a Gaussian-VARMA specification with I(0) and I(1) components. For t-QVARMA, we present the reduced-form and structural-form representations and the impulse response function (IRF). As an application, we study the relationship between the federal funds rate and the United States (US) inflation rate for the period July 1954 to January 2019, since those variables are I(1) and co-integrated. We present the outlier-discounting property of t-QVARMA and compare the estimates of the t-QVARMA, Gaussian-QVARMA and Gaussian-VAR alternatives. We find that the statistical performance of t-QVARMA is superior to that of the classical Gaussian-VAR model.
    Keywords: Quasi-Vector Autoregressive Moving Average (QVARMA) Model; Common Trends; Cointegration; Robustness to Outliers; Multivariate Dynamic Conditional Score (DCS) Models
    Date: 2019–05–19
    URL: http://d.repec.org/n?u=RePEc:cte:werepe:28451&r=all
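    The t-QVARMA estimator itself is not available in standard software; as a sketch of the Gaussian benchmark it is compared against, the co-integration setup can be checked with statsmodels (simulated I(1) series below; with real data y1 and y2 would be the federal funds rate and the inflation rate):
      import numpy as np
      from statsmodels.tsa.stattools import coint
      from statsmodels.tsa.vector_ar.vecm import VECM

      rng = np.random.default_rng(2)
      T = 400
      common = np.cumsum(rng.standard_normal(T))         # shared stochastic trend
      y1 = common + rng.standard_normal(T)               # two co-integrated I(1) series
      y2 = 0.8 * common + rng.standard_normal(T)

      tstat, pvalue, _ = coint(y1, y2)                   # Engle-Granger co-integration test
      print(f"Engle-Granger t-stat {tstat:.2f}, p-value {pvalue:.3f}")

      vecm = VECM(np.column_stack([y1, y2]), k_ar_diff=1, coint_rank=1).fit()
      print(vecm.beta)                                   # estimated co-integrating vector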
  4. By: Marc Hallin; Luis K. Hotta; João H. G Mazzeu; Carlos Cesar Trucios-Maza; Pedro L. Valls Pereira; Mauricio Zevallos
    Abstract: Based on a General Dynamic Factor Model with an infinite-dimensional factor space, we develop new estimation and forecasting procedures for conditional covariance matrices in high-dimensional time series. The performance of our approach is evaluated via Monte Carlo experiments, in which it outperforms many alternative methods. The new procedure is used to construct minimum variance portfolios for a high-dimensional panel of assets, and is shown to achieve better out-of-sample portfolio performance than existing alternatives.
    Keywords: Dimension reduction, Large panels, High-dimensional time series, Minimum variance portfolio, Volatility, Multivariate GARCH
    Date: 2019–06
    URL: http://d.repec.org/n?u=RePEc:eca:wpaper:2013/288066&r=all
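    A drastically simplified stand-in for factor-based covariance forecasting (static PCA loadings plus univariate GARCH(1,1) on the factors via the arch package, whereas the paper extracts dynamic factors from an infinite-dimensional factor space; purely illustrative):
      import numpy as np
      from arch import arch_model

      rng = np.random.default_rng(3)
      T, N, r = 1000, 30, 2
      F = rng.standard_normal((T, r)) * np.array([2.0, 1.0])     # latent factors
      L = rng.standard_normal((N, r))                            # loadings
      returns = F @ L.T + 0.5 * rng.standard_normal((T, N))

      # static PCA in place of the paper's dynamic factor extraction
      eigval, eigvec = np.linalg.eigh(np.cov(returns, rowvar=False))
      load = eigvec[:, -r:]                                      # top-r eigenvectors
      factors = returns @ load
      idio = returns - factors @ load.T

      # one-step-ahead GARCH(1,1) variance forecast for each factor
      h = [arch_model(factors[:, j], vol="GARCH", p=1, q=1, mean="Zero")
           .fit(disp="off").forecast(horizon=1).variance.iloc[-1, 0] for j in range(r)]

      # conditional covariance forecast: factor part plus diagonal idiosyncratic part
      Sigma = load @ np.diag(h) @ load.T + np.diag(idio.var(axis=0))
      print(Sigma.shape)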
  5. By: Mawuli Segnon; Manuel Stapper
    Abstract: This paper introduces a new class of integer-valued long memory processes that are adaptations of the well-known FIGARCH(p, d, q) process of Baillie (1996) and HYGARCH(p, d, q) process of Davidson (2004) to a count data setting. We derive the statistical properties of the models and show that reasonable parameter estimates are easily obtained via conditional maximum likelihood estimation. An empirical application with financial transaction data illustrates the practical importance of the models.
    Keywords: Count Data, Poisson Autoregression, Fractionally Integrated, INGARCH
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:cqe:wpaper:8219&r=all
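    The short-memory Poisson INGARCH(1,1) that the paper's long-memory count models generalize can be simulated and fitted by conditional maximum likelihood in a few lines (illustrative sketch; parameter values are arbitrary):
      import numpy as np
      from scipy.optimize import minimize

      rng = np.random.default_rng(4)

      def simulate(T, omega, alpha, beta):
          y, lam = np.zeros(T, dtype=int), np.zeros(T)
          lam[0] = omega / (1.0 - alpha - beta)          # unconditional mean
          y[0] = rng.poisson(lam[0])
          for t in range(1, T):
              lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
              y[t] = rng.poisson(lam[t])
          return y

      def neg_loglik(params, y):
          omega, alpha, beta = params
          if omega <= 0 or alpha < 0 or beta < 0 or alpha + beta >= 1:
              return np.inf
          lam = np.empty(len(y))
          lam[0] = y.mean()
          for t in range(1, len(y)):
              lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
          return -np.sum(y * np.log(lam) - lam)          # Poisson log-likelihood (constants dropped)

      y = simulate(2000, omega=0.5, alpha=0.3, beta=0.5)
      fit = minimize(neg_loglik, x0=[0.2, 0.2, 0.2], args=(y,), method="Nelder-Mead")
      print("estimated (omega, alpha, beta):", np.round(fit.x, 2))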
  6. By: Oliver Wichert; I. Gaia Becheri; Feike C. Drost; Ramon van den Akker
    Abstract: This paper considers unit-root tests in large-n and large-T heterogeneous panels with cross-sectional dependence generated by unobserved factors. We reconsider the two prevalent approaches in the literature: that of Moon and Perron (2004) and the PANIC setup proposed in Bai and Ng (2004). While these have been treated as completely different setups, we show that, in the case of Gaussian innovations, the frameworks are asymptotically equivalent in the sense that both experiments are locally asymptotically normal (LAN) with the same central sequence. Using Le Cam's theory of statistical experiments, we determine the local asymptotic power envelope and derive an optimal test jointly in both setups. We show that the popular Moon and Perron (2004) and Bai and Ng (2010) tests only attain the power envelope when there is no heterogeneity in the long-run variance of the idiosyncratic components. The new test is asymptotically uniformly most powerful irrespective of possible heterogeneity. Moreover, it turns out that, for any test satisfying a mild regularity condition, size and local asymptotic power are the same under both data-generating processes. Thus, applied researchers do not need to decide on one of the two frameworks to conduct unit root tests. Monte Carlo simulations corroborate our asymptotic results and document significant gains in finite-sample power when the variances of the idiosyncratic shocks differ substantially across cross-sectional units.
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.11184&r=all
  7. By: Dmitry Arkhangelsky; Vasily Korovkin
    Abstract: In this paper we construct a parsimonious causal model that addresses multiple issues researchers face when trying to use aggregate time-series shocks for policy evaluation: (a) potential unobserved aggregate confounders, (b) availability of various unit-level characteristics, and (c) time- and unit-level heterogeneity in treatment effects. We develop a new estimation algorithm that uses insights from the treatment effects, panel data, and time-series literatures. We construct a variance estimator that is robust to arbitrary clustering patterns across geographical units. We achieve this by considering a finite-population framework in which potential outcomes are treated as fixed and all randomness comes from the exogenous shocks. Finally, we illustrate our approach using data from the study of the causal relationship between foreign aid and conflict in Nunn and Qian [2014].
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:arx:papers:1905.13660&r=all
  8. By: Chakrabarti, Anindya S.; Kumar, Sudarshan
    Abstract: Central banks of different countries are among the largest economic players on the global scale, and they are not static in their monetary policy stances: they change their policies substantially over time in response to idiosyncratic or global factors affecting their economies. A prominent and empirically documented feature arising from central banks' actions is that the relative importance assigned to inflation vis-a-vis output fluctuations evolves substantially over time. We analyze the leading and lagging behavior of central banks of various countries in terms of adopting a low-inflation environment vis-a-vis assigning a high weight to counteracting output fluctuations, in a completely data-driven way. To this end, we propose a new methodology that combines complex Hilbert principal component analysis (CHPCA) with state-space models in the form of the Kalman filter. The CHPCA mechanism is non-parametric and provides a clean identification of leading and lagging behavior in terms of the phase differences of time series in the complex plane. We show that the methodology is useful for characterizing the extent of coordination (or lack thereof) in the monetary policy stances taken by central banks across a cross-section of developed and developing countries. In particular, the analysis suggests that the US Fed led other countries' central banks in the pre-crisis period in terms of pursuing low-inflation regimes.
    Date: 2019–06–03
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:14608&r=all
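    The lead-lag extraction behind CHPCA can be sketched in a few lines (analytic signals via the Hilbert transform, eigen-decomposition of their complex correlation matrix, and phase differences on the leading eigenvector; the paper additionally runs the output through a Kalman filter, which is omitted here, and the series below are synthetic):
      import numpy as np
      from scipy.signal import hilbert

      rng = np.random.default_rng(5)
      base = np.sin(np.linspace(0, 20 * np.pi, 500))
      # three series: one leads, one is contemporaneous, one lags
      X = np.column_stack([np.roll(base, -5), base, np.roll(base, 5)])
      X += 0.1 * rng.standard_normal(X.shape)

      analytic = hilbert(X, axis=0)                      # complex analytic signals
      Z = (analytic - analytic.mean(axis=0)) / analytic.std(axis=0)
      C = Z.conj().T @ Z / len(Z)                        # complex (Hermitian) correlation matrix

      eigval, eigvec = np.linalg.eigh(C)                 # Hermitian eigen-decomposition
      phase = np.angle(eigvec[:, -1])                    # leading eigenvector's phases
      print("phases relative to series 2 (radians):", np.round(phase - phase[1], 2))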
  9. By: Contessi, Silvio (Department of Economics, Pomona College); De Pace, Pierangelo (Department of Economics, Pomona College); Guidolin, Massimo (Department of Economics, Pomona College)
    Abstract: We use a recently developed right-tail variation of the Augmented Dickey-Fuller unit root test to identify and date-stamp periods of mildly explosive behavior in the weekly time series of eight U.S. fixed income yield spreads between September 2002 and April 2018. We find statistically significant evidence of mildly explosive dynamics in six of these spreads, two of which are short/medium-term mortgage-related spreads. We show that the time intervals characterized by instability that we estimate from these yield spreads capture known episodes of financial and economic distress in the U.S. economy. Mild explosiveness migrates from short-term funding markets to medium- and long-term markets during the Great Financial Crisis of 2007-09. Furthermore, we statistically validate the conjecture, originally suggested by Gorton (2009a,b), that the initial panic of 2007 migrated from segments of the ABX market to other U.S. fixed income markets in the early phases of the financial crisis.
    Keywords: finance, investment analysis, fixed income markets, yield spreads, mildly explosive behavior
    Date: 2019–02–04
    URL: http://d.repec.org/n?u=RePEc:clm:pomwps:1001&r=all
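    The right-tail detection strategy can be illustrated with a sup-ADF (SADF) statistic over forward-expanding windows; right-tail critical values must be simulated and are omitted, so the sketch below (which assumes statsmodels) only shows how the statistic is computed:
      import numpy as np
      from statsmodels.tsa.stattools import adfuller

      def sadf(y, min_window=50):
          """Sup of expanding-window ADF t-statistics (right-tailed test statistic)."""
          stats = [adfuller(y[:end], maxlag=1, regression="c", autolag=None)[0]
                   for end in range(min_window, len(y) + 1)]
          return max(stats)

      # toy series: unit root that turns mildly explosive in the last quarter of the sample
      rng = np.random.default_rng(6)
      e = rng.standard_normal(400)
      y = np.empty(400)
      y[0] = e[0]
      for t in range(1, 400):
          rho = 1.02 if t >= 300 else 1.0
          y[t] = rho * y[t - 1] + e[t]

      print("SADF statistic:", round(sadf(y), 2))        # compare with simulated right-tail critical values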
  10. By: Makram El-Shagi (Center for Financial Development and Stability at Henan University, and School of Economics at Henan University, Kaifeng, Henan); Lunan Jiang (Center for Financial Development and Stability at Henan University, and School of Economics at Henan University, Kaifeng, Henan)
    Abstract: The current state-of-the-art estimation of yield curves relies on the dynamic state space version of the Nelson and Siegel (1987) model proposed in the seminal paper by Diebold et al. (2006). However, things become difficult when applying their approach to emerging economies, where bonds are issued less frequently and fewer maturities are available, so the traditional state space representation, which requires a dense and fixed grid of maturities, may not be feasible. One remedy is to use the traditional Nelson and Siegel (1987) OLS estimation instead, though this sacrifices efficiency by ignoring the time dimension. We propose a simple augmentation of the Diebold et al. (2006) framework that is more efficient than OLS estimation, as it exploits information from all available bonds and the time dependence of yields. We demonstrate the efficiency gains generated by our method in five case studies of major emerging economies, including four of the BRICS.
    Keywords: Yield curve, dynamic modeling, state space model, efficiency, BRICS
    JEL: E52 E43
    Date: 2019–05
    URL: http://d.repec.org/n?u=RePEc:fds:dpaper:201904&r=all
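    For reference, the static Nelson and Siegel (1987) cross-sectional fit that the authors contrast with their state-space approach reduces to OLS once the decay parameter is fixed; a sketch for a sparse maturity grid (all numbers illustrative, lambda set to the common Diebold-Li value for maturities in months):
      import numpy as np

      def ns_loadings(tau, lam=0.0609):
          """Nelson-Siegel loadings (level, slope, curvature) at maturities tau in months."""
          x = lam * np.asarray(tau, dtype=float)
          slope = (1 - np.exp(-x)) / x
          return np.column_stack([np.ones_like(x), slope, slope - np.exp(-x)])

      maturities = np.array([3, 12, 36, 84, 120])        # sparse grid of observed maturities
      yields = np.array([2.1, 2.4, 2.9, 3.4, 3.6])       # illustrative yields in percent

      X = ns_loadings(maturities)
      beta, *_ = np.linalg.lstsq(X, yields, rcond=None)  # level, slope, curvature factors
      print("fitted factors:", np.round(beta, 3))
      print("fitted 60-month yield:", round((ns_loadings([60]) @ beta).item(), 3))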

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.