nep-ets New Economics Papers
on Econometric Time Series
Issue of 2020‒07‒20
sixteen papers chosen by
Jaqueson K. Galimberti
Auckland University of Technology

  1. Periodic autoregressive conditional duration By Aknouche, Abdelhakim; Almohaimeed, Bader; Dimitrakopoulos, Stefanos
  2. Does the Current State of the Business Cycle matter for Real-Time Forecasting? A Mixed-Frequency Threshold VAR approach. By Heinrich, Markus
  3. Flexible Mixture Priors for Time-varying Parameter Models By Niko Hauzenberger
  4. Hidden Markov Models Applied To Intraday Momentum Trading With Side Information By Hugh Christensen; Simon Godsill; Richard Turner
  5. Forecasting transaction counts with integer-valued GARCH models By Aknouche, Abdelhakim; Almohaimeed, Bader; Dimitrakopoulos, Stefanos
  6. Heteroskedastic Proxy Vector Autoregressions By Helmut Lütkepohl; Thore Schlaak
  7. Nowcasting Unemployment Insurance Claims in the Time of COVID-19 By William D. Larson; Tara M. Sinclair
  8. Hypothesis tests with a repeatedly singular information matrix By Amengual, Dante; Bei, Xinyue; Sentana, Enrique
  9. Biases in Long-Horizon Predictive Regressions By Jacob Boudoukh; Ronen Israel; Matthew P. Richardson
  10. Cointegration in large VARs By Anna Bykhovskaya; Vadim Gorin
  11. Time series copula models using d-vines and v-transforms: an alternative to GARCH modelling By Martin Bladt; Alexander J. McNeil
  12. Estimation of High-Dimensional Dynamic Conditional Precision Matrices with an Application to Forecast Combination By Tae-Hwy Lee; Millie Yi Mao; Aman Ullah
  13. Sparse HP Filter: Finding Kinks in the COVID-19 Contact Rate By Sokbae Lee; Yuan Liao; Myung Hwan Seo; Youngki Shin
  14. Fractional Differencing: (In)stability of Spectral Structure and Risk Measures of Financial Networks By Chakrabarti, Arnab; Chakrabarti, Anindya S.
  15. Does Hamilton’s OLS regression provide a “better alternative” to the Hodrick-Prescott filter? A New Zealand Business Cycle Perspective By Hall, Viv B; Thomson, Peter
  16. Unified Theory for the Large Family of Time Varying Models with Arma Representations: One Solution Fits All. By Karanasos, Menelaos; Paraskevopoulos,Alexandros; Canepa, Alessandra

  1. By: Aknouche, Abdelhakim; Almohaimeed, Bader; Dimitrakopoulos, Stefanos
    Abstract: We propose an autoregressive conditional duration (ACD) model with periodic time-varying parameters and multiplicative error form. We name this model periodic autoregressive conditional duration (PACD). First, we study its stability properties and moment structure. Second, we estimate the model parameters, using (profile and two-stage) Gamma quasi-maximum likelihood estimates (QMLEs), the asymptotic properties of which are examined under general regularity conditions. Our estimation method encompasses the exponential QMLE as a particular case. The proposed methodology is illustrated with simulated data and two empirical applications on forecasting Bitcoin trading volume and realized volatility. We find that the PACD produces better in-sample and out-of-sample forecasts than the standard ACD.
    Keywords: Positive time series, autoregressive conditional duration, periodic time-varying models, multiplicative error models, exponential QMLE, two-stage Gamma QMLE.
    JEL: C13 C18 C4 C41 C5 C51 C58
    Date: 2020–07–08
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:101696&r=all
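    As a companion illustration (not from the paper), the following Python snippet simulates a periodic ACD process in multiplicative error form, with an assumed period of 5 and Gamma innovations of unit mean; all parameter values are illustrative.
      import numpy as np

      rng = np.random.default_rng(0)
      S = 5                                    # assumed period length
      omega = np.array([0.10, 0.20, 0.15, 0.10, 0.30])
      alpha = np.array([0.10, 0.15, 0.05, 0.20, 0.10])
      beta  = np.array([0.70, 0.60, 0.75, 0.55, 0.65])
      shape = 2.0                              # Gamma shape; innovations have mean one

      T = 1000
      x, psi = np.empty(T), np.empty(T)
      psi[0] = x[0] = omega[0]
      for t in range(1, T):
          s = t % S                            # position within the period
          psi[t] = omega[s] + alpha[s] * x[t - 1] + beta[s] * psi[t - 1]
          x[t] = psi[t] * rng.gamma(shape, 1.0 / shape)   # multiplicative error form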
  2. By: Heinrich, Markus
    Abstract: Macroeconomic forecasting in recessions is not easy, owing to the inherent asymmetry of business cycle phases and the increased uncertainty about the future path of the teetering economy. I propose a mixed-frequency threshold vector autoregressive model with common stochastic volatility in mean (MF-T-CSVM-VAR) that makes it possible to condition on the current state of the business cycle and to account for time-varying macroeconomic uncertainty, in the form of common stochastic volatility, in a mixed-frequency setting. A real-time forecasting experiment highlights the advantage of including the threshold feature for the asymmetry, as well as the common stochastic volatility in mean, in MF-VARs of different size for US GDP, inflation and unemployment. The novel mixed-frequency threshold model delivers better short-term point and density forecasts for GDP and unemployment, which is particularly evident for nowcasts during recessions. In fact, it delivers a better nowcast than the US Survey of Professional Forecasters for the sharp drop in GDP during the Great Recession in 2008Q4.
    Keywords: Threshold VAR,Stochastic Volatility,Forecasting,Mixed-frequency Models,Business Cycle,Bayesian Methods
    JEL: C11 C32 C34 C53 E32
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:zbw:esprep:219312&r=all
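    A stripped-down sketch of the threshold mechanism only (no mixed frequencies, no common stochastic volatility in mean): a bivariate threshold VAR(1) whose coefficients switch when a lagged indicator falls below an assumed threshold; all numbers are illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      A_exp = np.array([[0.5, 0.1], [0.0, 0.6]])   # expansion-regime coefficients
      A_rec = np.array([[0.2, 0.4], [0.3, 0.2]])   # recession-regime coefficients
      tau = -0.5                                   # assumed threshold value

      T = 300
      y = np.zeros((T, 2))
      for t in range(1, T):
          A = A_rec if y[t - 1, 0] < tau else A_exp   # regime set by lagged first variable
          y[t] = A @ y[t - 1] + rng.normal(scale=0.5, size=2)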
  3. By: Niko Hauzenberger
    Abstract: Time-varying parameter (TVP) models often assume that the TVPs evolve according to a random walk. This assumption, however, might be questionable since it implies that coefficients change smoothly and in an unbounded manner. In this paper, we relax this assumption by proposing a flexible law of motion for the TVPs in large-scale vector autoregressions (VARs). Instead of imposing a restrictive random walk evolution of the latent states, we carefully design hierarchical mixture priors on the coefficients in the state equation. These priors effectively allow for discriminating between periods where coefficients evolve according to a random walk and times where the TVPs are better characterized by a stationary stochastic process. Moreover, this approach is capable of introducing dynamic sparsity by pushing small parameter changes towards zero if necessary. The merits of the model are illustrated by means of two applications. Using synthetic data we show that our approach yields precise parameter estimates. When applied to US data, the model reveals interesting patterns of low-frequency dynamics in coefficients and forecasts well relative to a wide range of competing models.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.10088&r=all
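    A crude illustration of the idea (not the paper's hierarchical prior): a single time-varying coefficient whose law of motion mixes, period by period, a random-walk step with a stationary mean-reverting step, so the path alternates between drifting and essentially stable episodes; probabilities and variances are assumed.
      import numpy as np

      rng = np.random.default_rng(2)
      T = 500
      p_walk = 0.2                     # prob. of a genuine random-walk step (assumed)
      sd_walk = 0.10                   # innovation s.d. in the random-walk regime
      rho, sd_stat = 0.5, 0.02         # stationary (mean-reverting) alternative

      beta = np.zeros(T)
      for t in range(1, T):
          if rng.uniform() < p_walk:
              beta[t] = beta[t - 1] + rng.normal(scale=sd_walk)        # random-walk evolution
          else:
              beta[t] = rho * beta[t - 1] + rng.normal(scale=sd_stat)  # stationary evolution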
  4. By: Hugh Christensen; Simon Godsill; Richard Turner
    Abstract: A Hidden Markov Model for intraday momentum trading is presented which specifies a latent momentum state responsible for generating the observed securities' noisy returns. Existing momentum trading models suffer from time-lagging caused by the delayed frequency response of digital filters. Time-lagging results in a momentum signal of the wrong sign when the market changes trend direction. A key feature of this state-space formulation is that no such lagging occurs, allowing for accurate shifts in signal sign at market change points. The number of latent states in the model is estimated using three techniques: cross-validation, penalized likelihood criteria and simulation-based model selection for the marginal likelihood. All three techniques suggest either 2 or 3 hidden states. Model parameters are then found using Baum-Welch and Markov Chain Monte Carlo, whilst assuming a single (discretized) univariate Gaussian distribution for the emission matrix. Often a momentum trader will want to condition their trading signals on additional information. To reflect this, learning is also carried out in the presence of side information. Two sets of side information are considered, namely a ratio of realized volatilities and intraday seasonality. It is shown that splines can be used to capture statistically significant relationships from this information, allowing returns to be predicted. An Input Output Hidden Markov Model is used to incorporate these univariate predictive signals into the transition matrix, presenting a possible solution for dealing with the signal combination problem. Bayesian inference is then carried out to predict the security's $t+1$ return using the forward algorithm. Simple modifications to the current framework allow for a fully non-parametric model with asynchronous prediction.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.08307&r=all
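    A minimal sketch of the forward algorithm for a two-state Gaussian HMM, which returns filtered probabilities of the latent momentum state given returns up to time t; the transition matrix, means and variances below are assumed values, not estimates from the paper.
      import numpy as np
      from scipy.stats import norm

      def forward_filter(y, A, mu, sigma, pi0):
          T, K = len(y), len(mu)
          dens = norm.pdf(y[:, None], loc=mu, scale=sigma)   # emission densities
          alpha = np.zeros((T, K))
          alpha[0] = pi0 * dens[0]
          alpha[0] /= alpha[0].sum()
          for t in range(1, T):
              alpha[t] = dens[t] * (alpha[t - 1] @ A)        # predict, then update
              alpha[t] /= alpha[t].sum()                     # normalise to probabilities
          return alpha

      A = np.array([[0.95, 0.05], [0.05, 0.95]])   # persistent momentum states (assumed)
      mu = np.array([-0.001, 0.001])               # negative / positive drift states
      sigma = np.array([0.01, 0.01])
      y = np.random.default_rng(3).normal(0.0005, 0.01, size=500)   # toy return series
      probs = forward_filter(y, A, mu, sigma, pi0=np.array([0.5, 0.5]))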
  5. By: Aknouche, Abdelhakim; Almohaimeed, Bader; Dimitrakopoulos, Stefanos
    Abstract: Using transaction data on the number of stock trades, we conduct a forecasting exercise with INGARCH models governed by various conditional distributions. The model parameters are estimated with efficient Markov Chain Monte Carlo methods, while forecast evaluation is done by calculating point and density forecasts.
    Keywords: Count time series, INGARCH models, MCMC, Forecasting comparison
    JEL: C1 C11 C15 C18 C4 C58
    Date: 2020–07–11
    URL: http://d.repec.org/n?u=RePEc:pra:mprapa:101779&r=all
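    A minimal sketch of a Poisson INGARCH(1,1) data-generating process for transaction counts; the one-step-ahead point forecast is simply the final conditional mean. Parameter values are illustrative, and the paper also considers other conditional distributions.
      import numpy as np

      rng = np.random.default_rng(4)
      omega, alpha, beta = 1.0, 0.3, 0.5       # require alpha + beta < 1 for stationarity
      T = 1000
      lam = np.empty(T + 1)
      y = np.empty(T, dtype=int)
      lam[0] = omega / (1 - alpha - beta)      # start at the unconditional mean
      for t in range(T):
          y[t] = rng.poisson(lam[t])
          lam[t + 1] = omega + alpha * y[t] + beta * lam[t]
      point_forecast = lam[T]                  # one-step-ahead point forecast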
  6. By: Helmut Lütkepohl; Thore Schlaak
    Abstract: In proxy vector autoregressive models, the structural shocks of interest are identified by an instrument. Although heteroskedasticity is occasionally allowed for, it is typically taken for granted that the impact effects of the structural shocks are time-invariant despite the change in their variances. We develop a test for this implicit assumption and present evidence that the assumption of time-invariant impact effects may be violated in previously used empirical models.
    Keywords: Structural vector autoregression, proxy VAR, identification through heteroskedasticity
    JEL: C32
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:diw:diwwpp:dp1876&r=all
  7. By: William D. Larson (Federal Housing Finance Agency); Tara M. Sinclair (The George Washington University)
    Abstract: Near-term forecasts, also called nowcasts, are most challenging but also most important when the economy experiences an abrupt change. In this paper, we explore the performance of models with different information sets and data structures in order to best nowcast US initial unemployment claims in the spring of 2020, in the midst of the COVID-19 pandemic. We show that the best model, particularly near the structural break in claims, is a state-level panel model that includes dummy variables to capture the variation in the timing of state-of-emergency declarations. Autoregressive models perform poorly at first but catch up relatively quickly. Models including Google Trends are outperformed by alternative models in nearly all periods. Our results suggest that in times of structural change there may be simple approaches to exploit relevant information in the cross-sectional dimension to improve forecasts.
    Keywords: panel forecasting, time series forecasting, forecast evaluation, structural breaks, Google Trends
    JEL: C53 E24 E27 J64 R23
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:gwc:wpaper:2020-004&r=all
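    A toy sketch of the panel idea on synthetic data: state-level claims are regressed on their lag and a state-of-emergency dummy whose timing varies by state, and the fitted state values are aggregated to a national nowcast. Variable names and the pooled-OLS specification are assumptions for illustration only.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      states, weeks = 50, 20
      df = pd.DataFrame({"state": np.repeat(np.arange(states), weeks),
                         "week": np.tile(np.arange(weeks), states)})
      emergency_week = rng.integers(5, 15, size=states)    # week each state declares an emergency
      df["emergency"] = (df["week"].to_numpy() >= emergency_week[df["state"].to_numpy()]).astype(int)
      df["lag_claims"] = rng.normal(10, 1, size=len(df))
      df["claims"] = 0.8 * df["lag_claims"] + 2.0 * df["emergency"] + rng.normal(size=len(df))

      X = sm.add_constant(df[["lag_claims", "emergency"]])
      fit = sm.OLS(df["claims"], X).fit()
      fitted = pd.Series(np.asarray(fit.predict(X)), index=df.index)
      national_nowcast = fitted.groupby(df["week"]).sum()   # sum state fits to a US total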
  8. By: Amengual, Dante; Bei, Xinyue; Sentana, Enrique
    Abstract: We study score-type tests in likelihood contexts in which the nullity of the information matrix under the null is larger than one, thereby generalizing earlier results in the literature. Examples include multivariate skew normal distributions, Hermite expansions of Gaussian copulas, purely non-linear predictive regressions, multiplicative seasonal time series models and multivariate regression models with selectivity. Our proposal, which involves higher order derivatives, is asymptotically equivalent to the likelihood ratio but only requires estimation under the null. We conduct extensive Monte Carlo exercises that study the finite sample size and power properties of our proposal and compare it to alternative approaches.
    Keywords: Generalized extremum tests; Higher-order identifiability; Likelihood ratio test; Non-Gaussian copulas; Predictive regressions; Skew normal distributions
    JEL: C12 C22 C34 C46 C58
    Date: 2020–02
    URL: http://d.repec.org/n?u=RePEc:cpr:ceprdp:14415&r=all
  9. By: Jacob Boudoukh; Ronen Israel; Matthew P. Richardson
    Abstract: Analogous to Stambaugh (1999), this paper derives the small-sample bias of estimators in J-horizon predictive regressions, providing a plug-in adjustment for these estimators. A number of surprising results emerge, including (i) a higher bias for overlapping than for nonoverlapping regressions despite the greater number of observations, and (ii) a particularly large bias for an alternative long-horizon predictive regression commonly advocated in the literature. For large J, the bias is linear in (J/T) with a slope that depends on the predictive variable’s persistence. The bias adjustment substantially reduces the magnitude of existing long-horizon estimates of predictability.
    JEL: C01 C1 C22 C53 C58 G12 G17
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:27410&r=all
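    A minimal sketch of the overlapping long-horizon regression in which the bias arises: cumulative J-period returns are regressed on a persistent lagged predictor under a DGP with no true predictability, so any nonzero slope in small samples reflects the bias discussed above. All DGP parameters are illustrative.
      import numpy as np

      rng = np.random.default_rng(6)
      T, J = 600, 12
      rho = 0.98                                    # persistent predictor
      x = np.zeros(T)
      for t in range(1, T):
          x[t] = rho * x[t - 1] + rng.normal(scale=0.02)
      r = rng.normal(scale=0.05, size=T - 1)        # returns: no true predictability

      # overlapping J-period cumulative returns on the lagged predictor
      y = np.array([r[t:t + J].sum() for t in range(T - 1 - J)])
      z = x[:T - 1 - J]
      X = np.column_stack([np.ones_like(z), z])
      b_hat = np.linalg.lstsq(X, y, rcond=None)[0][1]   # OLS slope, biased in small samples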
  10. By: Anna Bykhovskaya; Vadim Gorin
    Abstract: The paper analyses cointegration in vector autoregressive processes (VARs) for the cases when both the number of coordinates, $N$, and the number of time periods, $T$, are large and of the same order. We propose a way to examine a VAR for the presence of cointegration based on a modification of the Johansen likelihood ratio test. The advantage of our procedure over the original Johansen test and its finite sample corrections is that our test does not suffer from over-rejection. This is achieved through novel asymptotic theorems for eigenvalues of matrices in the test statistic in the regime of proportionally growing $N$ and $T$. Our theoretical findings are supported by Monte Carlo simulations and an empirical illustration. Moreover, we find a surprising connection with multivariate analysis of variance (MANOVA) and explain why it emerges.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.14179&r=all
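    For reference, the classical Johansen trace test that the paper modifies can be run in statsmodels as sketched below on a simulated cointegrated pair; in the proportionally large-N, large-T regime studied in the paper these standard critical values over-reject, which is what the proposed correction addresses. Treat the snippet as illustrative.
      import numpy as np
      from statsmodels.tsa.vector_ar.vecm import coint_johansen

      rng = np.random.default_rng(11)
      T = 500
      common = np.cumsum(rng.normal(size=T))               # shared stochastic trend
      y = np.column_stack([common + rng.normal(size=T),
                           common + rng.normal(size=T)])   # two cointegrated series

      res = coint_johansen(y, det_order=0, k_ar_diff=1)
      print(res.lr1)   # trace statistics for rank 0 and rank <= 1
      print(res.cvt)   # 90/95/99% critical values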
  11. By: Martin Bladt; Alexander J. McNeil
    Abstract: An approach to modelling volatile financial return series using d-vine copulas combined with uniformity preserving transformations known as v-transforms is proposed. By generalizing the concept of stochastic inversion of v-transforms, models are obtained that can describe both stochastic volatility in the magnitude of price movements and serial correlation in their directions. In combination with parametric marginal distributions it is shown that these models can rival and sometimes outperform well-known models in the extended GARCH family.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.11088&r=all
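    A minimal sketch of the simplest symmetric v-transform, V(u) = |2u - 1|: applied to the probability-integral transform of returns it maps both large losses and large gains towards one, producing a volatility-proxy series to which a d-vine copula can then be fitted. The empirical PIT below is used purely for illustration.
      import numpy as np
      from scipy.stats import rankdata

      r = np.random.default_rng(7).standard_t(df=5, size=1000) * 0.01   # toy return series
      u = rankdata(r) / (len(r) + 1)     # empirical probability-integral transform
      v = np.abs(2 * u - 1)              # simplest symmetric v-transform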
  12. By: Tae-Hwy Lee (Department of Economics, University of California Riverside); Millie Yi Mao (Azusa Pacific University); Aman Ullah (University of California, Riverside)
    Abstract: The estimation of a large covariance matrix is challenging when the dimension p is large relative to the sample size n. Common approaches to deal with the challenge have been based on thresholding or shrinkage methods in estimating covariance matrices. However, in many applications (e.g., regression, forecast combination, portfolio selection), what we need is not the covariance matrix but its inverse (the precision matrix). In this paper we introduce a method of estimating high-dimensional "dynamic conditional precision" (DCP) matrices. The proposed DCP algorithm is based on the estimator of a large unconditional precision matrix by Fan and Lv (2016), to deal with the high dimensionality, and the dynamic conditional correlation (DCC) model by Engle (2002), to embed a dynamic structure in the conditional precision matrix. The simulation results show that the DCP method performs substantially better than methods of estimating covariance matrices based on thresholding or shrinkage. Finally, inspired by Hsiao and Wan (2014), we examine the "forecast combination puzzle" using the DCP, thresholding, and shrinkage methods.
    Keywords: High-dimensional conditional precision matrix, ISEE, DCP, Forecast combination puzzle.
    JEL: C3 C4 C5
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:ucr:wpaper:202012&r=all
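    A minimal sketch of how a precision matrix enters forecast combination: given a forecast-error covariance matrix (whose inverse is what the DCP method estimates), the optimal combination weights are proportional to the row sums of the precision matrix. The numbers are illustrative and the dynamic, high-dimensional estimation step is not reproduced.
      import numpy as np

      Sigma = np.array([[1.0, 0.3, 0.2],
                        [0.3, 1.5, 0.4],
                        [0.2, 0.4, 2.0]])      # illustrative forecast-error covariance
      P = np.linalg.inv(Sigma)                 # precision matrix
      ones = np.ones(P.shape[0])
      w = P @ ones / (ones @ P @ ones)         # combination weights, summing to one
      combined = w @ np.array([1.9, 2.1, 2.4]) # combine three individual forecasts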
  13. By: Sokbae Lee; Yuan Liao; Myung Hwan Seo; Youngki Shin
    Abstract: In this paper, we estimate the time-varying COVID-19 contact rate of a Susceptible-Infected-Recovered (SIR) model. Our measurement of the contact rate is constructed using data on actively infected, recovered and deceased cases. We propose a new trend filtering method that is a variant of the Hodrick-Prescott (HP) filter, constrained by the number of possible kinks. We term it the \emph{sparse HP filter} and apply it to daily data from five countries: Canada, China, South Korea, the UK and the US. Our new method yields kinks that are well aligned with actual events in each country. We find that the sparse HP filter produces fewer kinks than the $\ell_1$ trend filter, while both methods fit the data equally well. Theoretically, we establish risk consistency of both the sparse HP and $\ell_1$ trend filters. Ultimately, we propose to use time-varying $\textit{contact growth rates}$ to document and monitor outbreaks of COVID-19.
    Date: 2020–06
    URL: http://d.repec.org/n?u=RePEc:arx:papers:2006.10555&r=all
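    For orientation, the standard HP filter that the sparse HP filter modifies can be written as a penalized least-squares problem and solved in a few lines; the paper's variant replaces the squared second-difference penalty with a constraint on the number of kinks, which is not reproduced here. The smoothing value is an assumed one.
      import numpy as np

      def hp_filter(y, lam=1600.0):
          T = len(y)
          D = np.zeros((T - 2, T))
          for i in range(T - 2):
              D[i, i:i + 3] = [1.0, -2.0, 1.0]           # second-difference operator
          trend = np.linalg.solve(np.eye(T) + lam * D.T @ D, y)
          return trend, y - trend                        # trend and cyclical component

      y = np.cumsum(np.random.default_rng(8).normal(size=200))   # toy series
      trend, cycle = hp_filter(y)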
  14. By: Chakrabarti, Arnab; Chakrabarti, Anindya S.
    Abstract: Computation of spectral structure and risk measures from networks of multivariate financial time series data has been at the forefront of the statistical finance literature for a long time. A standard mode of analysis is to consider log returns from the equity price data, which is akin to taking the first difference ($d = 1$) of the log of the price data. Sometimes authors have considered simple growth rates as well. Either way, the idea is to get rid of the nonstationarity induced by the {\it unit root} of the data generating process. However, it has also been noted in the literature that often the individual time series might have a root which is more or less than unity in magnitude. Thus first differencing leads to under-differencing in many cases and over-differencing in others. In this paper, we study how correcting for the order of differencing leads to altered filtering and risk computation on inferred networks. In summary, our results are: (a) filtering methods with extreme information loss, like the minimum spanning tree, as well as filtering with moderate information loss, like the triangulated maximally filtered graph, are very susceptible to such d-corrections, (b) the spectral structure of the correlation matrix is quite stable, although the d-corrected market mode almost always dominates the uncorrected (d = 1) market mode, indicating under-estimation in the standard analysis, and (c) the PageRank-based risk measure constructed from Granger-causal networks shows an inverted U-shape evolution in the relationship between d-corrected and uncorrected return data over the period of analysis, 1972-2018, for historical data of NASDAQ.
    Date: 2020–07–08
    URL: http://d.repec.org/n?u=RePEc:iim:iimawp:14629&r=all
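    A minimal sketch of fractional differencing: the weights of (1 - L)^d follow a simple recursion and are applied to a log-price series; d = 1 recovers ordinary first differencing, and the value of d used below is purely illustrative.
      import numpy as np

      def frac_diff(x, d, n_weights=100):
          w = np.empty(n_weights)
          w[0] = 1.0
          for k in range(1, n_weights):
              w[k] = -w[k - 1] * (d - k + 1) / k          # binomial expansion of (1 - L)^d
          out = np.full(len(x), np.nan)
          for t in range(n_weights - 1, len(x)):
              out[t] = w @ x[t - n_weights + 1:t + 1][::-1]
          return out

      logp = np.cumsum(np.random.default_rng(9).normal(size=500))   # toy log prices
      r_frac = frac_diff(logp, d=0.8)          # under-differenced relative to d = 1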
  15. By: Hall, Viv B; Thomson, Peter
    Abstract: Within a New Zealand business cycle context, we assess whether Hamilton’s (H84) OLS regression methodology produces stylised business cycle facts which are materially different from HP1600 measures, and whether using the H84 predictor and other forecast extensions improves the HP filter’s properties at the ends of series. In general, H84 produces exaggerated volatilities and less credible trend movements during key economic periods, so there is no material advantage in using H84 de-trending over HP1600. At the ends, the forecast-extended HP filter almost always performs better than the HP filter with no extension, which in turn performs slightly better than H84 forecast extension.
    Keywords: Hamilton regression filter, Stylised business cycle facts, New Zealand, End-point issues,
    Date: 2020
    URL: http://d.repec.org/n?u=RePEc:vuw:vuwecf:8956&r=all
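    For reference, Hamilton's regression filter for quarterly data amounts to regressing y_{t+8} on a constant and y_t, ..., y_{t-3} and taking the residual as the cycle (h = 8, p = 4); the sketch below uses simulated data for illustration.
      import numpy as np

      rng = np.random.default_rng(10)
      y = np.cumsum(rng.normal(size=240))      # toy quarterly log-GDP-like series
      h, p = 8, 4

      rows = range(p - 1, len(y) - h)          # usable time indices
      X = np.array([[1.0] + [y[t - j] for j in range(p)] for t in rows])
      target = np.array([y[t + h] for t in rows])

      beta, *_ = np.linalg.lstsq(X, target, rcond=None)
      cycle = target - X @ beta                # H84 cycle; the fitted value is the trend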
  16. By: Karanasos, Menelaos; Paraskevopoulos,Alexandros; Canepa, Alessandra (University of Turin)
    Abstract: For the large family of ARMA models with variable coefficients we obtain an explicit and computationally tractable solution that generates all their fundamental properties, including the Wold-Cramer decomposition and their covariance structure, thus unifying the invertibility conditions which guarantee both their asymptotic stability and main properties. The one-sided Green's function, associated with the homogeneous solution, is expressed as a banded Hessenbergian formulated exclusively in terms of the autoregressive parameters of the model. The proposed methodology allows for a unified treatment of these `time varying' systems. We also illustrate mathematically one of the focal points in Hallin's (1986) analysis, namely that in a time-varying setting the backward asymptotic efficiency is different from the forward one. Equally important, it is shown how the linear algebra techniques used to obtain the general solution are equivalent to a simple procedure for manipulating polynomials with variable coefficients. The practical significance of the suggested approach is illustrated with an application to U.S. inflation data. The main finding is that inflation persistence increased after 1976, whereas from 1986 onwards the persistence declines and stabilizes at even lower levels than in the pre-1976 period.
    Date: 2020–04
    URL: http://d.repec.org/n?u=RePEc:uto:dipeco:202008&r=all

This nep-ets issue is ©2020 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.