nep-ets New Economics Papers
on Econometric Time Series
Issue of 2019‒03‒25
eleven papers chosen by
Jaqueson K. Galimberti
KOF Swiss Economic Institute

  1. Streamlining Time-varying VAR with a Factor Structure in the Parameters By Simon Beyeler
  2. VAR-based Granger-causality test in the presence of instabilities By Yiru Wang; Barbara Rossi
  3. Bayesian MIDAS Penalized Regressions: Estimation, Selection, and Prediction By Matteo Mogliani
  4. Quarterly Forecasting Model for India’s Economic Growth: Bayesian Vector Autoregression Approach By Sen Gupta, Abhijit; Iyer, Tara
  5. The role of information in nonstationary regression By Patrick Marsh
  6. Properties of the power envelope for tests against both stationary and explosive alternatives: the effect of trends By Patrick Marsh
  7. Nonparametric conditional density specification testing and quantile estimation; with application to S&P500 returns By Patrick Marsh
  8. Dynamic discrete mixtures for high frequency prices By Leopoldo Catania; Roberto Di Mari; Paolo Santucci de Magistris
  9. Maximum Likelihood Estimation for the Fractional Vasicek Model By Tanaka, Katsuto; Xiao, Weilin; Yu, Jun
  10. Dynamic Hurst Exponent in Time Series By Carlos Arturo Soto Campos; Leopoldo Sánchez Cantú; Zeus Hernández Veleros
  11. How cluster-robust inference is changing applied econometrics By James G. MacKinnon

  1. By: Simon Beyeler (Swiss National Bank)
    Abstract: I introduce a factor structure on the parameters of a Bayesian TVP-VAR to reduce the dimension of the model's state space. To further limit the scope for over-fitting, the estimation of the factor loadings uses a new generation of shrinkage priors. A Monte Carlo study illustrates the ability of the proposed sampler to distinguish well between time-varying and constant parameters. In an application with Swiss data the model proves useful for capturing changes in the economy's dynamics due to the lower bound on nominal interest rates.
    Date: 2019–03
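As a rough sketch of the idea described in the abstract (the notation here is mine, not the paper's): the time-varying VAR coefficients are driven by a small number of latent factors, so the filtering problem lives in the low-dimensional factor space rather than the full parameter space.

```latex
y_t = X_t \beta_t + \varepsilon_t, \qquad
\beta_t = \bar{\beta} + \Lambda f_t, \qquad
f_t = f_{t-1} + \eta_t, \qquad
\dim(f_t) \ll \dim(\beta_t).
```

Shrinkage priors on the loadings \Lambda then decide which coefficients load on the factors (time-varying) and which stay constant.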
  2. By: Yiru Wang; Barbara Rossi
    Abstract: In this article, we review Granger-causality tests robust to the presence of instabilities in a Vector Autoregressive framework. We also introduce the gcrobustvar command, which illustrates the procedure in Stata. In the presence of instabilities, the Granger-causality robust test is more powerful than the traditional Granger-causality test.
    Keywords: gcrobustvar, Granger-causality, VAR, instability, structural breaks, local projections
    Date: 2019–01
  3. By: Matteo Mogliani
    Abstract: We propose a new approach to mixed-frequency regressions in a high-dimensional environment that resorts to Group Lasso penalization and Bayesian techniques for estimation and inference. To improve the sparse recovery ability of the model, we also consider a Group Lasso with a spike-and-slab prior. Penalty hyper-parameters governing the model shrinkage are automatically tuned via an adaptive MCMC algorithm. Simulations show that the proposed models have good selection and forecasting performance, even when the design matrix presents high cross-correlation. When applied to U.S. GDP data, the results suggest that financial variables may have some, although limited, short-term predictive content.
    Date: 2019–03
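A hedged sketch of the penalized objective the abstract describes, in frequentist form (my notation; the paper's Bayesian version replaces the penalty with a prior, e.g. spike-and-slab): a mixed-frequency regression in which each high-frequency predictor's lag-polynomial coefficients form one group of a Group Lasso penalty.

```latex
\hat{\theta} = \arg\min_{\theta}\; \sum_{t}\Big(y_t - \sum_{g=1}^{G} x_{t,g}'\,\theta_g\Big)^{2}
\;+\; \lambda \sum_{g=1}^{G} \lVert \theta_g \rVert_2 ,
```

The unsquared group \ell_2 norm sets entire groups \theta_g to zero at once, which is what delivers selection at the level of whole predictors rather than individual lags.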
  4. By: Sen Gupta, Abhijit (Asian Development Bank); Iyer, Tara (Asian Development Bank)
    Abstract: This study develops a framework to forecast India’s gross domestic product growth on a quarterly frequency from 2004 to 2018. The models, which are based on real and monetary sector descriptions of the Indian economy, are estimated using Bayesian vector autoregression (BVAR) techniques. The real sector groups of variables include domestic aggregate demand indicators and foreign variables, while the monetary sector groups specify the underlying inflationary process in terms of the consumer price index (CPI) versus the wholesale price index given India’s recent monetary policy regime switch to CPI inflation targeting. The predictive ability of over 3,000 BVAR models is assessed through a set of forecast evaluation statistics and compared with the forecasting accuracy of alternate econometric models including unrestricted and structural VARs. Key findings include that capital flows to India and CPI inflation have high informational content for India’s GDP growth. The results of this study provide suggestive evidence that quarterly BVAR models of Indian growth have high predictive ability.
    Keywords: Bayesian vector autoregressions; GDP growth; India; time series forecasting
    JEL: C11 C32 C53 F43
    Date: 2019–03–14
  5. By: Patrick Marsh
    Abstract: The role of standard likelihood-based measures of information and efficiency is unclear when regressions involve nonstationary data. Typically the standardized score is not asymptotically Gaussian and the standardized Hessian has a stochastic, rather than deterministic, limit. Here we consider a time series regression involving a deterministic covariate which can be evaporating, slowly evolving or nonstationary. It is shown that conditional information, or equivalently profile Kullback-Leibler and Fisher information, remains informative about both the accuracy, i.e. the asymptotic variance, of profile maximum likelihood estimators and the power of point optimal invariant tests for a unit root. Specifically, these information measures indicate that fractional, rather than linear, trends may minimize inferential accuracy. This is confirmed in numerical experiments.
  6. By: Patrick Marsh
    Abstract: This paper details a precise analytic effect that inclusion of a linear trend has on the power of Neyman-Pearson point optimal unit root tests and thence the power envelope. Both stationary and explosive alternatives are considered. The envelope can be characterized by probabilities for two, related, sums of chi-square random variables. A stochastic expansion, in powers of the local-to-unity parameter, of the difference between these loses its leading term when a linear trend is included. This implies that the power envelope converges to size at a faster rate, which can then be exploited to prove that the power envelope must necessarily be lower. This effect is shown to be, analytically, greater asymptotically than in small samples and numerically far greater for explosive than for stationary alternatives. Only a linear trend has a specific rate effect on the power envelope, however other deterministic variables will have some effect. The methods of the paper lead to a simple direct measure of this effect which is then informative about power, in practice.
  7. By: Patrick Marsh
    Abstract: This paper develops a two-stage procedure to test for correct dynamic conditional specification. It exploits the nonparametric likelihood for an exponential series density estimator applied to the in-sample probability integral transforms obtained from a fitted conditional model. The test is shown to be asymptotically pivotal, without modification. Numerical experiments illustrate both this and that it can have significantly more power than equivalent tests based on the empirical distribution function, when applied to a number of simple time series specifications. In the event of rejection, the second-stage nonparametric estimator can both consistently estimate quantiles of the data under empirically relevant conditions and correct the predictive log-scores of mis-specified models. Both test and estimator are applied to monthly S&P500 returns data. The estimator leads to narrower predictive confidence bands, which also enjoy better coverage, and contributes positively to the predictive log-score of Gaussian fitted models. Further applications involve risk evaluation, such as Value at Risk calculations or estimation of the probability of a negative return. The contribution of the nonparametric estimator is particularly clear during the financial crisis of 2007/8 and highlights the usefulness of a specification procedure which offers the possibility of partially correcting rejected specifications.
    Keywords: Conditional specification, series density estimator, nonparametric likelihood ratio, predictive quantiles for returns, log-score.
  8. By: Leopoldo Catania; Roberto Di Mari; Paolo Santucci de Magistris
    Abstract: The tick structure of the financial markets entails that price changes observed at very high frequency are discrete. Starting from this empirical evidence, we develop a new model to describe the dynamic properties of multivariate time series of high frequency price changes, including the high probability of observing no variation (price staleness). We assume the existence of two independent latent/hidden Markov processes determining the dynamic properties of the price changes and the excess probability of the occurrence of zeros. We study the probabilistic properties of the model, which generates a zero-inflated mixture of Skellam distributions, and we develop an EM estimation procedure with a closed-form M step. In the empirical application, we study the joint distribution of the price changes of four assets traded on the NYSE. Particular focus is dedicated to the precision of the univariate and multivariate density forecasts, to the quality of the predictions of quantities like the volatility and correlations across assets, and to the possibility of disentangling the different sources of zero price variation, as generated by the absence of news, microstructural frictions, or the offsetting positions taken by traders.
    Keywords: Dynamic Mixtures; Skellam Distribution; Zero-inflated series; EM Algorithm; High frequency prices; Volatility
  9. By: Tanaka, Katsuto (Gakushuin University); Xiao, Weilin (Zhejiang University); Yu, Jun (School of Economics and Lee Kong Chian School of Business, Singapore Management University)
    Abstract: This paper is concerned with the problem of estimating the drift parameters in the fractional Vasicek model from a continuous record of observations. Based on the Girsanov theorem for fractional Brownian motion, the maximum likelihood (ML) method is used. The asymptotic theory for the ML estimator (MLE) is established in the stationary case, the explosive case, and the null recurrent case for the entire range of the Hurst parameter, providing a complete treatment of the asymptotic analysis. It is shown that changing the sign of the persistence parameter changes the asymptotic theory for the MLE, including the rate of convergence and the limiting distribution. It is also found that the asymptotic theory depends on the value of the Hurst parameter.
    Keywords: Maximum likelihood estimate; Fractional Vasicek model; Asymptotic distribution; Stationary process; Explosive process; Null recurrent process
    JEL: C15 C22 C32
    Date: 2019–03–03
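For reference, the fractional Vasicek model in its standard parametrization (the symbols below are the conventional ones, not necessarily the paper's):

```latex
\mathrm{d}X_t = \kappa(\mu - X_t)\,\mathrm{d}t + \sigma\,\mathrm{d}B_t^{H},
```

where $B_t^{H}$ is a fractional Brownian motion with Hurst parameter $H \in (0,1)$. The sign of the persistence parameter $\kappa$ separates the three regimes discussed in the abstract: stationary ($\kappa > 0$), explosive ($\kappa < 0$), and null recurrent ($\kappa = 0$).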
  10. By: Carlos Arturo Soto Campos; Leopoldo Sánchez Cantú; Zeus Hernández Veleros
    Abstract: The market efficiency hypothesis has been proposed to explain the behavior of stock market time series. The Black-Scholes (B-S) model, for example, is based on the assumption that markets are efficient. As a consequence, it is impossible, at least in principle, to "predict" how a market behaves, whatever the circumstances. Recently we have found evidence showing that it is possible to find self-organized behavior in the prices of assets in financial markets during deep falls of those prices. Through a kurtosis analysis we have identified a critical point that separates stock market time series into two different regimes: a mesokurtic segment compatible with a random walk and a leptokurtic one that allegedly follows a power law. In this paper we provide evidence that the Hurst exponent is a good estimator of the regime in which the market is operating. Finally, we propose that the Hurst exponent can be considered a critical variable in just the same way as magnetization, for example, can be used to distinguish the phase of a magnetic system in physics.
    Date: 2019–03
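The Hurst exponent the abstract refers to can be estimated by the classical rescaled-range (R/S) method: compute the R/S statistic over windows of several sizes and regress log(R/S) on log(window size). This is a generic textbook estimator, not the authors' procedure; all names and the chosen window sizes are illustrative.

```python
import math
import random

def rescaled_range(x):
    # R/S statistic of one window: range of the mean-adjusted cumulative
    # sum, divided by the window's standard deviation.
    n = len(x)
    mean = sum(x) / n
    dev = [xi - mean for xi in x]
    cum, z = 0.0, []
    for d in dev:
        cum += d
        z.append(cum)
    s = (sum(d * d for d in dev) / n) ** 0.5
    return (max(z) - min(z)) / s if s > 0 else 0.0

def hurst_exponent(x, window_sizes=(8, 16, 32, 64, 128)):
    # Slope of log(mean R/S) against log(n): ~0.5 for a random walk's
    # increments, >0.5 for persistent (long-memory) series.
    pts = []
    for n in window_sizes:
        if n > len(x):
            continue
        rs = [rescaled_range(x[i:i + n])
              for i in range(0, len(x) - n + 1, n)]
        rs = [v for v in rs if v > 0]
        if rs:
            pts.append((math.log(n), math.log(sum(rs) / len(rs))))
    mx = sum(p for p, _ in pts) / len(pts)
    my = sum(q for _, q in pts) / len(pts)
    num = sum((p - mx) * (q - my) for p, q in pts)
    den = sum((p - mx) ** 2 for p, _ in pts)
    return num / den

# Demo: i.i.d. Gaussian "returns" should give an exponent near 0.5
# (the small-sample R/S estimator is known to be biased slightly upward).
random.seed(1)
h_iid = hurst_exponent([random.gauss(0.0, 1.0) for _ in range(1024)])
```

A "dynamic" version in the spirit of the title would simply re-run `hurst_exponent` on a rolling window of the series.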
  11. By: James G. MacKinnon (Queen's University)
    Abstract: In many fields of economics, and also in other disciplines, it is hard to justify the assumption that the random error terms in regression models are uncorrelated. It seems more plausible to assume that they are correlated within clusters, such as geographical areas or time periods, but uncorrelated across clusters. It has therefore become very popular to use "clustered" standard errors, which are robust against arbitrary patterns of within-cluster variation and covariation. Conventional methods for inference using clustered standard errors work very well when the model is correct and the data satisfy certain conditions, but they can produce very misleading results in other cases. This paper discusses some of the issues that users of these methods need to be aware of.
    Keywords: clustered data, cluster-robust variance estimator, CRVE, wild cluster bootstrap, robust inference
    JEL: C15 C21 C23
    Date: 2019–03
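The cluster-robust ("sandwich") variance estimator the survey discusses can be sketched in a few lines for the simplest case, a one-regressor model without intercept: scores are summed within each cluster before squaring, so arbitrary within-cluster error correlation is allowed. This is a minimal illustration under my own simulated design, not MacKinnon's code, and it omits the small-sample corrections and bootstrap refinements the paper discusses.

```python
import random

def ols_slope_cluster_se(y, x, cluster):
    # OLS slope in y_i = beta * x_i + u_i, with the cluster-robust variance:
    # errors may be arbitrarily correlated within a cluster, but are
    # assumed independent across clusters.
    sxx = sum(xi * xi for xi in x)
    beta = sum(xi * yi for xi, yi in zip(x, y)) / sxx
    # "Meat": square the within-cluster sums of scores x_i * u_i.
    score = {}
    for xi, yi, g in zip(x, y, cluster):
        score[g] = score.get(g, 0.0) + xi * (yi - beta * xi)
    meat = sum(s * s for s in score.values())
    return beta, (meat / (sxx * sxx)) ** 0.5

# Demo: 50 clusters of 10 observations sharing a common error shock.
random.seed(7)
x, y, g = [], [], []
for c in range(50):
    shock = random.gauss(0.0, 1.0)
    for _ in range(10):
        xi = random.gauss(0.0, 1.0)
        x.append(xi)
        y.append(2.0 * xi + shock + random.gauss(0.0, 1.0))
        g.append(c)
beta, se = ols_slope_cluster_se(y, x, g)
```

With few clusters, or clusters of very unequal size, this plug-in standard error can be badly behaved, which is exactly the kind of pitfall the paper surveys (and why it recommends tools like the wild cluster bootstrap).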

This nep-ets issue is ©2019 by Jaqueson K. Galimberti. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.