
on Econometric Time Series 
By:  Simon Beyeler (Swiss National Bank) 
Abstract:  I introduce a factor structure on the parameters of a Bayesian TVP-VAR to reduce the dimension of the model's state space. To further limit the scope for overfitting, the estimation of the factor loadings uses a new generation of shrinkage priors. A Monte Carlo study illustrates the ability of the proposed sampler to distinguish between time-varying and constant parameters. In an application with Swiss data, the model proves useful for capturing changes in the economy's dynamics due to the lower bound on nominal interest rates. 
Date:  2019–03 
URL:  http://d.repec.org/n?u=RePEc:szg:worpap:1903&r=all 
By:  Yiru Wang; Barbara Rossi 
Abstract:  In this article, we review Granger-causality tests robust to the presence of instabilities in a Vector Autoregressive framework. We also introduce the gcrobustvar command, which implements the procedure in Stata. In the presence of instabilities, the robust Granger-causality test is more powerful than the traditional Granger-causality test. 
Keywords:  gcrobustvar, Granger-causality, VAR, instability, structural breaks, local projections 
Date:  2019–01 
URL:  http://d.repec.org/n?u=RePEc:upf:upfgen:1642&r=all 
By:  Matteo Mogliani 
Abstract:  We propose a new approach to mixed-frequency regressions in a high-dimensional environment that resorts to Group Lasso penalization and Bayesian techniques for estimation and inference. To improve the sparse recovery ability of the model, we also consider a Group Lasso with a spike-and-slab prior. Penalty hyperparameters governing the model shrinkage are automatically tuned via an adaptive MCMC algorithm. Simulations show that the proposed models have good selection and forecasting performance, even when the design matrix presents high cross-correlation. When applied to U.S. GDP data, the results suggest that financial variables may have some, although limited, short-term predictive content. 
Date:  2019–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1903.08025&r=all 
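The Group Lasso penalty at the core of the approach shrinks entire blocks of coefficients (for instance, all lags of one high-frequency regressor) to zero at once. A minimal sketch of its proximal operator, block soft-thresholding, with an assumed group vector and penalty level:

```python
import numpy as np

def group_soft_threshold(v, lam):
    """Proximal operator of the group-lasso penalty lam * ||v||_2:
    shrinks the whole group toward zero, and zeroes it exactly
    when its Euclidean norm falls below the threshold lam."""
    norm = np.linalg.norm(v)
    if norm <= lam:
        return np.zeros_like(v)
    return (1.0 - lam / norm) * v

# A group with small norm is zeroed out entirely...
weak = group_soft_threshold(np.array([0.1, -0.05]), lam=0.5)
# ...while a strong group is only shrunk toward zero.
strong = group_soft_threshold(np.array([3.0, 4.0]), lam=0.5)
```

This all-or-nothing behavior at the group level is what gives the estimator its variable-selection character; the paper's Bayesian spike-and-slab variant achieves a similar effect through the prior.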
By:  Sen Gupta, Abhijit (Asian Development Bank); Iyer, Tara (Asian Development Bank) 
Abstract:  This study develops a framework to forecast India’s gross domestic product growth on a quarterly frequency from 2004 to 2018. The models, which are based on real and monetary sector descriptions of the Indian economy, are estimated using Bayesian vector autoregression (BVAR) techniques. The real sector groups of variables include domestic aggregate demand indicators and foreign variables, while the monetary sector groups specify the underlying inflationary process in terms of the consumer price index (CPI) versus the wholesale price index, given India’s recent monetary policy regime switch to CPI inflation targeting. The predictive ability of over 3,000 BVAR models is assessed through a set of forecast evaluation statistics and compared with the forecasting accuracy of alternative econometric models, including unrestricted and structural VARs. Among the key findings, capital flows to India and CPI inflation have high informational content for India’s GDP growth. The results of this study provide suggestive evidence that quarterly BVAR models of Indian growth have high predictive ability. 
Keywords:  Bayesian vector autoregressions; GDP growth; India; time series forecasting 
JEL:  C11 C32 C53 F43 
Date:  2019–03–14 
URL:  http://d.repec.org/n?u=RePEc:ris:adbewp:0573&r=all 
By:  Patrick Marsh 
Abstract:  The role of standard likelihood-based measures of information and efficiency is unclear when regressions involve nonstationary data. Typically, the standardized score is not asymptotically Gaussian and the standardized Hessian has a stochastic, rather than deterministic, limit. Here we consider a time series regression involving a deterministic covariate which can be evaporating, slowly evolving, or nonstationary. It is shown that conditional information, or equivalently profile Kullback-Leibler and Fisher information, remains informative about both the accuracy, i.e. the asymptotic variance, of profile maximum likelihood estimators and the power of point-optimal invariant tests for a unit root. Specifically, these information measures indicate that fractional, rather than linear, trends may minimize inferential accuracy. This is confirmed in numerical experiments. 
URL:  http://d.repec.org/n?u=RePEc:not:notgts:19/04&r=all 
By:  Patrick Marsh 
Abstract:  This paper details a precise analytic effect that the inclusion of a linear trend has on the power of Neyman-Pearson point-optimal unit root tests and thence on the power envelope. Both stationary and explosive alternatives are considered. The envelope can be characterized by probabilities for two related sums of chi-square random variables. A stochastic expansion, in powers of the local-to-unity parameter, of the difference between these loses its leading term when a linear trend is included. This implies that the power envelope converges to size at a faster rate, which can then be exploited to prove that the power envelope must necessarily be lower. This effect is shown to be analytically greater asymptotically than in small samples, and numerically far greater for explosive than for stationary alternatives. Only a linear trend has a specific rate effect on the power envelope; however, other deterministic variables will have some effect. The methods of the paper lead to a simple direct measure of this effect, which is then informative about power in practice. 
URL:  http://d.repec.org/n?u=RePEc:not:notgts:19/03&r=all 
By:  Patrick Marsh 
Abstract:  This paper develops a two-stage procedure to test for correct dynamic conditional specification. It exploits a nonparametric likelihood for an exponential series density estimator applied to the in-sample Probability Integral Transforms obtained from a fitted conditional model. The test is shown to be asymptotically pivotal, without modification. Numerical experiments illustrate both this and that it can have significantly more power than equivalent tests based on the empirical distribution function, when applied to a number of simple time series specifications. In the event of rejection, the second-stage nonparametric estimator can consistently estimate quantiles of the data, under empirically relevant conditions, as well as correct the predictive log-scores of misspecified models. Both the test and the estimator are applied to monthly S&P 500 returns data. The estimator leads to narrower predictive confidence bands which also enjoy better coverage, and contributes positively to the predictive log-score of fitted Gaussian models. Additional applications involve risk evaluation, such as Value-at-Risk calculations or estimation of the probability of a negative return. The contribution of the nonparametric estimator is particularly clear during the financial crisis of 2007–08 and highlights the usefulness of a specification procedure which offers the possibility of partially correcting rejected specifications. 
Keywords:  Conditional specification, series density estimator, nonparametric likelihood ratio, predictive quantiles for returns, log-score. 
URL:  http://d.repec.org/n?u=RePEc:not:notgts:19/02&r=all 
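The first stage rests on Probability Integral Transforms: under correct specification, the in-sample PITs are i.i.d. uniform. A minimal sketch of the EDF-based benchmark the paper improves upon, a Kolmogorov-Smirnov test of PIT uniformity, using simulated fat-tailed "returns" and a deliberately misspecified Gaussian fit (both assumptions for illustration):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
data = rng.standard_t(df=3, size=2000)  # fat-tailed "returns"

# PITs under a fitted Gaussian model (misspecified for t(3) data)
mu, sigma = data.mean(), data.std()
pit = stats.norm.cdf(data, loc=mu, scale=sigma)

# EDF-based uniformity test: rejection flags misspecification
ks_stat, ks_p = stats.kstest(pit, "uniform")
```

The paper's test replaces this EDF statistic with a nonparametric likelihood ratio built from a series density estimator of the PITs, which is where the power gains and the second-stage correction come from.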
By:  Leopoldo Catania; Roberto Di Mari; Paolo Santucci de Magistris 
Abstract:  The tick structure of the financial markets entails that price changes observed at very high frequency are discrete. Departing from this empirical evidence, we develop a new model to describe the dynamic properties of multivariate time series of high-frequency price changes, including the high probability of observing no variation (price staleness). We assume the existence of two independent latent/hidden Markov processes determining the dynamic properties of the price changes and the excess probability of the occurrence of zeros. We study the probabilistic properties of the model, which generates a zero-inflated mixture of Skellam distributions, and we develop an EM estimation procedure with a closed-form M-step. In the empirical application, we study the joint distribution of the price changes of four assets traded on the NYSE. Particular focus is dedicated to the precision of the univariate and multivariate density forecasts, to the quality of the predictions of quantities like the volatility and correlations across assets, and to the possibility of disentangling the different sources of zero price variation as generated by absence of news, microstructural frictions, or offsetting positions taken by the traders. 
Keywords:  Dynamic Mixtures; Skellam Distribution; Zero-inflated series; EM Algorithm; High frequency prices; Volatility 
URL:  http://d.repec.org/n?u=RePEc:not:notgts:19/05&r=all 
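The zero-inflated Skellam distribution at the heart of the model mixes a point mass at zero (price staleness) with an integer-valued Skellam distribution of tick-level price changes. A hedged sketch of its PMF, with illustrative intensity parameters:

```python
import numpy as np
from scipy.stats import skellam

def zi_skellam_pmf(k, mu1, mu2, pi0):
    """PMF of a zero-inflated Skellam: with probability pi0 the price
    change is a structural zero (staleness); otherwise it follows a
    Skellam(mu1, mu2) distribution, the difference of two Poissons."""
    base = skellam.pmf(k, mu1, mu2)
    return pi0 * (np.asarray(k) == 0) + (1.0 - pi0) * base

# Excess mass at zero relative to the plain Skellam distribution
p_zero = zi_skellam_pmf(0, mu1=1.0, mu2=1.0, pi0=0.3)
```

The model in the paper lets both the Skellam intensities and the zero-inflation probability evolve with latent Markov chains; the static PMF above is only the building block the EM algorithm operates on.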
By:  Tanaka, Katsuto (Gakushuin University); Xiao, Weilin (Zhejiang University); Yu, Jun (School of Economics and Lee Kong Chian School of Business, Singapore Management University) 
Abstract:  This paper is concerned with the problem of estimating the drift parameters in the fractional Vasicek model from a continuous record of observations. Based on the Girsanov theorem for the fractional Brownian motion, the maximum likelihood (ML) method is used. The asymptotic theory for the ML estimates (MLE) is established in the stationary case, the explosive case, and the null recurrent case for the entire range of the Hurst parameter, providing a complete treatment of asymptotic analysis. It is shown that changing the sign of the persistence parameter changes the asymptotic theory for the MLE, including the rate of convergence and the limiting distribution. It is also found that the asymptotic theory depends on the value of the Hurst parameter. 
Keywords:  Maximum likelihood estimate; Fractional Vasicek model; Asymptotic distribution; Stationary process; Explosive process; Null recurrent process 
JEL:  C15 C22 C32 
Date:  2019–03–03 
URL:  http://d.repec.org/n?u=RePEc:ris:smuesw:2019_008&r=all 
By:  Carlos Arturo Soto Campos; Leopoldo Sánchez Cantú; Zeus Hernández Veleros 
Abstract:  The market efficiency hypothesis has been proposed to explain the behavior of time series of stock markets. The Black-Scholes model (BS), for example, is based on the assumption that markets are efficient. As a consequence, it is impossible, at least in principle, to "predict" how a market behaves, whatever the circumstances. Recently we have found evidence showing that it is possible to find self-organized behavior in the prices of assets in financial markets during deep falls of those prices. Through a kurtosis analysis, we have identified a critical point that separates time series from stock markets into two different regimes: a mesokurtic segment compatible with a random-walk regime and a leptokurtic one that allegedly follows power-law behavior. In this paper we provide evidence showing that the Hurst exponent is a good estimator of the regime in which the market is operating. Finally, we propose that the Hurst exponent can be considered a critical variable in just the same way as magnetization, for example, can be used to distinguish the phase of a magnetic system in physics. 
Date:  2019–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1903.07809&r=all 
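A common way to estimate the Hurst exponent, via the scaling of the standard deviation of lagged differences, can be sketched as follows (this is one of several estimators and not necessarily the one the authors use; the random-walk series is an illustrative assumption):

```python
import numpy as np

def hurst_exponent(series, max_lag=20):
    """Estimate the Hurst exponent H from the scaling law
    sd(series[t+lag] - series[t]) ~ lag**H, using the slope
    of a log-log regression across lags."""
    lags = np.arange(2, max_lag)
    sds = [np.std(series[lag:] - series[:-lag]) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(sds), 1)
    return slope

rng = np.random.default_rng(3)
walk = rng.standard_normal(10_000).cumsum()  # random-walk regime
h = hurst_exponent(walk)  # expect a value near 0.5
```

Values near 0.5 indicate the random-walk (mesokurtic) regime the abstract describes, while persistent departures from 0.5 signal the self-organized, leptokurtic regime.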
By:  James G. MacKinnon (Queen's University) 
Abstract:  In many fields of economics, and also in other disciplines, it is hard to justify the assumption that the random error terms in regression models are uncorrelated. It seems more plausible to assume that they are correlated within clusters, such as geographical areas or time periods, but uncorrelated across clusters. It has therefore become very popular to use "clustered" standard errors, which are robust against arbitrary patterns of withincluster variation and covariation. Conventional methods for inference using clustered standard errors work very well when the model is correct and the data satisfy certain conditions, but they can produce very misleading results in other cases. This paper discusses some of the issues that users of these methods need to be aware of. 
Keywords:  clustered data, cluster-robust variance estimator, CRVE, wild cluster bootstrap, robust inference 
JEL:  C15 C21 C23 
Date:  2019–03 
URL:  http://d.repec.org/n?u=RePEc:qed:wpaper:1413&r=all 
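The one-way cluster-robust variance estimator (CRVE) the paper discusses takes the sandwich form (X'X)^{-1} [Σ_g X_g' u_g u_g' X_g] (X'X)^{-1}, allowing arbitrary correlation within each cluster g. A minimal numpy sketch on simulated clustered data (the data-generating process is an illustrative assumption):

```python
import numpy as np

def cluster_robust_vcov(X, resid, clusters):
    """Cluster-robust (CRVE) covariance of OLS coefficients:
    (X'X)^{-1} [sum_g X_g' u_g u_g' X_g] (X'X)^{-1}."""
    XtX_inv = np.linalg.inv(X.T @ X)
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(clusters):
        Xg = X[clusters == g]
        ug = resid[clusters == g]
        score = Xg.T @ ug        # cluster-level score contribution
        meat += np.outer(score, score)
    return XtX_inv @ meat @ XtX_inv

rng = np.random.default_rng(4)
n, G = 400, 20
clusters = np.repeat(np.arange(G), n // G)
# Regressor and errors both contain a common within-cluster component
x = rng.standard_normal(n) + rng.standard_normal(G)[clusters]
u = rng.standard_normal(n) + 2.0 * rng.standard_normal(G)[clusters]
X = np.column_stack([np.ones(n), x])
y = 1.0 + 0.5 * x + u
beta = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta
se_cluster = np.sqrt(np.diag(cluster_robust_vcov(X, resid, clusters)))
```

With within-cluster correlation in both the regressor and the errors, these standard errors come out well above their classical counterparts, which is exactly the situation the paper's caveats (few clusters, unbalanced cluster sizes, wild cluster bootstrap refinements) are about. Production code would also apply the usual small-sample correction factor, omitted here for brevity.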