
NEP-ETS: New Economics Papers on Econometric Time Series
By:  Tanaka, Katsuto (Gakushuin University); Xiao, Weilin (Zhejiang University); Yu, Jun (School of Economics, Singapore Management University) 
Abstract:  Based on the least squares estimator, this paper proposes a novel method to test the sign of the persistence parameter in a panel fractional Ornstein-Uhlenbeck process with a known Hurst parameter H. Depending on whether H ∈ (1/2, 1), H = 1/2, or H ∈ (0, 1/2), three test statistics are considered. Under the null hypothesis, the persistence parameter is zero. Based on a panel of continuous records of observations, the null asymptotic distributions are obtained when T is fixed and N is assumed to go to infinity, where T is the time span of the sample and N is the number of cross sections. The power function of the tests is obtained under the local alternative where the persistence parameter is close to zero in the order of 1/(T√N). The local power of the proposed test statistics is computed and compared with that of the maximum-likelihood-based test. The hypothesis testing problem and the local power function are also considered when a panel of discrete-sampled observations is available under a sequential limit. 
Keywords:  Panel fractional Ornstein-Uhlenbeck process; Least squares; Asymptotic distribution; Local alternative; Local power 
JEL:  C22 C23 
Date:  2020–02–25 
URL:  http://d.repec.org/n?u=RePEc:ris:smuesw:2020_006&r=all 
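In the H = 1/2 case the model above reduces to an ordinary panel Ornstein-Uhlenbeck process driven by Brownian motion. A minimal simulation sketch of the pooled least-squares statistic under the null (the discretization, sample sizes, and scaling are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, n = 500, 1.0, 200          # cross-sections, time span, grid points per unit
dt = T / n

# Under the null (zero persistence, H = 1/2) each unit is a Brownian motion.
X = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(N, n)), axis=1)
X = np.hstack([np.zeros((N, 1)), X])          # X_0 = 0

# Discretized pooled least-squares estimator of the persistence parameter:
# kappa_hat = sum_i int X dX / sum_i int X^2 dt  (Ito-type sums)
num = np.sum(X[:, :-1] * np.diff(X, axis=1))
den = np.sum(X[:, :-1] ** 2) * dt
kappa_hat = num / den

# The abstract's local-alternative rate 1/(T*sqrt(N)) suggests this scaling,
# under which the statistic is O(1) under the null.
stat = T * np.sqrt(N) * kappa_hat
```

With N large, `kappa_hat` concentrates near zero at the 1/(T√N) rate, which is why the local power analysis in the abstract is carried out in that neighborhood.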
By:  Sven Otto 
Abstract:  A unit root test is proposed for time series with a general nonlinear deterministic trend component. It is shown that asymptotically the pooled OLS estimator of overlapping blocks filters out any trend component that satisfies some Lipschitz condition. Under both fixed-$b$ and small-$b$ block asymptotics, the limiting distribution of the $t$-statistic for the unit root hypothesis is derived. Nuisance parameter corrections provide heteroskedasticity-robust tests, and serial correlation is accounted for by prewhitening. A Monte Carlo study that considers slowly varying trends yields both good size and improved power results for the proposed tests when compared to conventional unit root tests. 
Date:  2020–03 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2003.04066&r=all 
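A toy illustration of the pooling idea (not the paper's exact statistic; block length, trend, and demeaning scheme are assumptions): within each overlapping block a smooth Lipschitz trend is nearly constant, so block-wise demeaning removes it and the pooled autoregressive estimate still detects the unit root.

```python
import numpy as np

rng = np.random.default_rng(1)
n, b = 1000, 100                           # sample size, block length

t = np.arange(n) / n
trend = 5.0 * np.sin(2 * np.pi * t)        # smooth (Lipschitz) trend
y = trend + np.cumsum(rng.normal(size=n))  # unit-root component plus trend

# Pooled OLS over overlapping blocks with block-specific intercepts:
# demeaning within each short block (approximately) filters the trend out.
num = den = 0.0
for s in range(n - b):
    yb  = y[s + 1 : s + b + 1]             # block of y_t
    ylb = y[s : s + b]                     # block of y_{t-1}
    ylb_d = ylb - ylb.mean()
    num += np.sum(ylb_d * yb)
    den += np.sum(ylb_d * ylb)
rho_hat = num / den                        # close to 1 despite the trend
```

Despite the sizable deterministic trend, the pooled estimate stays near unity (up to the usual small downward demeaning bias of order 1/b), which is the filtering property the abstract describes.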
By:  Doko Tchatoka, Firmin; Wang, Wenjie 
Abstract:  Pretesting for exogeneity has become routine in many empirical applications involving instrumental variables (IVs) to decide whether the ordinary least squares (OLS) or the two-stage least squares (2SLS) method is appropriate. Guggenberger (2010) shows that the second-stage t-test (based on the outcome of a Durbin-Wu-Hausman-type pretest for exogeneity in the first stage) has extreme size distortion, with asymptotic size equal to 1 when the standard asymptotic critical values are used. In this paper, we first show that the standard residual bootstrap procedures (with either independent or dependent draws of disturbances) are not viable solutions to this extreme size-distortion problem. We then propose a novel hybrid bootstrap approach, which combines the residual-based bootstrap with an adjusted Bonferroni size-correction method. We establish uniform validity of this hybrid bootstrap in the sense that it yields a two-stage test with correct asymptotic size. Monte Carlo simulations confirm our theoretical findings. In particular, our proposed hybrid method achieves remarkable power gains over the 2SLS-based t-test, especially when the IVs are not very strong. 
Keywords:  DWH Pretest; Instrumental Variable; Asymptotic Size; Bootstrap; Bonferroni-based Size-correction; Uniform Inference 
JEL:  C12 C13 C26 
Date:  2020–03–24 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:99243&r=all 
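The two-stage procedure being size-corrected can be sketched as follows (a stylized one-regressor version with an assumed DGP; the contrast-variance formula is the textbook Hausman form, not the paper's bootstrap):

```python
import numpy as np

rng = np.random.default_rng(2)
n, beta = 2000, 1.0

z = rng.normal(size=n)                                   # instrument
u, v = rng.multivariate_normal([0, 0], [[1, 0.5],
                                        [0.5, 1]], size=n).T
x = 0.8 * z + v                                          # endogenous regressor
y = beta * x + u

b_ols  = (x @ y) / (x @ x)
xhat   = z * ((z @ x) / (z @ z))                         # first-stage fitted values
b_2sls = (xhat @ y) / (xhat @ x)

# Durbin-Wu-Hausman-type contrast: under exogeneity OLS and 2SLS agree,
# so a large standardized contrast routes the second stage to 2SLS.
res2  = y - b_2sls * x
s2    = res2 @ res2 / n
var_d = s2 * (1.0 / (xhat @ xhat) - 1.0 / (x @ x))
dwh   = (b_2sls - b_ols) ** 2 / var_d                    # ~ chi2(1) under H0

b_second_stage = b_2sls if dwh > 3.84 else b_ols
```

Here endogeneity is strong, so the pretest rejects and the consistent 2SLS estimate is used; the size problems the paper addresses arise in the borderline region where the pretest's decision is random.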
By:  Adrian Pagan; Tim Robinson 
Abstract:  We show that when a model has more shocks than observed variables, the estimated filtered and smoothed shocks will be correlated, even though no correlation is present in the data generating process. Additionally, the estimated shock innovations may be autocorrelated. These correlations limit the relevance of impulse responses, which assume uncorrelated shocks, for interpreting the data. Excess shocks occur frequently, e.g. in Unobserved-Component (UC) models, in filters such as the Hodrick-Prescott (1997) filter, and in some Dynamic Stochastic General Equilibrium (DSGE) models. Using several UC models and an estimated DSGE model (Ireland, 2011), we demonstrate that sizable correlations among the estimated shocks can result. 
Keywords:  Partial Information, Structural Shocks, Kalman Filter, Measurement Error, DSGE 
JEL:  E37 C51 C52 
Date:  2020–03 
URL:  http://d.repec.org/n?u=RePEc:een:camaaa:202028&r=all 
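The point is easy to reproduce in the simplest excess-shock model, the local level UC model (one observable, two shocks). The sketch below uses standard Kalman filter and disturbance-smoother recursions; the parameter values and the diffuse-style initialization are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
sig_eps, sig_eta = 1.0, 0.5                 # independent shocks in the DGP

eta = rng.normal(0, sig_eta, n)
eps = rng.normal(0, sig_eps, n)
mu = np.cumsum(eta)                         # local level: mu_t = mu_{t-1} + eta_t
y = mu + eps                                # one observable, two shocks

# Kalman filter for the local-level model
a, P = 0.0, 1e6                             # diffuse-style initialization
v = np.empty(n); F = np.empty(n); K = np.empty(n)
for t in range(n):
    v[t] = y[t] - a                         # innovation
    F[t] = P + sig_eps**2
    K[t] = P / F[t]
    a = a + K[t] * v[t]
    P = P * (1 - K[t]) + sig_eta**2

# Disturbance smoother: backward recursion in r_t
r = 0.0
eps_hat = np.empty(n); eta_hat = np.empty(n)
for t in range(n - 1, -1, -1):
    eps_hat[t] = sig_eps**2 * (v[t] / F[t] - K[t] * r)
    r = v[t] / F[t] + (1 - K[t]) * r
    eta_hat[t] = sig_eta**2 * r

corr = np.corrcoef(eps_hat, eta_hat)[0, 1]  # sizably nonzero
```

The DGP shocks are independent, yet the smoothed estimates come out clearly correlated (both are driven by the same innovations v_t), which is exactly the mechanism the abstract warns about.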
By:  Dimitris Korobilis (Department of Economics, University of Glasgow, UK; Rimini Centre for Economic Analysis) 
Abstract:  This paper proposes a new Bayesian sampling scheme for VAR inference using sign restrictions. We build on a factor model decomposition of the reduced-form VAR disturbances, which are assumed to be driven by a few fundamental factors/shocks. The outcome is a computationally efficient algorithm that makes it possible to jointly sample the VAR parameters as well as decompositions of the covariance matrix satisfying the desired sign restrictions. Using artificial and real data, we show that the new algorithm works well and is multiple times more efficient than existing accept/reject algorithms for sign restrictions. 
Keywords:  high-dimensional inference, Structural VAR, Markov chain Monte Carlo, set identification 
JEL:  C11 C13 C15 C22 C52 C53 C61 
Date:  2020–03 
URL:  http://d.repec.org/n?u=RePEc:rim:rimwps:2009&r=all 
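For context, the accept/reject benchmark the abstract compares against looks roughly like this: draw a uniformly distributed orthogonal rotation of the Cholesky factor and keep it only if the implied impact responses have the desired signs (a two-variable sketch with an assumed, known reduced-form covariance; a real sampler would redo this at every posterior draw):

```python
import numpy as np

rng = np.random.default_rng(4)

# Reduced-form disturbance covariance of a 2-variable VAR (assumed known here)
Sigma = np.array([[1.0, 0.6],
                  [0.6, 1.5]])
C = np.linalg.cholesky(Sigma)

# Restriction: shock 1 moves both variables up on impact
def satisfies(A):
    return bool(np.all(A[:, 0] > 0))

accepted = []
for _ in range(2000):
    Q, R = np.linalg.qr(rng.normal(size=(2, 2)))
    Q = Q @ np.diag(np.sign(np.diag(R)))   # normalization -> uniform (Haar) draw
    A = C @ Q                              # candidate impact matrix, A @ A.T = Sigma
    if np.all(A[:, 0] < 0):
        A[:, 0] = -A[:, 0]                 # column signs are not identified
    if satisfies(A):
        accepted.append(A)

share = len(accepted) / 2000               # acceptance rate of the benchmark
```

The inefficiency of this scheme, i.e. rejected draws and repeated rotations, is what the paper's factor-based joint sampler is designed to avoid.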
By:  Lutz Kilian; Xiaoqing Zhou 
Abstract:  Oil market VAR models have become the standard tool for understanding the evolution of the real price of oil and its impact on the macroeconomy. As this literature has expanded at a rapid pace, it has become increasingly difficult for mainstream economists to understand the differences between alternative oil market models, let alone the basis for the sometimes divergent conclusions reached in the literature. The purpose of this survey is to provide a guide to this literature. Our focus is on the econometric foundations of the analysis of oil market models, with special attention to the identifying assumptions and methods of inference. We not only explain how the workhorse models in this literature have evolved, but also examine alternative oil market VAR models. We help the reader understand why the latter models have sometimes generated unconventional, puzzling, or erroneous conclusions. Finally, we discuss the construction of extraneous measures of oil demand and oil supply shocks that have been used as external or internal instruments for VAR models. 
Keywords:  Oil supply elasticity; oil demand elasticity; IV estimation; structural VAR 
JEL:  Q43 Q41 C36 C52 
Date:  2020–03–06 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:87676&r=all 
By:  Kiss, Tamás (Örebro University School of Business); Österholm, Pär (Örebro University School of Business) 
Abstract:  In this paper, we illustrate the macroeconomic risk associated with the early stage of the coronavirus outbreak. Using monthly data ranging from July 1991 to March 2020 on a recently developed coincident indicator of global output growth, we estimate an autoregressive model with GARCH effects and non-Gaussian disturbances. Our results indicate that (i) accounting for conditional heteroscedasticity is important and (ii) risk, measured as the volatility of the shocks to the process, is at a very high level, largely on par with that experienced around the financial crisis of 2008–2009. 
Keywords:  GARCH; Non-Gaussianity 
JEL:  C22 E32 E37 
Date:  2020–03–23 
URL:  http://d.repec.org/n?u=RePEc:hhs:oruesi:2020_002&r=all 
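The model class in the abstract — an AR(1) mean with GARCH(1,1) conditional variance and heavy-tailed (here Student-t) shocks — can be sketched as below. Parameter values are illustrative, and the second loop shows how, given AR residuals, the same variance recursion recovers the "risk" (conditional volatility) path:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500
phi, omega, alpha, beta = 0.3, 0.05, 0.10, 0.85   # AR(1)-GARCH(1,1) parameters
df = 7                                            # heavy-tailed t innovations

# Simulate: y_t = phi*y_{t-1} + e_t, e_t = sqrt(h_t)*z_t,
#           h_t = omega + alpha*e_{t-1}^2 + beta*h_{t-1}
z = rng.standard_t(df, size=n) * np.sqrt((df - 2) / df)   # unit-variance t
y = np.zeros(n); e = np.zeros(n)
h = np.full(n, omega / (1 - alpha - beta))        # start at unconditional variance
for t in range(1, n):
    h[t] = omega + alpha * e[t - 1] ** 2 + beta * h[t - 1]
    e[t] = np.sqrt(h[t]) * z[t]
    y[t] = phi * y[t - 1] + e[t]

# Risk measurement: filter the conditional variance from AR residuals.
ehat = y[1:] - phi * y[:-1]                       # residuals (phi treated as known)
h_f = np.full(n - 1, omega / (1 - alpha - beta))
for t in range(1, n - 1):
    h_f[t] = omega + alpha * ehat[t - 1] ** 2 + beta * h_f[t - 1]
vol = np.sqrt(h_f)                                # conditional volatility path
```

In an application like the abstract's, the parameters would of course be estimated (e.g. by maximum likelihood under the t density) rather than assumed, and `vol` at the sample end is the risk measure being compared to the 2008–2009 episode.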
By:  Lucchetti, Riccardo; Venetis, Ioannis A. 
Abstract:  The authors replicate and extend the Monte Carlo experiment presented in Doz et al. (2012) on alternative (time-domain-based) methods for extracting dynamic factors from large datasets; they employ open-source software and consider a larger number of replications and a wider set of scenarios. Their narrow-sense replication exercise fully confirms the results in the original article. As for their extended replication experiment, the authors examine the relative performance of competing estimators under a wider array of cases, including richer dynamics, and find that maximum likelihood (ML) is often the dominant method; moreover, the persistence characteristics of the observable series play a crucial role, and correct specification of the underlying dynamics is of paramount importance. 
Keywords:  dynamic factor models, EM algorithm, Kalman filter, principal components 
JEL:  C15 C32 C55 C87 
Date:  2020 
URL:  http://d.repec.org/n?u=RePEc:zbw:ifwedp:20205&r=all 
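The principal-components benchmark in such horse races is compact enough to sketch directly (a static-factor toy DGP with assumed dimensions; the ML/EM competitor adds a Kalman-filter E-step on top of this):

```python
import numpy as np

rng = np.random.default_rng(6)
T, N, r = 200, 50, 2                       # periods, series, number of factors

F = rng.normal(size=(T, r))                # latent factors
L = rng.normal(size=(N, r))                # loadings
X = F @ L.T + rng.normal(scale=0.5, size=(T, N))

# Principal-components estimator: factors proportional to the top-r left
# singular vectors of the (centered) data matrix.
X_c = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(X_c, full_matrices=False)
F_hat = np.sqrt(T) * U[:, :r]
L_hat = X_c.T @ F_hat / T                  # implied loadings

# Factors are identified only up to rotation, so judge accuracy by how much
# of the true factor space F_hat spans (a trace R^2).
B = np.linalg.lstsq(F_hat, F, rcond=None)[0]
resid = F - F_hat @ B
trace_r2 = 1 - (resid ** 2).sum() / (F ** 2).sum()
```

The replication's finding that ML often dominates PC shows up precisely in designs with persistent factors and idiosyncratic dynamics, which this static toy DGP deliberately leaves out.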
By:  Alexander J. McNeil 
Abstract:  An approach to the modelling of financial return series using a class of uniformity-preserving transforms for uniform random variables is proposed. V-transforms describe the relationship between quantiles of the return distribution and quantiles of the distribution of a predictable volatility proxy variable constructed as a function of the return. V-transforms can be represented as copulas and permit the construction and estimation of models that combine arbitrary marginal distributions with linear or nonlinear time series models for the dynamics of the volatility proxy. The idea is illustrated using a transformed Gaussian ARMA process for volatility, yielding the class of VT-ARMA copula models. These can replicate many of the stylized facts of financial return series and facilitate the calculation of marginal and conditional characteristics of the model, including quantile measures of risk. Estimation of the models is carried out by adapting the exact maximum likelihood approach to the estimation of ARMA processes. 
Date:  2020–02 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2002.10135&r=all 
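A minimal sketch of the simplest symmetric member of this transform class (assumed here for illustration; the paper treats a more general family): v = |2u - 1| maps the quantile level u of a return to the quantile level v of a volatility proxy such as the absolute return, sending extreme returns (u near 0 or 1) to large v, and is uniformity-preserving.

```python
import numpy as np

rng = np.random.default_rng(7)

def v_transform(u):
    """Symmetric V-transform with fulcrum 1/2: v = |2u - 1|."""
    return np.abs(2.0 * u - 1.0)

u = rng.uniform(size=100_000)              # uniform return quantile levels
v = v_transform(u)                         # volatility-proxy quantile levels

# Uniformity-preserving: v is again (approximately) uniform on (0, 1),
# which is what lets the transform be represented as a copula.
hist, _ = np.histogram(v, bins=10, range=(0.0, 1.0))
```

Because v is again uniform, one can put an ordinary time series model (in the paper, a transformed Gaussian ARMA) on the v-scale while keeping an arbitrary marginal on the return scale.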
By:  Yoshimasa Uematsu; Takashi Yamagata 
Abstract:  In this paper, we consider statistical inference for high-dimensional approximate factor models. We posit a weak factor structure, in which the factor loading matrix can be sparse and the signal eigenvalues may diverge more slowly than the cross-sectional dimension, N. We propose a novel inferential procedure to decide whether each component of the factor loadings is zero or not, and prove that this controls the false discovery rate (FDR) below a preassigned level, while the power tends to unity. This "factor selection" procedure is primarily based on a desparsified (or debiased) version of the WF-SOFAR estimator of Uematsu and Yamagata (2020), but is also applicable to the principal component (PC) estimator. After the factor selection, the resparsified WF-SOFAR and sparsified PC estimators are proposed and their consistency is established. Finite-sample evidence supports the theoretical results. We apply our procedure to the FRED-MD macroeconomic and financial data, consisting of 128 series from June 1999 to May 2019. The results strongly suggest the existence of sparse factor loadings and exhibit a clear association of each of the extracted factors with a group of macroeconomic variables. In particular, we find a price factor, a housing factor, an output and income factor, and a money, credit and stock market factor. 
Date:  2020–03 
URL:  http://d.repec.org/n?u=RePEc:dpr:wpaper:1080&r=all 
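The FDR-controlling selection step can be illustrated with the classical Benjamini-Hochberg procedure applied to debiased loading statistics (a stylized stand-in: the paper's procedure is tailored to the weak-factor setting, and the statistics below are simulated z-scores, not actual loadings):

```python
import numpy as np
from math import erf, sqrt

def benjamini_hochberg(pvals, q=0.05):
    """Return a boolean mask of hypotheses rejected at FDR level q."""
    p = np.asarray(pvals)
    m = p.size
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m       # step-up thresholds
    below = p[order] <= thresh
    k = below.nonzero()[0].max() + 1 if below.any() else 0
    reject = np.zeros(m, dtype=bool)
    reject[order[:k]] = True                   # reject the k smallest p-values
    return reject

rng = np.random.default_rng(8)
m = 1000
signal = np.zeros(m); signal[:100] = 4.0       # 100 genuinely nonzero "loadings"
z = signal + rng.normal(size=m)                # debiased t-type statistics
p = np.array([1.0 - erf(abs(zi) / sqrt(2.0)) for zi in z])   # two-sided p-values

reject = benjamini_hochberg(p, q=0.05)
fdp = reject[100:].sum() / max(reject.sum(), 1)  # realized false discovery proportion
```

As in the paper's theory, the realized false discovery proportion stays near the preassigned level while almost all truly nonzero loadings are selected.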