on Econometric Time Series |
By: | Chudik, Alexander (Federal Reserve Bank of Dallas); Georgiadis, Georgios (European Central Bank) |
Abstract: | This paper proposes mixed-frequency distributed-lag (MFDL) estimators of impulse response functions (IRFs) in a setup where (i) the shock of interest is observed, (ii) the impact variable of interest is observed at a lower frequency (as a temporally aggregated or sequentially sampled variable), (iii) the data-generating process (DGP) is given by a VAR model at the frequency of the shock, and (iv) the full set of relevant endogenous variables entering the DGP is unknown or unobserved. Consistency and asymptotic normality of the proposed MFDL estimators are established, and their small-sample performance is documented by a set of Monte Carlo experiments. The proposed approach is then applied to estimate the daily pass-through of changes in crude oil prices observed at a daily frequency to U.S. gasoline consumer prices observed at a weekly frequency. We find that the pass-through is fast, with about 28% of the crude oil price changes passed through to retail gasoline prices within five working days, and that the speed of the pass-through has increased over time. |
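The mixed-frequency setup can be illustrated with a toy simulation (this is not the authors' MFDL estimator, and all parameter values are hypothetical): a daily observed shock drives a daily response through an assumed IRF, the response is only observed as a weekly aggregate, and a distributed-lag OLS on the week's daily shocks recovers the cumulative pass-through.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical DGP: daily shocks with an assumed daily IRF whose 5-day
# cumulative effect is 0.28 (chosen to echo the paper's headline number).
T_days = 5 * 400                                  # 400 weeks of 5 working days
shock = rng.standard_normal(T_days)
irf = np.array([0.10, 0.08, 0.05, 0.03, 0.02])    # assumed daily IRF
resp_daily = np.convolve(shock, irf)[:T_days] + 0.05 * rng.standard_normal(T_days)

# The response is only observed at the weekly frequency (temporal aggregation).
resp_weekly = resp_daily.reshape(-1, 5).sum(axis=1)

# Distributed-lag OLS: weekly response on this week's and last week's daily
# shocks. The coefficient on the day-k shock is the IRF cumulated over the
# remaining days of the week, so the Monday coefficient is the 5-day total.
shocks_by_week = shock.reshape(-1, 5)
X = np.column_stack([shocks_by_week[1:], shocks_by_week[:-1]])
beta = np.linalg.lstsq(X, resp_weekly[1:], rcond=None)[0]
```

With iid shocks the regressors are orthogonal, so the Monday coefficient converges to the cumulative 5-day IRF (0.28 here) and the Friday coefficient to the impact effect (0.10).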
Keywords: | Mixed frequencies; temporal aggregation; impulse response functions; estimation and inference; VAR models |
JEL: | C22 |
Date: | 2019–03–15 |
URL: | http://d.repec.org/n?u=RePEc:fip:feddgw:356&r=all |
By: | Guo, Gangzheng; Wang, Shaoping; Sun, Yixiao |
Abstract: | This paper considers a moderately explosive first-order autoregressive (AR(1)) process with drift, where the autoregressive root approaches unity from the right at a certain rate. We first develop a test for the null of moderate explosiveness under independent and identically distributed errors. We show that the t statistic is asymptotically standard normal regardless of whether the errors are Gaussian. This result contrasts sharply with the existing literature, in which nonstandard limiting distributions are obtained under different model assumptions. When the errors are weakly dependent, we show that the t statistic based on a heteroskedasticity and autocorrelation robust standard error follows Student's t distribution in large samples. Monte Carlo simulations show that our tests have satisfactory size and power performance in finite samples. Applying the asymptotic t test to ten major stock indexes in the pre-2008 financial exuberance period, we find that most indexes are only mildly explosive or not explosive at all, which implies that the episode of irrational exuberance was not as serious as previously thought. |
Keywords: | Social and Behavioral Sciences, Heteroskedasticity and Autocorrelation Robust Standard Error, Irrational Exuberance, Local to Unity, Moderate Explosiveness, Student's t Distribution, Unit Root. |
Date: | 2018–07–09 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt2k26h10n&r=all |
By: | Liu, Cheng; Sun, Yixiao |
Abstract: | We propose an asymptotically valid t test that uses Student's t distribution as the reference distribution in a difference-in-differences regression. For the asymptotic variance estimation, we adopt the clustering-by-time approach to accommodate cross-sectional dependence. This approach often assumes the clusters to be independent across time, but we allow them to be temporally dependent. The proposed t test is based on a special heteroscedasticity and autocorrelation robust (HAR) variance estimator. We target the type I and type II errors and develop a testing-oriented method to select the underlying smoothing parameter. By capturing the estimation uncertainty of the HAR variance estimator, the t test has more accurate size than the corresponding normal test and is just as powerful as the latter. Compared to the nonstandard test developed in the literature, the standard t test is just as accurate but much more convenient to use. Model-based and empirical-data-based Monte Carlo simulations show that the t test works quite well in finite samples. |
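The flavour of such a fixed-smoothing t test can be sketched with an orthonormal-series (cosine-basis) HAR variance estimator applied to a scalar sequence of by-time scores; with K basis functions the t statistic is referred to a Student's t distribution with K degrees of freedom. This is a generic sketch under assumed choices (cosine basis, K = 8), not the paper's exact estimator or its testing-oriented smoothing-parameter rule.

```python
import numpy as np

def series_har_t(scores, K=8):
    """t statistic for H0: E[scores] = 0, using an orthonormal-series
    (cosine) HAR long-run variance estimator with K basis functions.
    Under fixed-smoothing asymptotics, compare to Student's t with K df."""
    T = len(scores)
    tt = (np.arange(1, T + 1) - 0.5) / T           # mid-point grid on (0, 1)
    demeaned = scores - scores.mean()
    lam = np.array([np.sqrt(2.0 / T) * (np.cos(np.pi * j * tt) @ demeaned)
                    for j in range(1, K + 1)])
    lrv = (lam ** 2).mean()                        # long-run variance estimate
    return scores.mean() / np.sqrt(lrv / T), lrv

# Sanity check on white noise, where the true long-run variance is 1.
rng = np.random.default_rng(2)
lrvs = [series_har_t(rng.standard_normal(500))[1] for _ in range(200)]
```

Because the estimated long-run variance is an average of only K squared projections, its own estimation uncertainty is non-negligible, which is exactly why the t (rather than normal) reference distribution gives more accurate size.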
Keywords: | Social and Behavioral Sciences, Basis Functions, Difference-in-Differences, Fixed-smoothing Asymptotics, Heteroscedasticity and Autocorrelation Robust, Student's t distribution, t test |
Date: | 2019–03–12 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt0ck2109g&r=all |
By: | Schlicht, Ekkehart |
Abstract: | This paper describes a moments estimator for a standard state-space model with coefficients generated by a random walk. This estimator does not require that disturbances are normally distributed, but if they are, the proposed estimator is asymptotically equivalent to the maximum likelihood estimator. |
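The maximum-likelihood benchmark mentioned in the abstract is typically computed with the Kalman filter. A minimal scalar sketch (the local-level special case of a random-walk-coefficient model, with an ad-hoc initialization of my own choosing) illustrates the recursion; it is not Schlicht's moments estimator.

```python
import numpy as np

def kalman_local_level(y, q, r):
    """Kalman filter for the local-level model y_t = a_t + u_t,
    a_t = a_{t-1} + v_t, with var(u_t) = r and var(v_t) = q.
    The initialization below is an ad-hoc choice for illustration."""
    a, p = y[0], r
    out = np.empty(len(y))
    out[0] = a
    for t in range(1, len(y)):
        p = p + q                       # predict: state variance grows by q
        k = p / (p + r)                 # Kalman gain
        a = a + k * (y[t] - a)          # update with the new observation
        p = (1.0 - k) * p
        out[t] = a
    return out

# Sanity check: a constant level buried in noise should be recovered.
rng = np.random.default_rng(7)
y = 5.0 + 0.5 * rng.standard_normal(500)
est = kalman_local_level(y, q=1e-4, r=0.25)
```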
Keywords: | time-series analysis,linear model,state-space estimation,time-varying coefficients,moments estimation |
JEL: | C2 C22 C32 C51 C52 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:zbw:ifwedp:201922&r=all |
By: | Mika Meitz; Pentti Saikkonen |
Abstract: | In this paper we discuss how the notion of subgeometric ergodicity in Markov chain theory can be exploited to study the stability of nonlinear time series models. Subgeometric ergodicity means that the transition probability measures converge to the stationary measure at a rate slower than geometric. Specifically, we consider higher-order nonlinear autoregressions that may exhibit rather arbitrary behavior for moderate values of the observed series and that behave in a near unit root manner for large values of the observed series. Generalizing existing first-order results, we show that these autoregressions are, under appropriate conditions, subgeometrically ergodic. As useful implications we also obtain stationarity and $\beta$-mixing with subgeometrically decaying mixing coefficients. |
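A toy instance of the kind of process described (my own construction, not one of the paper's examples): a first-order autoregression whose coefficient approaches unity for large values of the series, so that far from the origin the drift back toward zero is roughly constant rather than proportional — the classic route to subgeometric (polynomial) rather than geometric ergodicity. A long simulated path remains stable:

```python
import numpy as np

rng = np.random.default_rng(8)
T = 50_000
y = np.empty(T)
y[0] = 0.0
for t in range(1, T):
    # Coefficient tends to 1 as |y| grows: near unit root for large values,
    # with an inward drift of roughly constant size (about -1) out there.
    phi = 1.0 - 1.0 / (1.0 + abs(y[t - 1]))
    y[t] = phi * y[t - 1] + rng.standard_normal()
```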
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1904.07089&r=all |
By: | Anshul Verma; Pierpaolo Vivo; Tiziana Di Matteo |
Abstract: | We propose a new data-driven method to select the optimal number of relevant components in Principal Component Analysis (PCA). This new method applies to correlation matrices whose time autocorrelation function decays more slowly than an exponential, giving rise to long memory effects. In comparison with other methods in the literature, our procedure does not rely on subjective evaluations and is computationally inexpensive. The underlying basic idea is to use a suitable factor model to analyse the residual memory after sequentially removing more and more components, and stopping the process when the maximum amount of memory has been accounted for by the retained components. We validate our methodology on both synthetic and real financial data, and find in all cases a clear and computationally superior answer entirely compatible with available heuristic criteria, such as cumulative variance and cross-validation. |
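The underlying idea — strip components until no memory is left in the residuals — can be sketched on synthetic data. This is a simplification of the paper's procedure: a highly persistent AR(1) stands in for genuine long memory, lag-1 autocorrelation stands in for the residual-memory diagnostic, and the 0.05 cutoff is my own illustrative threshold.

```python
import numpy as np

rng = np.random.default_rng(3)
T, N, k = 1000, 20, 3

# Toy data: k persistent common factors plus white idiosyncratic noise.
f = np.zeros((T, k))
for t in range(1, T):
    f[t] = 0.95 * f[t - 1] + rng.standard_normal(k)
X = f @ rng.standard_normal((k, N)) + rng.standard_normal((T, N))
X = X - X.mean(0)

# Sequentially strip principal components and track the lag-1 autocorrelation
# left in the residuals; stop once essentially no linear memory remains.
U, s, Vt = np.linalg.svd(X, full_matrices=False)

def resid_acf1(m):
    R = X - (U[:, :m] * s[:m]) @ Vt[:m]            # residual after m components
    return (R[1:] * R[:-1]).sum() / (R * R).sum()

acfs = [resid_acf1(m) for m in range(N)]
chosen = next(m for m, a in enumerate(acfs) if abs(a) < 0.05)
```

On this DGP the residual autocorrelation stays high until the three factor components are removed and then collapses toward zero, so the rule recovers the true number of components.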
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1904.05931&r=all |
By: | Ye, Xiaoqing; Sun, Yixiao |
Abstract: | In this article, we consider time series OLS and IV regressions and introduce a new pair of commands, har and hart, which implement a more accurate class of heteroscedasticity and autocorrelation robust (HAR) F and t tests. These tests represent part of the recent progress on HAR inference. The F and t tests are based on the convenient F and t approximations and are more accurate than the conventional chi-squared and normal approximations. The underlying smoothing parameters are selected to target the type I and type II errors, the two fundamental objects in every hypothesis testing problem. The estimation command har and the post-estimation test command hart allow for both kernel HAR variance estimators and orthonormal series HAR variance estimators. In addition, we introduce another pair of new commands, gmmhar and gmmhart, which implement the recently developed F and t tests in a two-step GMM framework. For this command we opt for the orthonormal series HAR variance estimator based on the Fourier bases, as it allows us to develop convenient F and t approximations as in the first-step GMM framework. Finally, we present several examples to demonstrate the use of these commands. |
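Of the two families of variance estimators the commands support, the kernel one is the easiest to sketch: the Bartlett-kernel (Newey–West) long-run variance of a scalar series takes only a few lines (illustrative Python, not the Stata implementation; the fixed bandwidth below is an arbitrary choice, not the commands' testing-oriented selection rule).

```python
import numpy as np

def newey_west_lrv(u, bandwidth):
    """Bartlett-kernel (Newey-West) long-run variance of a scalar series."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()
    T = len(u)
    lrv = u @ u / T
    for l in range(1, bandwidth + 1):
        w = 1.0 - l / (bandwidth + 1)              # Bartlett weights
        lrv += 2.0 * w * (u[l:] @ u[:-l]) / T
    return lrv

# Sanity check: for an AR(1) with coefficient 0.5 and unit-variance errors,
# the true long-run variance is 1 / (1 - 0.5)^2 = 4.
rng = np.random.default_rng(9)
e = rng.standard_normal(20_000)
x = np.empty_like(e)
x[0] = e[0]
for t in range(1, len(e)):
    x[t] = 0.5 * x[t - 1] + e[t]
est = newey_west_lrv(x, bandwidth=50)
```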
Keywords: | Social and Behavioral Sciences |
Date: | 2018–07–09 |
URL: | http://d.repec.org/n?u=RePEc:cdl:ucsdec:qt0bb8d0s9&r=all |
By: | Hyeongwoo Kim; Kyunghwan Ko |
Abstract: | We present a factor augmented forecasting model for assessing the financial vulnerability in Korea. Dynamic factor models often extract latent common factors from a large panel of time series data via the method of principal components (PC). Instead, we employ the partial least squares (PLS) method that estimates target specific common factors, utilizing covariances between predictors and the target variable. Applying PLS to 198 monthly frequency macroeconomic time series variables and the Bank of Korea's Financial Stress Index (KFSTI), our PLS factor augmented forecasting models consistently outperformed the random walk benchmark model in out-of-sample prediction exercises in all forecast horizons we considered. Our models also outperformed the autoregressive benchmark model in short-term forecast horizons. We expect our models to provide useful early warning signs of the emergence of systemic risks in Korea's financial markets. |
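The key contrast — PLS weights predictors by their covariance with the target, PCA by the predictors' own variance — can be shown with the first PLS component on a toy one-factor panel (my own DGP, not the paper's Korean data).

```python
import numpy as np

rng = np.random.default_rng(4)
T, N = 300, 50

# Toy DGP: one latent factor drives the target and, noisily, all predictors.
f = rng.standard_normal(T)
loadings = rng.uniform(0.5, 1.5, N)               # hypothetical loadings
X = np.outer(f, loadings) + rng.standard_normal((T, N))
y = f + 0.3 * rng.standard_normal(T)              # target driven by the factor

# First PLS component: weight each standardized predictor by its covariance
# with the target -- unlike PCA, which ignores the target entirely.
Xs = (X - X.mean(0)) / X.std(0)
w = Xs.T @ (y - y.mean())
w /= np.linalg.norm(w)
pls_factor = Xs @ w

corr = np.corrcoef(pls_factor, f)[0, 1]
```

Averaging fifty noisy proxies with target-informed weights recovers the latent factor almost exactly, which is why the extracted component is "target specific."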
Keywords: | Partial Least Squares; Principal Component Analysis; Financial Stress Index; Out-of-Sample Forecast; RRMSPE |
JEL: | C38 C53 E44 E47 G01 G17 |
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:abn:wpaper:auwp2019-03&r=all |
By: | Th\'eophile Griveau-Billion; Ben Calderhead |
Abstract: | We propose a heterogeneous simultaneous graphical dynamic linear model (H-SGDLM), which extends the standard SGDLM framework to incorporate a heterogeneous autoregressive realised volatility (HAR-RV) model. This novel approach creates a GPU-scalable multivariate volatility estimator, which decomposes multiple time series into economically meaningful variables to explain the endogenous and exogenous factors driving the underlying variability. This unique decomposition goes beyond the classic one-step-ahead prediction; indeed, we investigate inferences up to one month into the future using stocks, FX futures and ETF futures, demonstrating its superior performance according to accuracy of large moves, longer-term prediction and consistency over time. |
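The H-SGDLM itself is involved, but its HAR-RV ingredient is a simple regression of tomorrow's realised volatility on daily, weekly and monthly averages of past realised volatility. A self-contained sketch on a toy volatility series (an assumed log-AR(1) DGP, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(5)
T = 2000

# Toy daily realised-volatility series: persistent AR(1) in logs (assumed DGP).
logv = np.zeros(T)
for t in range(1, T):
    logv[t] = 0.98 * logv[t - 1] + 0.2 * rng.standard_normal()
rv = np.exp(logv)

# HAR-RV regression: today's RV on yesterday's RV and on weekly (5-day)
# and monthly (22-day) averages of past RV.
idx = np.arange(22, T)
y = rv[idx]
Xd = rv[idx - 1]
Xw = np.array([rv[t - 5:t].mean() for t in idx])
Xm = np.array([rv[t - 22:t].mean() for t in idx])
X = np.column_stack([np.ones(len(idx)), Xd, Xw, Xm])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
r2 = 1 - ((y - X @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
```

The three averaging horizons let one cheap linear regression mimic the long-memory behaviour of volatility, which is what makes HAR-RV attractive as a building block.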
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1904.08153&r=all |
By: | Yoshimasa Uematsu; Takashi Yamagata |
Abstract: | In this paper, we propose a novel consistent estimation method for the approximate factor model of Chamberlain and Rothschild (1983), with large cross-sectional and time-series dimensions (N and T, respectively). Their model assumes that the r (≪N) largest eigenvalues of the data covariance matrix grow as N rises, without specifying each diverging rate. This is weaker than the typical assumption in recent factor models, in which all r largest eigenvalues diverge proportionally to N; such models are frequently referred to as weak factor models. We extend the sparse orthogonal factor regression (SOFAR) proposed by Uematsu et al. (2019) to consider consistent estimation of the weak factor structure, where the k-th largest eigenvalue grows proportionally to N^{α_{k}} with some unknown exponents 0 |
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:dpr:wpaper:1053&r=all |
By: | Alain Hecq; Li Sun |
Abstract: | We propose a model selection criterion to distinguish purely causal from purely noncausal models in the framework of quantile autoregressions (QAR). We also present asymptotics for the i.i.d. case with regularly varying distributed innovations in QAR. This new modelling perspective is appealing for investigating the presence of bubbles in economic and financial time series, and is an alternative to approximate maximum likelihood methods. We illustrate our analysis using hyperinflation episodes in Latin American countries. |
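The QAR building block can be sketched on a toy causal AR(1) with heavy-tailed (regularly varying) Student-t innovations; a crude grid search on the check loss stands in for a proper quantile-regression solver. This illustrates only quantile autoregression, not the paper's causal/noncausal selection criterion.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 2000
# Causal AR(1) with heavy-tailed Student-t(3) innovations (regularly varying).
e = rng.standard_t(df=3, size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.6 * y[t - 1] + e[t]

def qar1_slope(y, tau=0.5):
    """QAR(1) slope at quantile tau, by grid search on the check loss.
    No intercept: the innovations here have median zero by symmetry."""
    grid = np.linspace(-0.99, 0.99, 397)          # step 0.005
    best_phi, best_loss = None, np.inf
    for phi in grid:
        u = y[1:] - phi * y[:-1]
        loss = np.sum(u * (tau - (u < 0)))        # check (pinball) loss
        if loss < best_loss:
            best_phi, best_loss = phi, loss
    return best_phi

phi_hat = qar1_slope(y)
```

Median (tau = 0.5) autoregression is robust to the heavy tails, so the slope estimate lands close to the true 0.6 even though the innovation variance is barely finite.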
Date: | 2019–04 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1904.05952&r=all |
By: | Quast, Josefine; Wolters, Maik H. |
Abstract: | The authors contribute to the debate regarding the reliability of output gap estimates. As an alternative to the Hodrick-Prescott (HP) filter, they propose a simple modification of the filter proposed by Hamilton in 2018 that shares its favorable real-time properties, but leads to a more even coverage of typical business cycle frequencies. Based on output growth and inflation forecasts and a comparison to revised output gap estimates from policy institutions, they find that real-time output gaps based on the modified Hamilton filter are economically much more meaningful measures of the business cycle than those based on other simple statistical trend-cycle decomposition techniques such as the HP or the Bandpass filter. |
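The baseline Hamilton (2018) regression filter is easy to state: the cyclical component is the residual from regressing y at horizon h on a constant and the p most recent values. A minimal sketch follows (the authors' modification, which evens out the coverage of business-cycle frequencies, is not reproduced here; h = 8 and p = 4 are the standard quarterly choices).

```python
import numpy as np

def hamilton_cycle(y, h=8, p=4):
    """Hamilton (2018) regression filter: the cycle is the residual from
    regressing y_{t+h} on a constant and y_t, ..., y_{t-p+1}."""
    T = len(y)
    X = np.column_stack([np.ones(T - h - p + 1)] +
                        [y[p - 1 - j:T - h - j] for j in range(p)])
    target = y[p - 1 + h:]
    beta = np.linalg.lstsq(X, target, rcond=None)[0]
    return target - X @ beta

# On a random walk the filter strips the stochastic trend: the cycle behaves
# like an h-step innovation sum, far less variable than the level itself.
rng = np.random.default_rng(11)
y = np.cumsum(rng.standard_normal(500))
cycle = hamilton_cycle(y)
```

Because the cycle is defined through an h-step-ahead regression using only past data, the filter has none of the end-point revisions that plague two-sided filters such as HP — the real-time property the abstract emphasizes.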
Keywords: | output gap,potential output,trend-cycle decomposition,Hamilton filter,real-time data,inflation forecasting |
JEL: | C18 E32 E37 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:zbw:imfswp:133&r=all |
By: | Roberto Casarin (University Ca' Foscari of Venice); Stefano Grassi (University of Rome `Tor Vergata'); Francesco Ravazzolo (BI Norwegian Business School); Herman K. van Dijk (Erasmus University Rotterdam) |
Abstract: | A flexible forecast density combination approach is introduced that can deal with large data sets. It extends the mixture of experts approach by allowing for model set incompleteness and dynamic learning of combination weights. A dimension reduction step is introduced using a sequential clustering mechanism that allocates the large set of forecast densities into a small number of subsets, and the combination weights of the large set of densities are modelled as a dynamic factor model with a number of factors equal to the number of subsets. The forecast density combination is represented as a large finite mixture in nonlinear state space form. An efficient simulation-based Bayesian inferential procedure is proposed using parallel sequential clustering and filtering, implemented on graphics processing units. The approach is applied to track the Standard & Poor's 500 index combining more than 7000 forecast densities based on 1856 US individual stocks that are clustered in a relatively small subset. Substantial forecast and economic gains are obtained, in particular, in the tails using Value-at-Risk. Using a large macroeconomic data set of 142 series, similar forecast gains, including probabilities of recession, are obtained from multivariate forecast density combinations of US real GDP, inflation, the Treasury Bill yield and employment. Evidence obtained on the dynamic patterns in the financial as well as macroeconomic clusters provides valuable signals useful for improved modelling and more effective economic and financial policies. |
Keywords: | Forecast combinations, Particle filters, Bayesian inference, State Space Models, Sequential Monte Carlo |
JEL: | C11 C14 C15 |
Date: | 2019–04–01 |
URL: | http://d.repec.org/n?u=RePEc:tin:wpaper:20190025&r=all |