New Economics Papers on Econometric Time Series
By: | S. Yaser Samadi; Wiranthe B. Herath |
Abstract: | Standard vector autoregressive (VAR) models suffer from overparameterization, a serious issue for high-dimensional time series data because it restricts the number of variables and lags that can be incorporated into the model. Several statistical methods, such as the reduced-rank model for multivariate (multiple) time series (Velu et al., 1986; Reinsel and Velu, 1998; Reinsel et al., 2022) and the envelope VAR model (Wang and Ding, 2018), provide ways to reduce the dimension of the VAR parameter space. However, these methods can be inefficient in extracting relevant information from complex data: they either fail to distinguish relevant from irrelevant information or do not adequately address the rank-deficiency problem. We incorporate the idea of envelope models into the reduced-rank VAR model to tackle these challenges simultaneously, and propose a new parsimonious version of the classical VAR model called the reduced-rank envelope VAR (REVAR) model. The proposed REVAR model combines the strengths of the reduced-rank VAR and envelope VAR models and leads to significant gains in efficiency and accuracy. The asymptotic properties of the proposed estimators are established under different error assumptions. Simulation studies and a real data analysis are conducted to evaluate and illustrate the proposed method. |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2309.12902&r=ets |
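To make the reduced-rank idea in the entry above concrete, here is a minimal Python sketch: an unrestricted VAR(1) is estimated by OLS and its coefficient matrix is then truncated to a prespecified rank via an SVD. The simulated data, the rank choice, and the plain (unweighted) SVD truncation are illustrative assumptions; this is not the paper's REVAR estimator, which additionally imposes an envelope structure.

```python
# Illustrative sketch: rank-reduced VAR(1) coefficient estimate.
# Simulated data and the plain SVD truncation are placeholders; this is
# not the REVAR estimator, only the reduced-rank idea.
import numpy as np

rng = np.random.default_rng(0)
k, T, r = 6, 400, 2                          # series, sample size, assumed rank

# Simulate a stable VAR(1) whose coefficient matrix has rank r by construction.
M = rng.normal(size=(k, r)) @ rng.normal(size=(r, k))
A_true = 0.6 * M / np.linalg.norm(M, 2)      # spectral norm 0.6 ensures stability
y = np.zeros((T, k))
for t in range(1, T):
    y[t] = y[t - 1] @ A_true.T + rng.normal(scale=0.5, size=k)

X, Y = y[:-1], y[1:]                         # lagged and current observations
A_ols = np.linalg.lstsq(X, Y, rcond=None)[0].T   # unrestricted OLS estimate

# Reduced-rank estimate: keep the r leading singular directions of A_ols.
U, s, Vt = np.linalg.svd(A_ols)
A_rr = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print("OLS estimation error   :", np.linalg.norm(A_ols - A_true))
print("rank-r estimation error:", np.linalg.norm(A_rr - A_true))
```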
By: | Christis Katsouris |
Abstract: | In this article, we study the statistical and asymptotic properties of break-point estimators in nonstationary autoregressive and predictive regression models for testing the presence of a single structural break at an unknown location in the full sample. Moreover, we investigate how the persistence properties of the covariates and the location of the break point affect the limiting distribution of the proposed break-point estimators. |
Date: | 2023–08 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2308.13915&r=ets |
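As a point of reference for the entry above, the following sketch shows the standard least-squares break-date estimator: split the sample at every admissible date, fit the regression on each subsample, and pick the date that minimizes the total sum of squared residuals. The simulated predictive regression, the single break in the slope, and the 15% trimming are illustrative assumptions and do not reproduce the paper's nonstationary asymptotics.

```python
# Illustrative sketch: least-squares break-date estimator in a predictive
# regression y_t = a + b_t * x_{t-1} + u_t with one break in b_t.
# The simulated design and 15% trimming are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(1)
T, true_break = 300, 180
x = np.cumsum(rng.normal(size=T))            # persistent predictor
b = np.where(np.arange(T) < true_break, 0.0, 0.5)
y = 0.1 + b * np.r_[0.0, x[:-1]] + rng.normal(size=T)

X_full = np.column_stack([np.ones(T), np.r_[0.0, x[:-1]]])

def ssr(Y, X):
    beta = np.linalg.lstsq(X, Y, rcond=None)[0]
    resid = Y - X @ beta
    return resid @ resid

trim = int(0.15 * T)
candidates = list(range(trim, T - trim))
ssr_path = [ssr(y[:k], X_full[:k]) + ssr(y[k:], X_full[k:]) for k in candidates]
k_hat = candidates[int(np.argmin(ssr_path))]
print("estimated break date:", k_hat, "(true:", true_break, ")")
```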
By: | Mikihito Nishi |
Abstract: | This study considers tests for coefficient randomness in predictive regressions. Our focus is on how tests for coefficient randomness are influenced by the persistence of the random coefficient. We find that when the random coefficient is stationary, or I(0), Nyblom's (1989) LM test loses its optimality (in terms of power), which is established against the alternative of an integrated, or I(1), random coefficient. We demonstrate this by constructing tests that are more powerful than the LM test when the random coefficient is stationary, although these tests are dominated in terms of power by the LM test when the random coefficient is integrated. This implies that the best test for coefficient randomness differs from context to context, and practitioners should take into account the persistence of the potentially random coefficient and choose among several tests accordingly. In particular, we show through theoretical and numerical investigations that the product of the LM test and a Wald-type test proposed in this paper is preferable when there is no prior information on the persistence of the potentially random coefficient. This point is illustrated by an empirical application using U.S. stock return data. |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2309.04926&r=ets |
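The Nyblom (1989) LM statistic discussed in the abstract above can be computed from cumulated OLS scores; the sketch below shows a minimal Nyblom/Hansen-type version for a single-equation regression. The simulated designs (constant versus random-walk coefficient) are illustrative assumptions, and neither the asymptotic critical values nor the Wald-type test proposed in the paper is reproduced.

```python
# Minimal sketch of a Nyblom/Hansen-type LM statistic for coefficient
# constancy in y_t = x_t' beta + u_t, based on cumulated OLS scores.
# Critical values from the Cramer-von Mises-type limit are not included.
import numpy as np

def nyblom_lm(y, X):
    T = len(y)
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    u = y - X @ beta
    scores = X * u[:, None]                  # s_t = x_t * u_t
    S = np.cumsum(scores, axis=0)            # partial sums of scores
    V = scores.T @ scores                    # outer-product variance estimate
    Vinv = np.linalg.inv(V)
    return np.mean([S[t] @ Vinv @ S[t] for t in range(T)])

rng = np.random.default_rng(2)
T = 500
x = rng.normal(size=T)
X = np.column_stack([np.ones(T), x])
y_stable = 0.2 + 0.5 * x + rng.normal(size=T)
beta_rw = 0.5 + np.cumsum(rng.normal(scale=0.05, size=T))   # I(1) coefficient
y_random = 0.2 + beta_rw * x + rng.normal(size=T)

print("LM, constant coefficient   :", nyblom_lm(y_stable, X))
print("LM, random-walk coefficient:", nyblom_lm(y_random, X))
```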
By: | Davide Debortoli; Mario Forni; Luca Gambetti; Luca Sala |
Abstract: | We measure the inflation-unemployment tradeoff associated with monetary easing and tightening, during booms and recessions, using a novel nonlinear Proxy-SVAR approach. We find evidence of significant nonlinearities for the U.S. economy (1973:M1 - 2019:M6): stimulating economic activity during recessions is associated with minimal costs in terms of inflation, and reducing inflation during booms delivers small costs in terms of unemployment. Overall, these results provide support for countercyclical monetary policies, in contrast with what is predicted by a flat Phillips curve or by previous studies on the nonlinear effects of monetary policy. Our results can be rationalized by a simple model with downward nominal wage rigidity, which is also used to assess the validity of our empirical approach. |
Keywords: | monetary policy, inflation-unemployment tradeoff, structural VAR models, proxy-SVAR. |
JEL: | C32 E32 |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:bge:wpaper:1404&r=ets |
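The entry above builds on proxy-SVAR identification. As background only, the sketch below illustrates the standard linear external-instrument step: the covariance between reduced-form residuals and a proxy recovers the impact column of the instrumented shock up to scale. The simulated shocks and proxy strength are assumptions, the VAR lag structure is omitted, and the paper's nonlinear extension is not implemented.

```python
# Illustrative sketch of the linear proxy-SVAR (external instrument) step:
# recover the impact column of one structural shock, up to scale, from
# reduced-form residuals u_t and a proxy m_t correlated with that shock.
# Data are simulated; the VAR lag structure and the nonlinear extension
# described in the entry above are omitted.
import numpy as np

rng = np.random.default_rng(3)
T, k = 500, 3
B0 = np.array([[1.0, 0.0, 0.0],
               [0.4, 1.0, 0.0],
               [0.3, 0.5, 1.0]])             # true impact matrix (columns = shocks)
eps = rng.normal(size=(T, k))                # structural shocks
u = eps @ B0.T                               # reduced-form residuals
m = 0.8 * eps[:, 0] + rng.normal(scale=0.5, size=T)   # proxy for shock 1

# E[u_t m_t] is proportional to the first column of B0.
cov_um = u.T @ m / T
impact_col = cov_um / cov_um[0]              # normalize the own response to 1
print("estimated relative impact column:", np.round(impact_col, 2))
print("true relative impact column     :", np.round(B0[:, 0] / B0[0, 0], 2))
```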
By: | Mihnea Constantinescu (National Bank of Ukraine and University of Amsterdam) |
Abstract: | Forecasting economic activity during an invasion is a nontrivial exercise. The lack of timely statistical data and the expected nonlinear effect of military action challenge the use of established nowcasting and short-term forecasting methodologies. In a recent study (Constantinescu, 2023b), I explore the use of Partial Least Squares (PLS), augmented with an additional variable selection step, to nowcast quarterly Ukrainian GDP using Google search data. Model outputs are benchmarked against both static and dynamic factor models. Preliminary results outline the usefulness of PLS in capturing the effects of large shocks in a setting rich in data but poor in statistics. |
Keywords: | Nowcasting; quarterly GDP; Google Trends; Machine Learning; Partial Least Squares; Sparsity; Markov Blanket |
JEL: | C38 C53 E32 E37 |
Date: | 2023–09–14 |
URL: | http://d.repec.org/n?u=RePEc:gii:giihei:heidwp15-2023&r=ets |
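A minimal sketch of the nowcasting setup described above, assuming simulated data: a simple correlation screen stands in for the additional variable selection step (the paper's Markov-blanket-style selection is not reproduced), followed by scikit-learn's PLSRegression with a small number of latent components.

```python
# Hedged sketch: nowcasting a quarterly target with Partial Least Squares on
# many high-frequency indicators (e.g. Google Trends series), preceded by a
# simple correlation screen. The screen is a placeholder, not the paper's
# Markov-blanket-style selection; all data are simulated.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_quarters, n_series = 60, 200
X = rng.normal(size=(n_quarters, n_series))          # candidate indicators
gdp = X[:, :5] @ rng.normal(size=5) + rng.normal(scale=0.5, size=n_quarters)

# Step 1 (placeholder selection): keep the indicators most correlated with
# the target over the training window.
train = slice(0, 50)
corr = np.abs([np.corrcoef(X[train, j], gdp[train])[0, 1] for j in range(n_series)])
keep = np.argsort(corr)[-20:]

# Step 2: PLS with a small number of latent components.
pls = PLSRegression(n_components=2)
pls.fit(X[train][:, keep], gdp[train])
nowcast = pls.predict(X[50:, keep]).ravel()
print("nowcast RMSE:", np.sqrt(np.mean((nowcast - gdp[50:]) ** 2)))
```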
By: | Chaudhuri, Saraswata (Department of Economics, McGill University & Cireq, Montreal); Renault, Eric (Department of Economics, University of Warwick) |
Abstract: | Several modern textbooks report that, thanks to the availability of heteroskedasticity-robust standard errors, one observes the near-death of Weighted Least Squares (WLS) in cross-sectional applied work. We argue in this paper that it is actually possible to estimate regression parameters at least as precisely as Ordinary Least Squares (OLS) and WLS, even when using a misspecified parametric model for conditional heteroskedasticity. Our analysis is valid for a general regression framework (including Instrumental Variables and Nonlinear Regression) as long as the regression is defined by a conditional expectation condition. The key is to acknowledge, as first pointed out by Cragg (1992), that when the user-specific heteroskedasticity model is misspecified, WLS has to be modified depending on the choice of some univariate target for estimation. Moreover, targeted WLS can be improved by properly combining the moment equations for OLS and WLS. Efficient GMM must be regularized to take into account the possible multicollinearity of the estimating equations when the error terms are actually nearly homoskedastic. |
Keywords: | asymptotic optimality; misspecification; nuisance parameters; weighted least squares |
JEL: | C12 C13 C21 |
Date: | 2023 |
URL: | http://d.repec.org/n?u=RePEc:wrk:warwec:1473&r=ets |
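The combination of OLS and WLS moment conditions described above can be illustrated with linear two-step GMM. In the sketch below, the working variance model, the ridge constant added to the second-step weighting matrix (to guard against near-multicollinear moments under near-homoskedasticity), and the simulated design are all illustrative assumptions; this is not the authors' full targeted and regularized procedure.

```python
# Minimal sketch: combine OLS and (possibly misspecified) WLS moment
# conditions for a linear regression via two-step linear GMM, with a small
# ridge term added to the second-step weighting matrix. The working variance
# model w_i and the ridge constant are illustrative choices.
import numpy as np

rng = np.random.default_rng(5)
n = 1000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
sigma = 0.5 + 0.5 * np.abs(x)                    # true heteroskedasticity
y = X @ np.array([1.0, 2.0]) + sigma * rng.normal(size=n)

w = np.exp(0.5 * x)                              # misspecified working variances
Z = np.hstack([X, X / w[:, None]])               # stacked OLS and WLS "instruments"

def linear_gmm(y, X, Z, W):
    A = X.T @ Z @ W @ Z.T @ X
    b = X.T @ Z @ W @ Z.T @ y
    return np.linalg.solve(A, b)

# Step 1: identity weighting.
beta1 = linear_gmm(y, X, Z, np.eye(Z.shape[1]))
# Step 2: efficient weighting from first-step residuals, ridge-regularized.
g = Z * (y - X @ beta1)[:, None]
S = g.T @ g / n
W2 = np.linalg.inv(S + 1e-6 * np.eye(S.shape[0]))
beta2 = linear_gmm(y, X, Z, W2)
print("two-step GMM estimate:", np.round(beta2, 3))
```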
By: | Saman Banafti (Amazon); Tae-Hwy Lee (Department of Economics, University of California Riverside) |
Abstract: | The Granular Instrumental Variables (GIV) methodology exploits panels with factor error structures to construct instruments for estimating structural time series models with endogeneity even after controlling for latent factors. We extend the GIV methodology in several dimensions. First, we extend the identification procedure to a large-N, large-T framework, which depends on the asymptotic Herfindahl index of the size distribution of the N cross-sectional units. Second, we treat both the factors and the loadings as unknown and show that the sampling error in the estimated instrument and factors is negligible when considering the limiting distribution of the structural parameters. Third, we show that the sampling error in the high-dimensional precision matrix is negligible in our estimation algorithm. Fourth, we overidentify the structural parameters with additional constructed instruments, which leads to efficiency gains. Monte Carlo evidence is presented to support our asymptotic theory, and an application to the global crude oil market leads to new results. |
Keywords: | Interactive effects, Factor error structure, Simultaneity, Power-law tails, Asymptotic Herfindahl index, Global crude oil market, Supply and demand elasticities, Precision matrix. |
JEL: | C26 C36 C38 C46 C55 |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:ucr:wpaper:202308&r=ets |
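The core GIV construction in the entry above can be sketched as follows: after removing an estimated common factor from unit-level shocks, the size-weighted minus equal-weighted average of the idiosyncratic residuals serves as an instrument for an endogenous aggregate. The single PCA factor, the Pareto size draw, and the simple just-identified IV step are illustrative assumptions, not the paper's large-N, large-T procedure.

```python
# Hedged sketch of the granular IV (GIV) idea: with a fat-tailed size
# distribution, the size-weighted minus equal-weighted average of units'
# idiosyncratic shocks instruments an endogenous aggregate. One PCA factor,
# the simulated design, and the simple IV step are illustrative.
import numpy as np

rng = np.random.default_rng(6)
N, T, phi_true = 100, 400, -0.7

S = rng.pareto(1.5, size=N) + 1.0                  # power-law unit sizes
S = S / S.sum()
lam = rng.normal(1.0, 0.2, size=N)                 # factor loadings
f = rng.normal(size=T)                             # common factor
eps = rng.normal(size=(T, N))                      # idiosyncratic shocks
u = f[:, None] * lam[None, :] + eps                # unit-level shocks

d = rng.normal(size=T)                             # aggregate demand shock
p = u @ S + 0.5 * d                                # endogenous aggregate (price)
y = phi_true * p + d                               # structural equation of interest

# Remove one estimated factor by PCA, then build the granular instrument.
u_c = u - u.mean(axis=0)
_, _, Vt = np.linalg.svd(u_c, full_matrices=False)
fac = u_c @ Vt[0]                                  # first principal component
eps_hat = u_c - np.outer(fac, Vt[0])               # residual idiosyncratic part
z = eps_hat @ S - eps_hat.mean(axis=1)             # size- minus equal-weighted

phi_ols = (p @ y) / (p @ p)
phi_iv = (z @ y) / (z @ p)
print("OLS:", round(phi_ols, 3), " GIV-IV:", round(phi_iv, 3), " true:", phi_true)
```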
By: | Martin Lettau |
Abstract: | This paper proposes a new approach to the “factor zoo” conundrum. Instead of applying dimension-reduction methods to a large set of portfolios obtained from sorts on characteristics, I construct factors that summarize the information in characteristics across assets and then sort assets into portfolios according to these “characteristic factors”. I estimate the model on a data set of mutual fund characteristics. Since the data set is 3-dimensional (characteristics of funds over time), the characteristic factors are based on a tensor factor model (TFM) that generalizes 2-dimensional PCA. I find that a parsimonious TFM captures over 90% of the variation in the data set. Pricing factors derived from the TFM have high Sharpe ratios and capture the cross-section of fund returns better than standard benchmark models. |
JEL: | C38 G0 G12 |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:nbr:nberwo:31719&r=ets |
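For the tensor factor model mentioned above, a generic stand-in is the higher-order SVD (Tucker decomposition) of the 3-way fund-by-characteristic-by-time array, computed here with plain numpy mode unfoldings. The ranks and simulated data are assumptions, and the paper's specific TFM estimator and subsequent portfolio sorts are not reproduced.

```python
# Minimal higher-order SVD (HOSVD) sketch for a 3-way data set
# (funds x characteristics x time), as a generic stand-in for a tensor
# factor model. Ranks and simulated data are illustrative; the portfolio
# sorting step described in the abstract is not reproduced here.
import numpy as np

rng = np.random.default_rng(7)
n_funds, n_chars, n_time = 200, 15, 120
ranks = (3, 2, 2)

# Simulate a low-rank tensor plus noise.
core = rng.normal(size=ranks)
U = [rng.normal(size=(dim, r)) for dim, r in zip((n_funds, n_chars, n_time), ranks)]
signal = np.einsum('abc,ia,jb,kc->ijk', core, *U)
X = signal + 0.5 * rng.normal(size=signal.shape)

def mode_unfold(T, mode):
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

# HOSVD: leading left singular vectors of each mode unfolding, then the core.
factors = [np.linalg.svd(mode_unfold(X, m), full_matrices=False)[0][:, :r]
           for m, r in enumerate(ranks)]
G = np.einsum('ijk,ia,jb,kc->abc', X, *factors)
X_hat = np.einsum('abc,ia,jb,kc->ijk', G, *factors)

explained = 1 - np.sum((X - X_hat) ** 2) / np.sum(X ** 2)
print(f"share of variation captured by the rank-{ranks} Tucker fit: {explained:.2f}")
```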
By: | Philippe Andrade; Filippo Ferroni; Leonardo Melosi |
Abstract: | We exploit inequality restrictions on higher-order moments of the distribution of structural shocks to sharpen their identification. We show that these constraints can be treated as necessary conditions and used to shrink the set of admissible rotations. We illustrate the usefulness of this approach by showing, through simulations, how it can dramatically improve the identification of monetary policy shocks when combined with widely used sign-restriction schemes. We then apply our methodology to two empirical questions: the effects of monetary policy shocks in the U.S. and the effects of sovereign bond spread shocks in the euro area. In both cases, using higher-moment restrictions significantly sharpens identification. After a shock to euro area government bond spreads, monetary policy quickly turns expansionary, corporate borrowing conditions worsen on impact, the real economy and the labor market of the euro area contract appreciably, and returns on German government bonds fall, likely reflecting investors’ flight to quality. |
Keywords: | shock identification; Skewness; Kurtosis; VAR; Sign restrictions; monetary shocks; Euro area |
JEL: | C32 E27 E32 |
Date: | 2023–08–18 |
URL: | http://d.repec.org/n?u=RePEc:fip:fedhwp:96666&r=ets |
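The idea of using higher-moment inequality restrictions as necessary conditions, as in the entry above, can be illustrated by screening random orthogonal rotations of reduced-form residuals twice: first by sign restrictions on impact responses, then by a higher-moment check on the candidate shock. The bivariate DGP, the sign pattern, and the excess-kurtosis bound below are illustrative assumptions, not the paper's restrictions.

```python
# Hedged sketch: screening random orthogonal rotations of reduced-form
# residuals by (i) sign restrictions on impact responses and (ii) an
# inequality restriction on a higher moment (positive excess kurtosis of the
# candidate shock). The DGP, sign pattern, and kurtosis bound are illustrative.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(8)
T, k = 800, 2
B0 = np.array([[1.0, 0.6], [-0.5, 1.0]])           # true impact matrix
eps = np.column_stack([rng.standard_t(df=4, size=T),   # fat-tailed shock 1
                       rng.normal(size=T)])
u = eps @ B0.T                                     # reduced-form residuals

Sigma = u.T @ u / T
P = np.linalg.cholesky(Sigma)

kept_sign, kept_both = [], []
for _ in range(2000):
    Q, R = np.linalg.qr(rng.normal(size=(k, k)))
    Q = Q @ np.diag(np.sign(np.diag(R)))           # uniform orthogonal draw
    impact = P @ Q                                 # candidate impact matrix
    shocks = u @ np.linalg.inv(impact).T           # candidate structural shocks
    # Sign restriction: shock 1 raises variable 1 and lowers variable 2 on impact.
    if impact[0, 0] > 0 and impact[1, 0] < 0:
        kept_sign.append(impact[:, 0])
        # Higher-moment restriction: shock 1 must have positive excess kurtosis.
        if kurtosis(shocks[:, 0]) > 0.5:
            kept_both.append(impact[:, 0])

print("accepted by signs only      :", len(kept_sign))
print("accepted by signs + kurtosis:", len(kept_both))
```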
By: | Tae-Hwy Lee (Department of Economics, University of California Riverside); Tao Wang (Department of Economics, University of Victoria, Canada) |
Abstract: | In this paper, we utilize the P-GMM moment selection procedure of Cheng and Liao (2015) to select valid and relevant moments for estimating and testing forecast rationality under the flexible loss proposed by Elliott et al. (2005). We motivate moment selection in a large-dimensional setting, explain the fundamental mechanism of the P-GMM moment selection procedure, and elucidate how to implement it in the context of forecast rationality while allowing for potentially invalid moment conditions. A set of Monte Carlo simulations examines the finite-sample performance of P-GMM estimation in integrating the information available in instruments into both estimation and testing, and a real data analysis using data from the Survey of Professional Forecasters issued by the Federal Reserve Bank of Philadelphia further illustrates the practical value of the suggested methodology. The results indicate that the P-GMM post-selection estimator of the forecaster’s attitude is comparable to the oracle estimator in using the available information efficiently. The power of the accompanying rationality and symmetry tests based on P-GMM estimation is substantially increased by reducing the influence of uninformative instruments. When a forecast user estimates and tests the rationality of forecasts produced by others, such as the Greenbook, the P-GMM moment selection procedure can help achieve consistent and more efficient outcomes. |
Keywords: | Forecast rationality; Moment selection; P-GMM; Relevance; Validity. |
JEL: | C10 C36 C53 E17 |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:ucr:wpaper:202307&r=ets |
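For context on the flexible loss of Elliott et al. (2005) used above, the sketch below estimates the asymmetry parameter alpha by one-step linear GMM for a fixed instrument set and loss exponent p = 2. The simulated forecasts and the two instruments are assumptions, and the P-GMM moment-selection step that is the focus of the paper is not implemented.

```python
# Minimal sketch of estimating the asymmetry parameter alpha of the flexible
# loss in Elliott et al. (2005), L(e) = [alpha + (1-2*alpha)*1{e<0}] * |e|^p,
# by linear GMM for a given instrument set. Here p = 2 and the two instruments
# are illustrative; the P-GMM moment-selection step is not shown.
import numpy as np

rng = np.random.default_rng(9)
T, p = 500, 2

state = rng.normal(size=T)               # information available to the forecaster
target = state + rng.normal(size=T)      # realization
forecast = state + 0.35                  # upward-biased forecast (asymmetric loss)
e = target - forecast                    # forecast errors

V = np.column_stack([np.ones(T), state]) # illustrative instruments
h = np.abs(e) ** (p - 1)
a = V.T @ (h * (e < 0)) / T              # sample moment of v_t * 1{e<0} * |e|^{p-1}
b = V.T @ h / T                          # sample moment of v_t * |e|^{p-1}

W = np.eye(V.shape[1])                   # identity weighting (one-step GMM)
alpha_hat = (b @ W @ a) / (b @ W @ b)    # minimizes (a - alpha*b)' W (a - alpha*b)
print("estimated asymmetry alpha:", round(float(alpha_hat), 3))
# alpha_hat > 0.5 rationalizes the upward bias: positive errors are costlier.
```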
By: | Hao Hao (Ford Motor Company); Tae-Hwy Lee (Department of Economics, University of California Riverside) |
Abstract: | When the endogenous variable is an unknown function of observable instruments, its conditional mean can be approximated using sieve functions of the observable instruments. We propose a novel instrument selection method, Double-criteria Boosting (DB), that consistently selects only valid and relevant instruments from a large set of candidate instruments. A Monte Carlo study compares GMM using DB with other methods, such as GMM using the Lasso, and shows that DB-GMM gives lower bias and RMSE. In the empirical application to automobile demand, the DB-GMM estimator suggests a more elastic estimate of the price elasticity of demand than the standard 2SLS estimator. |
Keywords: | Causal inference with high dimensional instruments, Irrelevant instruments, Invalid instruments, Instrument Selection, Machine Learning, Boosting. |
JEL: | C1 C2 C3 C5 |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:ucr:wpaper:202309&r=ets |
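The sketch below is a hypothetical illustration of the two criteria named above, relevance and validity, via a simple correlation screen followed by 2SLS; it is not the authors' Double-criteria Boosting algorithm. The thresholds, the assumption that one instrument is known to be valid (used to form a preliminary structural residual), and the simulated design are all illustrative.

```python
# Hypothetical illustration (not Double-criteria Boosting): a simple
# two-criterion screen that drops candidate instruments that are either weakly
# correlated with the endogenous regressor (irrelevant) or correlated with a
# preliminary structural residual (invalid), followed by 2SLS. Thresholds, the
# preliminary residual, and the DGP are all illustrative choices.
import numpy as np

rng = np.random.default_rng(10)
n, beta_true = 2000, 1.0
Z = rng.normal(size=(n, 10))
v = rng.normal(size=n)
u = 0.8 * v + rng.normal(size=n)                       # endogeneity
x = Z[:, :5] @ np.full(5, 0.8) + v                     # instruments 0-4 relevant
y = beta_true * x + 0.8 * Z[:, 4] + u                  # instrument 4 invalid

def tsls(y, x, Zsub):
    x_hat = Zsub @ np.linalg.lstsq(Zsub, x, rcond=None)[0]
    return (x_hat @ y) / (x_hat @ x)

# Preliminary residual from a deliberately conservative just-identified fit
# (assumes instrument 0 is known to be valid -- an illustrative shortcut).
u_tilde = y - tsls(y, x, Z[:, [0]]) * x

relevance = np.abs([np.corrcoef(Z[:, j], x)[0, 1] for j in range(Z.shape[1])])
validity = np.abs([np.corrcoef(Z[:, j], u_tilde)[0, 1] for j in range(Z.shape[1])])
keep = (relevance > 0.1) & (validity < 0.05)           # illustrative thresholds

print("selected instruments:", np.where(keep)[0])
print("2SLS, all instruments     :", round(tsls(y, x, Z), 3))
print("2SLS, selected instruments:", round(tsls(y, x, Z[:, keep]), 3))
```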
By: | Wojciech "Victor" Fulmyk |
Abstract: | I introduce a novel algorithm and accompanying Python library, named mlcausality, designed for the identification of nonlinear Granger causal relationships. The algorithm uses a flexible plug-in architecture that enables researchers to employ any nonlinear regressor as the base prediction model. I then conduct a comprehensive performance analysis of mlcausality when the prediction regressor is the kernel ridge regressor with the radial basis function kernel. The results demonstrate that mlcausality with kernel ridge regression achieves competitive AUC scores across a diverse set of simulated data. Furthermore, mlcausality with kernel ridge regression yields more finely calibrated $p$-values than rival algorithms, which enables it to attain superior accuracy scores when intuitive $p$-value-based thresholding criteria are used. Finally, mlcausality with kernel ridge regression exhibits significantly reduced computation times compared to existing nonlinear Granger causality algorithms; in numerous instances, it achieves superior solutions within computational timeframes an order of magnitude shorter than those required by competing algorithms. |
Date: | 2023–09 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:2309.05107&r=ets |
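The general idea behind the algorithm described above, not the mlcausality API itself, can be sketched with scikit-learn: predict the target from its own lags (restricted) and from its own lags plus lags of the candidate cause (unrestricted) using kernel ridge regression, then compare out-of-sample squared errors. The lag length, the train/test split, the hyperparameters, and the Wilcoxon comparison are illustrative choices and do not reproduce the library's actual test procedure.

```python
# Illustrative sketch of nonlinear Granger causality testing with kernel ridge
# regression (the general idea, not mlcausality's API): predict y from its own
# lags (restricted) and from its own lags plus lags of x (unrestricted), then
# compare out-of-sample squared errors. Lag length, split, hyperparameters,
# and the Wilcoxon comparison are illustrative choices.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from scipy.stats import wilcoxon

rng = np.random.default_rng(11)
T, lags = 1200, 3
x = rng.normal(size=T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.8 * np.tanh(x[t - 1]) + 0.3 * rng.normal()

def lag_matrix(series, lags):
    return np.column_stack([series[lags - j - 1:-j - 1] for j in range(lags)])

Y = y[lags:]
X_r = lag_matrix(y, lags)                         # own lags only (restricted)
X_u = np.hstack([X_r, lag_matrix(x, lags)])       # plus lags of x (unrestricted)

split = int(0.7 * len(Y))
def oos_sq_errors(X):
    model = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1)
    model.fit(X[:split], Y[:split])
    return (Y[split:] - model.predict(X[split:])) ** 2

err_r, err_u = oos_sq_errors(X_r), oos_sq_errors(X_u)
stat, pval = wilcoxon(err_r, err_u, alternative="greater")
print(f"mean sq. error restricted {err_r.mean():.3f}, unrestricted {err_u.mean():.3f}")
print(f"one-sided Wilcoxon p-value (x Granger-causes y): {pval:.4f}")
```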