
on Econometrics 
By:  Christis Katsouris 
Abstract:  We establish the asymptotic validity of the bootstrap-based IVX estimator proposed by Phillips and Magdalinos (2009) for the predictive regression model parameter, based on a local-to-unity specification of the autoregressive coefficient which covers both nearly nonstationary and nearly stationary processes. A mixed Gaussian limit distribution is obtained for the bootstrap-based IVX estimator. The statistical validity of the theoretical results is illustrated by Monte Carlo experiments for various statistical inference problems. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.14463&r=ecm 
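The mildly integrated IVX instrument at the heart of the Phillips-Magdalinos approach admits a simple recursive construction. The sketch below is a minimal pure-Python illustration, not the authors' code; the tuning constants `c_z` and `delta` are illustrative choices.

```python
# Illustrative sketch of IVX instrumentation for a predictive regression
# y_t = beta * x_{t-1} + u_t. Not the paper's implementation; c_z and delta
# are example tuning choices for the mildly integrated instrument.

def ivx_instruments(x, c_z=1.0, delta=0.95):
    """Build IVX instruments z_t = rho_z * z_{t-1} + (x_t - x_{t-1}),
    with rho_z = 1 - c_z / n**delta, where n is the number of increments."""
    n = len(x) - 1
    rho_z = 1.0 - c_z / n ** delta
    z = [0.0] * len(x)
    for t in range(1, len(x)):
        z[t] = rho_z * z[t - 1] + (x[t] - x[t - 1])
    return rho_z, z

def ivx_slope(y, x, c_z=1.0, delta=0.95):
    """IV estimate of beta using z_{t-1} as the instrument for x_{t-1}."""
    _, z = ivx_instruments(x, c_z, delta)
    num = sum(z[t - 1] * y[t] for t in range(1, len(x)))
    den = sum(z[t - 1] * x[t - 1] for t in range(1, len(x)))
    return num / den
```

On a noise-free path the IV ratio recovers the slope exactly; in practice the estimator is applied to noisy data and then bootstrapped, as the abstract describes.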
By:  Matteo Barigozzi 
Abstract:  We prove that in an approximate factor model for an $n$-dimensional vector of stationary time series, the factor loadings estimated via Principal Components are asymptotically equivalent, as $n\to\infty$, to those estimated by Quasi Maximum Likelihood. Both estimators are, in turn, also asymptotically equivalent, as $n\to\infty$, to the unfeasible Ordinary Least Squares estimator we would have if the factors were observed. We also show that the usual sandwich form of the asymptotic covariance matrix of the Quasi Maximum Likelihood estimator is asymptotically equivalent to the simpler asymptotic covariance matrix of the unfeasible Ordinary Least Squares. This provides a simple way to estimate asymptotic confidence intervals for the Quasi Maximum Likelihood estimator without the need to estimate the Hessian and Fisher information matrices, whose expressions are very complex. All our results hold in the general case in which the idiosyncratic components are cross-sectionally heteroskedastic as well as serially and cross-sectionally weakly correlated. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.09864&r=ecm 
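The equivalence of principal-components loadings and the unfeasible OLS loadings can be seen exactly in a noise-free one-factor panel, where the leading eigenvector of the sample covariance is proportional to the true loadings. A minimal pure-Python sketch, with illustrative data and power iteration standing in for a library eigensolver:

```python
# In a noise-free one-factor panel x_it = lam_i * f_t, the leading principal
# component direction coincides (up to scale) with the OLS loadings computed
# from the observed factor. All names and numbers are illustrative.

def sample_cov(X):
    """X: list of n series, each of length T. Returns the n x n matrix (1/T) X X'."""
    n, T = len(X), len(X[0])
    return [[sum(X[i][t] * X[j][t] for t in range(T)) / T for j in range(n)]
            for i in range(n)]

def leading_eigvec(S, iters=200):
    """Power iteration for the leading eigenvector of a symmetric matrix S."""
    v = [1.0] * len(S)
    for _ in range(iters):
        w = [sum(S[i][j] * v[j] for j in range(len(S))) for i in range(len(S))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

lam = [1.0, 2.0, -1.0]                  # true loadings
f = [0.3, -1.1, 0.7, 2.0, -0.5]         # observed factor path
X = [[l * ft for ft in f] for l in lam]  # no idiosyncratic noise

pc = leading_eigvec(sample_cov(X))
ols = [sum(X[i][t] * f[t] for t in range(len(f))) / sum(ft * ft for ft in f)
       for i in range(len(lam))]         # unfeasible OLS with observed factor
scale = ols[0] / pc[0]                   # align sign and scale
```

With idiosyncratic noise the two estimators differ in finite samples; the paper's point is that the gap vanishes as $n\to\infty$.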
By:  Jad Beyhum; Jonas Striaukas 
Abstract:  This study introduces a bootstrap test of the validity of factor regression within a high-dimensional factor-augmented sparse regression model that integrates factor and sparse regression techniques. The test provides a means to assess the suitability of the classical (dense) factor regression model compared to alternative (sparse plus dense) factor-augmented sparse regression models. Our proposed test does not require tuning parameters, eliminates the need to estimate covariance matrices, and offers simplicity in implementation. The validity of the test is theoretically established under time-series dependence. Through simulation experiments, we demonstrate the favorable finite sample performance of our procedure. Moreover, using the FRED-MD dataset, we apply the test and reject the adequacy of the classical factor regression model when the dependent variable is inflation but not when it is industrial production. These findings offer insights into selecting appropriate models for high-dimensional datasets. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.13364&r=ecm 
By:  Masahiro Kato; Akari Ohda; Masaaki Imaizumi; Kenichiro McAlinn 
Abstract:  Synthetic control methods (SCMs) have become a crucial tool for causal inference in comparative case studies. The fundamental idea of SCMs is to estimate counterfactual outcomes for a treated unit by using a weighted sum of observed outcomes from untreated units. The accuracy of the synthetic control (SC) is critical for estimating the causal effect, and hence, the estimation of SC weights has been the focus of much research. In this paper, we first point out that existing SCMs suffer from an implicit endogeneity problem, which is the correlation between the outcomes of untreated units and the error term in the model of a counterfactual outcome. We show that this problem yields a bias in the causal effect estimator. We then propose a novel SCM based on density matching, assuming that the density of outcomes of the treated unit can be approximated by a weighted average of the densities of untreated units (i.e., a mixture model). Based on this assumption, we estimate SC weights by matching moments of treated outcomes and the weighted sum of moments of untreated outcomes. Our proposed method has three advantages over existing methods. First, our estimator is asymptotically unbiased under the assumption of the mixture model. Second, due to the asymptotic unbiasedness, we can reduce the mean squared error for counterfactual prediction. Third, our method generates full densities of the treatment effect, not only expected values, which broadens the applicability of SCMs. We provide experimental results to demonstrate the effectiveness of our proposed method. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.11127&r=ecm 
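Under the paper's mixture assumption, the SC weights are recovered by matching moments of the treated outcomes to a weighted sum of donor moments; with two donor units, matching the first moment already pins the weight down in closed form. A hypothetical toy illustration (the authors' estimator matches several moments and many donors):

```python
# Toy sketch of moment-matching SC weights under a two-donor mixture model:
# choose w so that w * mean(donor_a) + (1 - w) * mean(donor_b) equals the
# treated mean. Names and data are illustrative.

def mean(xs):
    return sum(xs) / len(xs)

def mixture_weight(treated, donor_a, donor_b):
    """Weight on donor_a that matches the treated first moment."""
    ma, mb, mt = mean(donor_a), mean(donor_b), mean(treated)
    return (mt - mb) / (ma - mb)

donor_a = [1.0, 2.0, 3.0]    # mean 2.0
donor_b = [10.0, 12.0]       # mean 11.0
treated = [4.7]              # mean 4.7 = 0.7 * 2.0 + 0.3 * 11.0
w = mixture_weight(treated, donor_a, donor_b)
```

Because the weights describe a full mixture of donor densities, not just a point prediction, the same weights deliver the entire counterfactual distribution, which is the third advantage the abstract highlights.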
By:  Christis Katsouris 
Abstract:  We consider Wald-type statistics designed for joint predictability and structural break testing based on the instrumentation method of Phillips and Magdalinos (2009). We show that, under the assumption of nonstationary predictors: (i) the tests based on the OLS estimators converge to a nonstandard limiting distribution which depends on the nuisance coefficient of persistence; and (ii) the tests based on the IVX estimators can filter out the persistence under certain parameter restrictions due to the supremum functional. These results contribute to the literature on joint predictability and parameter instability testing by providing analytically tractable asymptotic theory that takes nonstationary regressors into account. We compare the finite-sample size and power performance of the Wald tests under both estimators via extensive Monte Carlo experiments. Critical values are computed using standard bootstrap inference methodologies. We illustrate the usefulness of the proposed framework for testing predictability in the presence of parameter instability by examining the stock market predictability puzzle for the US equity premium. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.15151&r=ecm 
By:  Yuehao Bai; Jizhou Liu; Azeem M. Shaikh; Max Tabord-Meehan 
Abstract:  This paper studies the efficient estimation of a large class of treatment effect parameters that arise in the analysis of experiments. Here, efficiency is understood to be with respect to a broad class of treatment assignment schemes for which the marginal probability that any unit is assigned to treatment equals a prespecified value, e.g., one half. Importantly, we do not require that treatment status is assigned in an i.i.d. fashion, thereby accommodating complicated treatment assignment schemes that are used in practice, such as stratified block randomization and matched pairs. The class of parameters considered consists of those that can be expressed as the solution to a restriction on the expectation of a known function of the observed data, including possibly the prespecified value for the marginal probability of treatment assignment. We show that this class of parameters includes, among other things, average treatment effects, quantile treatment effects, and local average treatment effects, as well as the counterparts to these quantities in experiments in which the unit is itself a cluster. In this setting, we establish two results. First, we derive a lower bound on the asymptotic variance of estimators of the parameter of interest in the form of a convolution theorem. Second, we show that the naive method of moments estimator achieves this bound on the asymptotic variance quite generally if treatment is assigned using a "finely stratified" design. By a "finely stratified" design, we mean experiments in which units are divided into groups of a fixed size and a proportion within each group is assigned to treatment uniformly at random so that it respects the restriction on the marginal probability of treatment assignment. In this sense, "finely stratified" experiments lead to efficient estimators of treatment effect parameters "by design" rather than through ex post covariate adjustment. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.15181&r=ecm 
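A "finely stratified" design with groups of size two (matched pairs) can be sketched as follows; pairing adjacent units after sorting on a single scalar covariate is an illustrative simplification of how such groups might be formed.

```python
import random

# Sketch of a finely stratified design with groups of size two: sort units
# on a baseline covariate, pair adjacent units, and treat exactly one unit
# per pair at random, so the marginal treatment probability is one half.

def matched_pairs_assignment(covariates, seed=0):
    rng = random.Random(seed)
    order = sorted(range(len(covariates)), key=lambda i: covariates[i])
    treat = [0] * len(covariates)
    for k in range(0, len(order) - 1, 2):
        i, j = order[k], order[k + 1]
        treat[i if rng.random() < 0.5 else j] = 1
    return order, treat

cov = [0.3, 1.9, 0.1, 2.4, 0.8, 1.1]   # illustrative baseline covariate
order, treat = matched_pairs_assignment(cov)
```

By construction every pair contains exactly one treated unit, which is the within-group restriction the paper's efficiency result exploits.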
By:  Ping Wu (Department of Economics, University of Strathclyde); Gary Koop (Department of Economics, University of Strathclyde) 
Abstract:  Bayesian inference in Vector Autoregressions (VARs) involves manipulating large matrices which appear in the posterior (or conditional posterior) of the VAR coefficients. For large VARs, the computational time involved with these manipulations becomes so large as to make empirical work impractical. In response to this, many researchers transform their VARs so as to allow for Bayesian estimation to proceed one equation at a time. This leads to a massive reduction in the computational burden. This transformation involves taking the Cholesky decomposition of the error covariance matrix. However, this strategy implies that posterior inference depends on the order in which the variables enter the VAR. In this paper we develop an alternative transformation, based on the eigendecomposition, which does not lead to order dependence. Beginning with an inverse-Wishart prior on the error covariance matrix, we derive and discuss the properties of the prior it implies on the eigenmatrix and eigenvalues. We then show how an extension of the prior on the eigenmatrix can allow for greater flexibility while maintaining many of the benefits of conjugacy. We exploit this flexibility in order to extend the prior on the eigenvalues to allow for stochastic volatility. The properties of the eigendecomposition approach are investigated in a macroeconomic forecasting exercise involving VARs with 20 variables. 
Keywords:  Eigendecomposition, order invariance, large vector autoregression 
Date:  2022–11 
URL:  http://d.repec.org/n?u=RePEc:str:wpaper:2310&r=ecm 
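The order-dependence point can be checked numerically in two dimensions: permuting the variables changes the Cholesky factor of the error covariance matrix but leaves its eigenvalues untouched. A small sketch using standard 2x2 closed forms, with illustrative numbers:

```python
import math

# Permuting the variables changes the Cholesky factor of a covariance
# matrix but not its eigenvalues - the motivation for the paper's
# eigendecomposition-based transformation. Standard 2x2 closed forms.

def chol2(S):
    """Lower Cholesky factor of a 2x2 symmetric positive definite matrix."""
    l11 = math.sqrt(S[0][0])
    l21 = S[1][0] / l11
    l22 = math.sqrt(S[1][1] - l21 * l21)
    return [[l11, 0.0], [l21, l22]]

def eig2(S):
    """Eigenvalues of a symmetric 2x2 matrix, largest first."""
    tr = S[0][0] + S[1][1]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    root = math.sqrt(tr * tr / 4.0 - det)
    return [tr / 2.0 + root, tr / 2.0 - root]

sigma = [[4.0, 1.0], [1.0, 2.0]]
sigma_perm = [[2.0, 1.0], [1.0, 4.0]]   # same errors, opposite variable order
```

The eigenvalues of `sigma` and `sigma_perm` coincide while the Cholesky factors differ, which is why a Cholesky-based equation-by-equation transformation makes posterior inference order dependent.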
By:  Francesco Ruggieri 
Abstract:  I propose a novel argument to point identify economically interpretable intertemporal treatment effects in dynamic regression discontinuity designs (RDDs). Specifically, I develop a dynamic potential outcomes model and specialize two assumptions of the difference-in-differences literature, the no anticipation and common trends restrictions, to point identify cutoff-specific impulse responses. The estimand associated with each target parameter can be expressed as the sum of two static RDD outcome contrasts, thereby allowing for estimation via standard local polynomial tools. I leverage a limited path independence assumption to reduce the dimensionality of the problem. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.14203&r=ecm 
By:  Hugo Kruiniger 
Abstract:  Dovonon and Hall (Journal of Econometrics, 2018) proposed a limiting distribution theory for GMM estimators for a p-dimensional globally identified parameter vector {\phi} when local identification conditions fail at first order but hold at second order. They assumed that the first-order underidentification is due to the expected Jacobian having rank p-1 at the true value {\phi}_{0}, i.e., having a rank deficiency of one. After reparametrizing the model such that the last column of the Jacobian vanishes, they showed that the GMM estimator of the first p-1 parameters converges at rate T^{1/2} and the GMM estimator of the remaining parameter, {\phi}_{p}, converges at rate T^{1/4}. They also provided a limiting distribution of T^{1/4}({\phi}_{p}-{\phi}_{0, p}) subject to a (non-transparent) condition which they claimed to be not restrictive in general. However, as we show in this paper, their condition is in fact only satisfied when {\phi} is overidentified, and the limiting distribution of T^{1/4}({\phi}_{p}-{\phi}_{0, p}), which is nonstandard, depends on whether {\phi} is exactly identified or overidentified. In particular, the limiting distributions of the sign of T^{1/4}({\phi}_{p}-{\phi}_{0, p}) for the cases of exact identification and overidentification, respectively, are different and are obtained by using expansions of the GMM objective function of different orders. Unsurprisingly, we find that the limiting distribution theories of Dovonon and Hall (2018) for Indirect Inference (II) estimation under two different scenarios with second-order identification, where the target function is a GMM estimator of the auxiliary parameter vector, are incomplete for similar reasons. We discuss how our results for GMM estimation can be used to complete both theories, and how they can be used to obtain the limiting distributions of the II estimators in the case of exact identification under either scenario. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.13475&r=ecm 
By:  Frank Kleibergen; Lingwei Kong 
Abstract:  We propose identification-robust statistics for testing hypotheses on the risk premia in dynamic affine term structure models. We do so using the moment equation specification proposed for these models in Adrian et al. (2013). We extend the subset (factor) Anderson-Rubin test from Guggenberger et al. (2012) to models with multiple dynamic factors and time-varying risk prices. Unlike projection-based tests, it provides a computationally tractable manner of conducting identification-robust tests on a larger number of parameters. We analyze the potential identification issues arising in empirical studies. Statistical inference based on the three-stage estimator from Adrian et al. (2013) requires knowledge of the factors' quality and is misleading without full-rank betas or with sampling errors of comparable size to the loadings. Empirical applications show that some factors, though potentially weak, may drive the time variation of risk prices, and that weak identification issues are more prominent in multifactor models. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.12628&r=ecm 
By:  Yuehao Bai; Hongchang Guo; Azeem M. Shaikh; Max Tabord-Meehan 
Abstract:  This paper studies inference for the local average treatment effect in randomized controlled trials with imperfect compliance where treatment status is determined according to "matched pairs." By "matched pairs," we mean that units are sampled i.i.d. from the population of interest, paired according to observed baseline covariates, and finally, within each pair, one unit is selected at random for treatment. Under weak assumptions governing the quality of the pairings, we first derive the limiting behavior of the usual Wald (i.e., two-stage least squares) estimator of the local average treatment effect. We show further that the conventional heteroskedasticity-robust estimator of its limiting variance is generally conservative in that its limit in probability is (typically strictly) larger than the limiting variance. We therefore provide an alternative estimator of the limiting variance that is consistent for the desired quantity. Finally, we consider the use of additional observed baseline covariates not used in pairing units to increase the precision with which we can estimate the local average treatment effect. To this end, we derive the limiting behavior of a two-stage least squares estimator of the local average treatment effect which includes the additional covariates in addition to pair fixed effects, and show that the limiting variance is always less than or equal to that of the Wald estimator. To complete our analysis, we provide a consistent estimator of this limiting variance. A simulation study confirms the practical relevance of our theoretical results. We use our results to revisit a prominent experiment studying the effect of macroinsurance on microenterprise in Egypt. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.13094&r=ecm 
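The Wald (two-stage least squares) estimand under imperfect compliance is the ratio of intention-to-treat effects. A toy sketch with made-up data; the paper's contribution concerns the variance of this estimator under matched pairs, which the sketch does not implement:

```python
# Toy Wald estimator of the local average treatment effect: the ratio of the
# intention-to-treat effect on the outcome to the first-stage effect on
# take-up. Data are illustrative.

def wald_late(y, d, z):
    """y: outcomes, d: treatment taken (0/1), z: assignment (0/1)."""
    def mean_where(v, flag):
        sel = [vi for vi, zi in zip(v, z) if zi == flag]
        return sum(sel) / len(sel)
    itt_y = mean_where(y, 1) - mean_where(y, 0)   # reduced form
    itt_d = mean_where(d, 1) - mean_where(d, 0)   # first stage
    return itt_y / itt_d

z = [1, 0, 1, 0, 1, 0]
d = [1, 0, 1, 0, 0, 0]                 # one assigned unit fails to take up
y = [3.0, 1.0, 4.0, 2.0, 1.5, 1.5]
late = wald_late(y, d, z)
```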
By:  Jad Beyhum; Elia Lapenta; Pascal Lavergne 
Abstract:  We extend nonparametric regression smoothing splines to a context where there is endogeneity and instrumental variables are available. Unlike popular existing estimators, the resulting estimator is one-step and relies on a unique regularization parameter. We derive uniform rates of convergence for the estimator and its first derivative. We also address the issue of imposing monotonicity in estimation. Simulations confirm the good performance of our estimator compared to two-step procedures. Our method yields economically sensible results when used to estimate Engel curves. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.14867&r=ecm 
By:  Younghoon Kim; Zachary F. Fisher; Vladas Pipiras 
Abstract:  This work considers estimation and forecasting in a multivariate count time series model based on a copula-type transformation of a Gaussian dynamic factor model. The estimation is based on second-order properties of the count and underlying Gaussian models and applies to the case where the model dimension is larger than the sample length. In addition, novel cross-validation schemes are suggested for model selection. The forecasting is carried out through a particle-based sequential Monte Carlo, leveraging Kalman filtering techniques. A simulation study and an application are also considered. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.10454&r=ecm 
By:  Deepankar Basu 
Abstract:  I present the Yule-Frisch-Waugh-Lovell theorem for linear instrumental variables estimation of a multiple regression model that is either exactly or overidentified. I show that with linear instrumental variables estimation: (a) coefficients on endogenous variables are identical in full and partial (or residualized) regressions; (b) residual vectors are identical for full and partial regressions; and (c) estimated covariance matrices of the coefficient vectors from full and partial regressions are equal (up to a degrees of freedom correction) if the estimator of the error vector is a function only of the residual vectors. While estimation of the full model uses the full set of instrumental variables, estimation of the partial model uses the residualized version of the same set of instrumental variables, with residualization carried out with respect to the set of exogenous variables. I also trace the historical and analytical development of the theorem and suggest that it be renamed the Yule-Frisch-Waugh-Lovell (YFWL) theorem to recognize the pioneering contribution of the statistician G. Udny Yule to its development. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.12731&r=ecm 
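Claim (a) of the theorem can be verified numerically for a just-identified IV model: the 2SLS coefficient on the endogenous regressor from the full model equals the IV coefficient after residualizing the outcome, the endogenous regressor, and its instrument on the exogenous variable. A sketch with illustrative data and no intercept:

```python
# Numerical check of the YFWL claim for just-identified linear IV: full-model
# and partialled-out estimates of the coefficient on the endogenous regressor
# coincide. Data are illustrative; no intercept keeps the algebra minimal.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def resid_on(v, w):
    """Residual of v from a no-intercept OLS projection on w."""
    c = dot(v, w) / dot(w, w)
    return [vi - c * wi for vi, wi in zip(v, w)]

def full_iv(y, x, w, z):
    """Coefficient on x from the 2x2 IV system with instruments [z, w]."""
    a11, a12, a21, a22 = dot(z, x), dot(z, w), dot(w, x), dot(w, w)
    det = a11 * a22 - a12 * a21
    return (dot(z, y) * a22 - dot(w, y) * a12) / det

def partial_iv(y, x, w, z):
    """IV coefficient after residualizing y, x, and z on the exogenous w."""
    yt, xt, zt = resid_on(y, w), resid_on(x, w), resid_on(z, w)
    return dot(zt, yt) / dot(zt, xt)

y = [2.1, 0.4, 3.3, 1.0, 2.6]   # outcome
x = [1.0, 0.2, 1.5, 0.3, 1.2]   # endogenous regressor
w = [0.5, 1.0, 0.2, 0.8, 0.4]   # exogenous regressor (instruments itself)
z = [0.9, 0.1, 1.4, 0.2, 1.1]   # instrument for x
```

The equality is exact algebra, not an approximation, which is the content of claim (a) for this just-identified case.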
By:  Nalan Basturk (University of Maastricht); Jamie Cross (Melbourne Business School); Peter de Knijff (Leiden University); Lennart Hoogerheide (Vrije Universiteit Amsterdam); Paul Labonne (BI Norwegian Business School); Herman K van Dijk (Erasmus University Rotterdam) 
Abstract:  Multimodal empirical distributions arise in many fields, such as Astrophysics, Bioinformatics, Climatology and Economics, due to the heterogeneity of the underlying populations. Mixture processes are a popular tool for accurate approximation of such distributions and implied mode detection. Using Bayesian mixture models and methods, BayesMultiMode estimates posterior probabilities of the number of modes, their locations and uncertainty, yielding a powerful tool for mode inference. The approach works in two stages. First, a flexible mixture with an unknown number of components is estimated using the Bayesian MCMC method of Malsiner-Walli, Frühwirth-Schnatter, and Grün (2016). Second, suitable detection algorithms are employed to estimate modes for continuous and discrete probability distributions. Given these mode estimates, posterior probabilities for the number of modes, their locations and uncertainties are constructed. BayesMultiMode supports a range of mixture processes, complementing and extending existing software for mixture modeling. The mode detection algorithms implemented in BayesMultiMode also support MCMC draws for mixture estimation generated with external software. For illustration, the package is applied to both continuous and discrete empirical distributions from the four fields listed above, yielding credible detection of multiple modes with substantial posterior probability where frequentist tests fail to reject the null hypothesis of unimodality. 
Keywords:  multimodality, mixture distributions, Bayesian estimation, sparse finite mixtures, R 
JEL:  C11 C63 C87 C88 
Date:  2023–07–24 
URL:  http://d.repec.org/n?u=RePEc:tin:wpaper:20230041&r=ecm 
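The second-stage mode-detection idea for a continuous mixture can be sketched by scanning a fitted mixture density for interior local maxima on a grid; the parameters and function names below are illustrative and not BayesMultiMode's interface:

```python
import math

# Count the modes of a fitted two-component Gaussian mixture by evaluating
# its density on a grid and locating interior local maxima. Illustrative
# parameters; a well-separated equal-weight mixture is bimodal.

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def mixture_pdf(x, comps):
    """comps: list of (weight, mu, sd) tuples."""
    return sum(w * normal_pdf(x, mu, sd) for w, mu, sd in comps)

def grid_modes(comps, lo, hi, n=2001):
    xs = [lo + (hi - lo) * i / (n - 1) for i in range(n)]
    ys = [mixture_pdf(x, comps) for x in xs]
    return [xs[i] for i in range(1, n - 1) if ys[i - 1] < ys[i] >= ys[i + 1]]

comps = [(0.5, -2.0, 1.0), (0.5, 2.0, 1.0)]
modes = grid_modes(comps, -6.0, 6.0)
```

In the package the same detection step is applied across MCMC draws of the mixture parameters, which is what turns mode counts into posterior probabilities.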
By:  Lingwei Kong 
Abstract:  The Hansen-Jagannathan (HJ) distance statistic is one of the most dominant measures of model misspecification. However, the conventional HJ specification test procedure has poor finite sample performance, and we show that it can be size distorted even in large samples when (proxy) factors exhibit small correlations with asset returns. In other words, applied researchers are likely to falsely reject a model even when it is correctly specified. We provide two alternatives to the HJ statistic and two corresponding novel procedures for model specification tests, which are robust against the presence of weak (proxy) factors, and we also offer a novel robust risk premia estimator. Simulation exercises support our theory. Our empirical application documents the unreliability of the traditional HJ test, which may produce counterintuitive results when comparing nested models by rejecting a four-factor model but not the reduced three-factor model. At the same time, our proposed methods are practically more appealing and show support for a four-factor model for Fama-French portfolios. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.14499&r=ecm 
By:  Bruno Feunou 
Abstract:  We introduce generalized autoregressive gamma (GARG) processes, a class of autoregressive and moving-average processes that extends the class of existing autoregressive gamma (ARG) processes in one important dimension: each conditional moment dynamic is driven by a different and identifiable moving average of the variable of interest. The paper provides ergodicity conditions for GARG processes and derives closed-form conditional and unconditional moments. The paper also presents estimation and inference methods, illustrated by an application to European option pricing where the daily realized variance follows a GARG dynamic. Our results show that using GARG processes reduces pricing errors by substantially more than using ARG processes does. 
Keywords:  Econometric and statistical methods; Asset pricing 
JEL:  C58 G12 
Date:  2023–08 
URL:  http://d.repec.org/n?u=RePEc:bca:bocawp:2340&r=ecm 
By:  Andrew Bennett; Nathan Kallus; Xiaojie Mao; Whitney Newey; Vasilis Syrgkanis; Masatoshi Uehara 
Abstract:  We consider estimation of parameters defined as linear functionals of solutions to linear inverse problems. Any such parameter admits a doubly robust representation that depends on the solution to a dual linear inverse problem, where the dual solution can be thought of as a generalization of the inverse propensity function. We provide the first source condition double robust inference method that ensures asymptotic normality around the parameter of interest as long as either the primal or the dual inverse problem is sufficiently well-posed, without knowledge of which inverse problem is the more well-posed one. Our result is enabled by novel guarantees for iterated Tikhonov regularized adversarial estimators for linear inverse problems, over general hypothesis spaces, which are developments of independent interest. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.13793&r=ecm 
By:  Stéphane Bonhomme; Kevin Dano 
Abstract:  Economic interactions often occur in networks where heterogeneous agents (such as workers or firms) sort and produce. However, most existing estimation approaches either require the network to be dense, which is at odds with many empirical networks, or they require restricting the form of heterogeneity and the network formation process. We show how the functional differencing approach introduced by Bonhomme (2012) in the context of panel data can be applied in network settings to derive moment restrictions on model parameters and average effects. Those restrictions are valid irrespective of the form of heterogeneity, and they hold in both dense and sparse networks. We illustrate the analysis with linear and nonlinear models of matched employer-employee data, in the spirit of the model introduced by Abowd, Kramarz, and Margolis (1999). 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.11484&r=ecm 
By:  Romuald Meango 
Abstract:  Can stated preferences help in counterfactual analyses of actual choice? This research proposes a novel approach to researchers who have access to both stated choices in hypothetical scenarios and actual choices. The key idea is to use probabilistic stated choices to identify the distribution of individual unobserved heterogeneity, even in the presence of measurement error. If this unobserved heterogeneity is the source of endogeneity, the researcher can correct for its influence in a demand function estimation using actual choices, and recover causal effects. Estimation is possible with an off-the-shelf Group Fixed Effects estimator. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.13966&r=ecm 
By:  Torben G. Andersen; Viktor Todorov; Bo Zhou 
Abstract:  This paper focuses on the task of detecting, in real time, local episodes involving violation of the standard Itô semimartingale assumption for financial asset prices that might induce arbitrage opportunities. Our proposed detectors, defined as stopping rules, are applied sequentially to continually incoming high-frequency data. We show that they are asymptotically exponentially distributed in the absence of Itô semimartingale violations. On the other hand, when a violation occurs, we can achieve immediate detection under infill asymptotics. A Monte Carlo study demonstrates that the asymptotic results provide a good approximation to the finite-sample behavior of the sequential detectors. An empirical application to S&P 500 index futures data corroborates the effectiveness of our detectors in swiftly identifying the emergence of an extreme return persistence episode in real time. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.10872&r=ecm 
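A heavily simplified caricature of such a stopping rule: fire the first time a standardized rolling sum of returns exceeds a threshold. By Cauchy-Schwarz the statistic below is bounded by the square root of the window length, so only a persistent same-sign run can push it near the bound. This illustrates the stopping-rule idea only and is not the paper's statistic:

```python
# Caricature of a sequential persistence detector: stop the first time
# |sum of last `window` returns| / sqrt(sum of their squares) exceeds
# `threshold`. The statistic is at most sqrt(window), attained by a run of
# equal same-sign returns. Rule and constants are illustrative.

def persistence_detector(returns, window=5, threshold=2.0):
    """First index at which the detector fires, or None."""
    for t in range(window, len(returns) + 1):
        block = returns[t - window:t]
        sum_sq = sum(r * r for r in block)
        if sum_sq == 0.0:
            continue
        stat = abs(sum(block)) / sum_sq ** 0.5
        if stat > threshold:
            return t - 1
    return None
```

On noise-like alternating returns the statistic stays far below the threshold, while a short burst of same-sign returns triggers detection as soon as the window is filled by the burst.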
By:  Mr. Sam Ouliaris; Ms. Celine Rochon 
Abstract:  Nowcasting enables policymakers to obtain forecasts of key macroeconomic indicators using higher frequency data, resulting in more timely information to guide proposed policy changes. A significant shortcoming of nowcasting estimators is their “reduced-form” nature, which means they cannot be used to assess the impact of policy changes, for example, on the baseline nowcast of real GDP. This paper outlines two separate methodologies to address this problem. The first is a partial equilibrium approach that uses an existing baseline nowcasting regression and single-equation forecasting models for the high-frequency data in that regression. The second approach uses a nonparametric structural VAR estimator recently introduced in Ouliaris and Pagan (2022) that imposes minimal identifying restrictions on the data to estimate the impact of structural shocks. Each approach is illustrated using a country-specific example. 
Keywords:  Nowcasting; high frequency indicators; impulse responses; structural models 
Date:  2023–07–28 
URL:  http://d.repec.org/n?u=RePEc:imf:imfwpa:2023/153&r=ecm 
By:  Bryan S. Graham; Andrin Pelican 
Abstract:  This paper introduces a simulation algorithm for evaluating the log-likelihood function of a large supermodular binary-action game. Covered examples include (certain types of) peer effect, technology adoption, strategic network formation, and multi-market entry games. More generally, the algorithm facilitates simulated maximum likelihood (SML) estimation of games with large numbers of players, T, and/or many binary actions per player, M (e.g., games with tens of thousands of strategic actions, TM=O(10⁴)). In such cases the likelihood of the observed pure strategy combination is typically (i) very small and (ii) a TM-fold integral whose region of integration has a complicated geometry. Direct numerical integration, as well as accept-reject Monte Carlo integration, are computationally impractical in such settings. In contrast, we introduce a novel importance sampling algorithm which allows for accurate likelihood simulation with modest numbers of simulation draws. 
JEL:  C15 C31 C55 C7 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:31511&r=ecm 
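The reweighting principle behind likelihood simulation for rare outcomes can be sketched on a toy problem: estimating the small probability that T independent standard normals all exceed a threshold by sampling from a mean-shifted proposal and reweighting. The paper's sampler exploits the game's supermodular structure, none of which is reflected here:

```python
import math
import random

# Toy importance sampling for a small probability:
# p = P(X_1 > c, ..., X_T > c) for iid N(0,1), estimated by drawing from
# N(shift, 1) and reweighting by the likelihood ratio. Names and constants
# are illustrative.

def is_estimate(T, c, shift, n_draws, seed=7):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        xs = [rng.gauss(shift, 1.0) for _ in range(T)]
        if all(x > c for x in xs):
            # log likelihood ratio of N(0,1)^T to N(shift,1)^T at xs
            log_w = sum(-0.5 * x * x + 0.5 * (x - shift) ** 2 for x in xs)
            total += math.exp(log_w)
    return total / n_draws

T, c = 2, 2.0
exact = (0.5 * math.erfc(c / math.sqrt(2.0))) ** T   # about 5.2e-4 here
est = is_estimate(T, c, shift=2.0, n_draws=50000)
```

With a larger number of actions the target probability becomes astronomically small and plain Monte Carlo returns zero with high probability, while a well-chosen proposal keeps the estimate informative; choosing that proposal is the hard part the paper addresses.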
By:  Dassios, Angelos; Zhang, Junyi 
Abstract:  Let $J_1 > J_2 > \cdots$ be the ranked jumps of a gamma process $\tau_\alpha$ on the time interval $[0, \alpha]$, such that $\tau_\alpha = \sum_{k=1}^{\infty} J_k$. In this paper, we design an algorithm that samples from the random vector $(J_1, \ldots, J_N, \sum_{k=N+1}^{\infty} J_k)$. Our algorithm provides an analog to the well-established inverse Lévy measure (ILM) algorithm by replacing the numerical inversion of the exponential integral with an acceptance-rejection step. This research is motivated by the construction of the Dirichlet process prior in Bayesian nonparametric statistics. The prior assigns weight to each atom according to a GEM distribution, and the simulation algorithm enables us to sample the $N$ largest random weights of the prior. We then extend the simulation algorithm to a generalised gamma process. The simulation problem for inhomogeneous processes is also considered. Numerical implementations are provided to illustrate the effectiveness of our algorithms. 
Keywords:  60J25; 62F15; 62G05; exact simulation; gamma process; generalised gamma process; Lévy process; Poisson-Dirichlet distribution 
JEL:  C1 
Date:  2023–06–10 
URL:  http://d.repec.org/n?u=RePEc:ehl:lserod:119755&r=ecm 
By:  Luca Mucciante; Alessio Sancetta 
Abstract:  A point process for event arrivals in high-frequency trading is presented. The intensity is the product of a Hawkes process and high-dimensional functions of covariates derived from the order book. Conditions for stationarity of the process are stated. An algorithm is presented to estimate the model even in the presence of billions of data points, possibly mapping covariates into a high-dimensional space. The large sample size can be common for high-frequency data applications using multiple liquid instruments. Convergence of the algorithm is shown, consistency results under weak conditions are established, and a test statistic to assess out-of-sample performance of different model specifications is suggested. The methodology is applied to the study of four stocks that trade on the New York Stock Exchange (NYSE). The out-of-sample testing procedure suggests that capturing the nonlinearity of the order book information adds value to the self-exciting nature of high-frequency trading events. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.09077&r=ecm 
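The self-exciting component of the model is a Hawkes process; a standard way to simulate one with an exponential kernel is Ogata's thinning algorithm, sketched below with illustrative parameters (the paper's covariate-dependent intensity multiplier is omitted):

```python
import math
import random

# Ogata's thinning algorithm for a univariate Hawkes process with intensity
# lambda(t) = mu + sum_{t_i < t} alpha * exp(-beta * (t - t_i)).
# Between events the intensity decays, so the intensity just after the
# current time is a valid upper bound (thinning envelope).

def simulate_hawkes(mu, alpha, beta, horizon, seed=11):
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)      # candidate arrival
        if t > horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:  # accept with prob lam_t / lam_bar
            events.append(t)
    return events

events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=100.0)
```

Stationarity requires the branching ratio alpha/beta to be below one, echoing the stationarity conditions the abstract mentions.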
By:  Songnian Chen; Junlong Feng 
Abstract:  We develop new methods for changes-in-changes and distributional synthetic controls when there exists group level heterogeneity. For changes-in-changes, we allow individuals to belong to a large number of heterogeneous groups. The new method extends the changes-in-changes method in Athey and Imbens (2006) by finding appropriate subgroups within the control groups which share similar group level unobserved characteristics to the treatment groups. For distributional synthetic control, we show that the appropriate synthetic control needs to be constructed using units in potentially different time periods in which they have comparable group level heterogeneity to the treatment group, instead of units that are only in the same time period as in Gunsilius (2023). Implementation and data requirements for these new methods are briefly discussed. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.15313&r=ecm 
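The baseline Athey-Imbens changes-in-changes map that the paper refines transports a treated unit's period-0 outcome through the control group's distributional change over time; a toy sketch with illustrative data:

```python
import math

# Baseline changes-in-changes counterfactual map Q_{c1}(F_{c0}(y)): a treated
# unit's period-0 outcome is pushed through the control group's empirical
# distributional change. Data are illustrative.

def ecdf(sample, y):
    return sum(1 for s in sample if s <= y) / len(sample)

def quantile(sample, q):
    s = sorted(sample)
    idx = min(len(s) - 1, max(0, math.ceil(q * len(s)) - 1))
    return s[idx]

def cic_counterfactual(y, control_t0, control_t1):
    return quantile(control_t1, ecdf(control_t0, y))

control_t0 = [1.0, 2.0, 3.0, 4.0]
control_t1 = [2.0, 4.0, 6.0, 8.0]   # every control outcome doubled over time
```

The paper's extension applies such a map within subgroups of the controls that share the treated group's unobserved group-level characteristics, rather than to the control pool as a whole.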
By:  Evan Friedman; Duarte Gonçalves 
Abstract:  Quantal response equilibrium (QRE), a statistical generalization of Nash equilibrium, is a standard benchmark in the analysis of experimental data. Despite its influence, nonparametric characterizations and tests of QRE are unavailable beyond the case of finite games. We address this gap by completely characterizing the set of QRE in a class of binary-action games with a continuum of types. Our characterization provides sharp predictions in settings such as global games, the volunteer's dilemma, and the compromise game. Further, we leverage our results to develop nonparametric tests of QRE. As an empirical application, we revisit the experimental data from Carrillo and Palfrey (2009) on the compromise game. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.08011&r=ecm 
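For intuition, logit QRE in a symmetric 2x2 game (the finite-game case the paper generalizes to a continuum of types) is a fixed point of smoothed best responses. The precision `lam` and payoff matrices below are illustrative.

```python
import math

def logit_qre(payoffs, lam, iters=200):
    """Symmetric logit QRE in a 2x2 game: each action is played with probability
    proportional to exp(lam * expected payoff), given the opponent's mixture.
    lam = 0 gives uniform play; lam -> infinity approaches best response.
    payoffs[a][b] = my payoff from action a against opponent action b;
    solved by simple fixed-point iteration."""
    p = 0.5  # probability the opponent plays action 0
    for _ in range(iters):
        u0 = payoffs[0][0] * p + payoffs[0][1] * (1 - p)
        u1 = payoffs[1][0] * p + payoffs[1][1] * (1 - p)
        e0 = math.exp(lam * u0)
        p = e0 / (e0 + math.exp(lam * u1))
    return p

# Zero precision: uniform play regardless of payoffs.
uniform = logit_qre([[2, 2], [1, 1]], lam=0.0)
# Strictly dominant first action: QRE tilts toward it as lam grows.
tilted = logit_qre([[2, 2], [1, 1]], lam=1.0)
```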
By:  Bryan S. Graham; Andrin Pelican 
Abstract:  This paper introduces a simulation algorithm for evaluating the log-likelihood function of a large supermodular binary-action game. Covered examples include (certain types of) peer effect, technology adoption, strategic network formation, and multi-market entry games. More generally, the algorithm facilitates simulated maximum likelihood (SML) estimation of games with large numbers of players, $T$, and/or many binary actions per player, $M$ (e.g., games with tens of thousands of strategic actions, $TM=O(10^4)$). In such cases the likelihood of the observed pure-strategy combination is typically (i) very small and (ii) a $TM$-fold integral whose region of integration has a complicated geometry. Direct numerical integration, as well as accept-reject Monte Carlo integration, is computationally impractical in such settings. In contrast, we introduce a novel importance sampling algorithm that allows for accurate likelihood simulation with modest numbers of simulation draws. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.11857&r=ecm 
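The core difficulty, that the likelihood of the observed action profile is a tiny probability, is the classic setting for importance sampling. A stripped-down illustration on a Gaussian tail probability (not the paper's game-specific sampler):

```python
import math, random

def naive_mc(n, seed=0):
    """Naive Monte Carlo for P(Z > 4), Z standard normal: almost every draw
    misses the rare event, so the estimate is zero or extremely noisy."""
    rng = random.Random(seed)
    return sum(rng.gauss(0, 1) > 4 for _ in range(n)) / n

def importance_sampling(n, seed=0):
    """Draw from the proposal N(4, 1), which concentrates on the rare event,
    and reweight each hit by the density ratio phi(x)/phi(x - 4) = exp(8 - 4x).
    The mean shift of 4 is an illustrative proposal choice."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(4, 1)
        if x > 4:
            total += math.exp(8 - 4 * x)
    return total / n

est = importance_sampling(50_000)
```

With 50,000 draws the importance-sampling estimate lands within a few percent of the true value P(Z > 4) ≈ 3.17e-5, while the naive estimator's relative error is enormous.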
By:  Nelson Kyakutwika; Bruce Bartlett 
Abstract:  Cross-series dependencies are crucial to obtaining accurate forecasts of a multivariate time series. Simultaneous Graphical Dynamic Linear Models (SGDLMs) are Bayesian models that elegantly capture cross-series dependencies. This study forecasts returns of a 40-dimensional time series of stock data from the Johannesburg Stock Exchange (JSE) using SGDLMs. The SGDLM approach involves constructing a customised dynamic linear model (DLM) for each univariate time series. At each time point, the DLMs are recoupled using importance sampling and decoupled using mean-field variational Bayes. Our results suggest that SGDLMs forecast stock data on the JSE accurately and respond effectively to market gyrations. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.08665&r=ecm 
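Each univariate DLM in an SGDLM runs standard forward filtering; the decouple/recouple machinery sits on top of updates like this local-level step. The variances V and W below are assumed for illustration.

```python
def dlm_update(m, C, y, V=1.0, W=0.5):
    """One filtering step of a univariate local-level DLM:
        state  x_t = x_{t-1} + w_t,  w_t ~ N(0, W)
        obs    y_t = x_t + v_t,      v_t ~ N(0, V)
    (m, C) is the posterior mean/variance for x_{t-1}; returns the
    updated posterior (m, C) for x_t and the one-step forecast f."""
    a, R = m, C + W        # evolve: prior for x_t
    f, Q = a, R + V        # one-step-ahead forecast mean and variance
    A = R / Q              # adaptive coefficient (Kalman gain)
    m_new = a + A * (y - f)
    C_new = R - A * A * Q  # = R * (1 - A)
    return m_new, C_new, f

m, C, f = dlm_update(0.0, 1.0, y=2.0)
```

In an SGDLM, each series additionally regresses on a sparse set of contemporaneous "parent" series, which is where the cross-series dependence enters.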
By:  Philipp Gersing; Christoph Rust; Manfred Deistler 
Abstract:  Factor sequences are stochastic double sequences $(y_{it}: i \in \mathbb N, t \in \mathbb Z)$, indexed in time and the cross-section, which have a so-called factor structure. The name was coined by Forni et al. (2001), who introduced dynamic factor sequences. We show the difference between dynamic factor sequences and static factor sequences, the most common workhorse model of econometric factor analysis, building on Chamberlain and Rothschild (1983), Stock and Watson (2002), and Bai and Ng (2002). The difference consists in what we call the weak common component, which is spanned by a potentially infinite number of weak factors. Ignoring the weak common component can have substantial consequences for applications of factor models in structural analysis and forecasting. We also show that the dynamic common component of a dynamic factor sequence is causally subordinated to the output under general conditions. As a consequence, only the dynamic common component can be interpreted as the projection on the common structural shocks of the economy, whereas the static common component models the contemporaneous comovement. 
Date:  2023–07 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:2307.10067&r=ecm 
By:  Kurt Graden Lunsford; Kenneth D. West 
Abstract:  We study the use of a zero-mean first-difference model to forecast the level of a scalar time series that is stationary in levels. Let bias be the average value of a series of forecast errors. Then the bias of forecasts from a misspecified ARMA model for the first difference of the series will tend to be smaller in magnitude than the bias of forecasts from a correctly specified model for the level of the series. Formally, let P be the number of forecasts. Then the bias from the first-difference model has expectation zero and a variance that is O(1/P^2), while the variance of the bias from the levels model is generally O(1/P). With a driftless random walk as our first-difference model, we confirm this theoretical result with simulations and empirical work: random-walk bias is generally one-tenth to one-half that of an appropriately specified model fit to levels. 
Keywords:  ARMA Models; Overdifferenced; Prediction; Macroeconomic Time Series; Simulation 
JEL:  C22 C53 E37 E47 
Date:  2023–08–03 
URL:  http://d.repec.org/n?u=RePEc:fip:fedcwq:96521&r=ecm 
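The O(1/P^2) result rests on a telescoping identity: random-walk forecast errors are first differences, so their average collapses to two endpoint terms. A quick check on simulated AR(1) data (rho = 0.9 is an illustrative choice):

```python
import random

def rw_average_bias(y, T, P):
    """Average error of driftless random-walk forecasts yhat_t = y_{t-1}
    over t = T+1, ..., T+P. The errors telescope:
        (1/P) * sum_t (y_t - y_{t-1}) = (y_{T+P} - y_T) / P,
    which is O_p(1/P) for a stationary series, i.e. bias variance O(1/P^2)."""
    errors = [y[t] - y[t - 1] for t in range(T + 1, T + P + 1)]
    return sum(errors) / P

# Stationary AR(1) levels series.
rng = random.Random(1)
y = [0.0]
for _ in range(500):
    y.append(0.9 * y[-1] + rng.gauss(0, 1))

bias = rw_average_bias(y, T=100, P=400)
```

A levels model with estimated parameters leaves an O_p(1/sqrt(P)) term in the average error, hence the O(1/P) bias variance the abstract contrasts with.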
By:  Lutz Kilian 
Abstract:  It is common in applied work to estimate responses of macroeconomic aggregates to news shocks derived from surprise changes in daily futures prices around the date of policy announcements. This requires mapping the daily surprises into a monthly shock that may be used as an external instrument in a monthly VAR model or local projection. The standard approach has been to sum these daily surprises over the course of a given month when constructing the monthly proxy variable, ignoring the accounting relationship between daily and average monthly price data. In this paper, I provide a new approach to constructing monthly proxies from daily surprises that takes account of this link and revisit the question of how to use OPEC announcements to identify news shocks in VAR models of the global oil market. The proposed approach calls into question the interpretation of the identified shock as oil supply news and implies quantitatively and qualitatively different estimates of the macroeconomic impact of OPEC announcements. 
Keywords:  Proxy VAR; instrumental variables; shock aggregation; time aggregation; identification; OPEC; supply news; storage demand; oil futures; oil price expectations 
JEL:  C36 C51 E31 E32 E44 Q43 
Date:  2023–07–31 
URL:  http://d.repec.org/n?u=RePEc:fip:feddwp:96517&r=ecm 
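The accounting point can be illustrated with a toy aggregator. The remaining-days weighting below is one way to respect the link between daily surprises and a monthly *average* price; it illustrates the logic the abstract points to and is not necessarily the paper's exact formula.

```python
def proxy_sum(surprises):
    """Standard construction: sum all daily surprises within the month.
    surprises is a list of (day_of_month, surprise) pairs."""
    return sum(s for _, s in surprises)

def proxy_average_consistent(surprises, days_in_month):
    """If the monthly series is an AVERAGE of daily prices, a surprise on
    day d moves only the (D - d + 1) remaining daily observations, which
    suggests the weight (D - d + 1) / D. Illustrative weighting only."""
    D = days_in_month
    return sum((D - d + 1) / D * s for d, s in surprises)

# Two equal surprises, one early and one late in a 30-day month:
daily = [(5, 1.0), (25, 1.0)]
naive = proxy_sum(daily)
adjusted = proxy_average_consistent(daily, days_in_month=30)
```

Under the averaging view, a late-month surprise moves the monthly average far less than an early-month one, which the simple sum ignores.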
By:  Bournakis, Ioannis; Tsionas, Mike G. 
Abstract:  We develop a nonparametric technique to measure Total Factor Productivity (TFP). Our paper has two major novelties in estimating the production function. First, we propose a productivity model with both idiosyncratic firm factors and aggregate shocks within the same framework. Second, we apply Bayesian Markov Chain Monte Carlo (MCMC) estimation techniques to overcome restrictions associated with monotonicity between productivity and variable inputs and with moment conditions in identifying input parameters. We implement our methodology on a group of 4286 manufacturing firms from France, Germany, Italy, and the United Kingdom (2001–2014). The results show that: (i) aggregate shocks matter for firm TFP evolution; the global financial crisis of 2008 caused severe adverse effects on TFP, albeit short in duration; (ii) there is substantial heterogeneity across countries in the way firms react to changes in R&D and taxation: German and U.K. firms are more sensitive to fiscal changes than to R&D, Italian firms are the opposite, and R&D and taxation effects are symmetrical for French firms; (iii) the U.K. productivity handicap continued for years after the financial crisis; (iv) industrial clusters promote knowledge diffusion among German and Italian firms. 
Keywords:  Total Factor Productivity (TFP), Control Function, Nonparametric Bayesian Estimation, Markov Chain Monte Carlo (MCMC), Research and Development (R&D), Taxation, European firms 
JEL:  C11 D24 H21 H25 Q55 
Date:  2023–07–21 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:118100&r=ecm 
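The MCMC machinery can be illustrated by its basic building block, a random-walk Metropolis sampler. The target below is a toy log-posterior, not the paper's production-function model.

```python
import math, random

def metropolis(log_post, x0, n_draws, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' = x + N(0, step^2) and accept with
    probability min(1, post(x') / post(x)). This is the generic MCMC kernel
    from which samplers for richer models (like the abstract's) are built."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    draws = []
    for _ in range(n_draws):
        prop = x + rng.gauss(0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop  # accept
        draws.append(x)            # on rejection, repeat the current state
    return draws

# Toy target: a N(3, 1) posterior; the chain's mean should be near 3.
draws = metropolis(lambda x: -0.5 * (x - 3.0) ** 2, x0=3.0, n_draws=20_000)
post_mean = sum(draws) / len(draws)
```

Because acceptance needs only posterior ratios, no monotonicity restrictions or moment conditions are required for the sampler itself, which is the flexibility the abstract exploits.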
By:  Irsova, Zuzana; Doucouliagos, Hristos; Havranek, Tomas; Stanley, T. D. 
Abstract:  This paper provides concise, nontechnical, step-by-step guidelines on how to conduct a modern meta-analysis, especially in the social sciences. We treat publication bias, p-hacking, and heterogeneity as phenomena meta-analysts must always confront. To this end, we provide concrete methodological recommendations. Meta-analysis methods have advanced notably over the last few years. Yet many meta-analyses still rely on outdated approaches, some ignoring publication bias and systematic heterogeneity. While limitations persist, recently developed techniques allow robust inference even in the face of formidable problems in the underlying empirical literature. The purpose of this paper is to summarize the state of the art in a way accessible to aspiring meta-analysts in any field. We also discuss how meta-analysts can use advances in artificial intelligence to work more efficiently. 
Keywords:  meta-analysis, publication bias, p-hacking, artificial intelligence, model uncertainty 
JEL:  A14 B49 C83 
Date:  2023 
URL:  http://d.repec.org/n?u=RePEc:zbw:esprep:273719&r=ecm 
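A canonical publication-bias check that such guidelines build on is the FAT-PET regression of reported estimates on their standard errors. A bare-bones OLS version (the usual practice refines this with inverse-variance weighting):

```python
def ols(x, y):
    """Simple OLS of y on a constant and x; returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return my - slope * mx, slope

def fat_pet(effects, std_errors):
    """FAT-PET: regress reported effects on their standard errors.
    A nonzero slope signals funnel asymmetry (selective reporting);
    the intercept estimates the effect corrected for that selection."""
    return ols(std_errors, effects)

# Effects that grow mechanically with their standard errors: textbook asymmetry,
# with a bias-corrected effect of zero.
intercept, slope = fat_pet([0.1, 0.2, 0.3, 0.4], [0.05, 0.10, 0.15, 0.20])
```

In practice the regression is run with inverse-variance weights (PET-PEESE replaces the standard error with its square), plus the heterogeneity and p-hacking diagnostics the paper surveys.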