on Econometrics |
By: | Liang Chen; Yulong Huo |
Abstract: | Canay (2011)'s two-step estimator of quantile panel data models, due to its simple intuition and low computational cost, has been widely used in empirical studies in recent years. In this paper, we revisit the estimator of Canay (2011) and point out that his asymptotic analysis mistakenly omits the bias of the estimator arising from the estimation of the fixed effects, and that this omission leads to invalid inference on the coefficients. To solve this problem, we propose a similarly easy-to-implement estimator based on smoothed quantile regressions. The asymptotic distribution of the new estimator is established and the analytical expression of its asymptotic bias is derived. Based on these results, we show how to conduct asymptotically valid inference using both analytical and split-panel jackknife bias corrections (a minimal sketch of the latter follows this entry). Finally, finite-sample simulations support our theoretical analysis and illustrate the importance of bias correction in quantile regressions for panel data. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.04729&r=all |
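The split-panel jackknife correction mentioned above has a simple generic form. Below is a minimal sketch assuming an O(1/T) incidental-parameter bias, with pooled OLS standing in for the inner estimator; it is not the authors' smoothed-QR estimator, and the function `split_panel_jackknife` and the toy DGP are illustrative inventions.

```python
import numpy as np

def split_panel_jackknife(y, x, estimate):
    """y, x: (N, T) panels; estimate: callable (y, x) -> float."""
    T = y.shape[1]
    half = T // 2
    theta_full = estimate(y, x)
    theta_1 = estimate(y[:, :half], x[:, :half])  # first half of the time span
    theta_2 = estimate(y[:, half:], x[:, half:])  # second half
    # An O(1/T) bias doubles in each half-panel, so this combination
    # removes it to first order.
    return 2.0 * theta_full - 0.5 * (theta_1 + theta_2)

# Toy usage with pooled (no-intercept) OLS as the inner estimator.
rng = np.random.default_rng(0)
N, T = 200, 20
x = rng.normal(size=(N, T))
y = 1.5 * x + rng.normal(size=(N, T))
ols = lambda y, x: float((x.ravel() @ y.ravel()) / (x.ravel() @ x.ravel()))
print(split_panel_jackknife(y, x, ols))
```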
By: | Sladana Babic; Laetitia Gelbgras; Marc Hallin; Christophe Ley |
Abstract: | Although the assumption of elliptical symmetry is quite common in multivariate analysis and widespread in a number of applications, the problem of testing the null hypothesis of ellipticity has so far not been addressed in a fully satisfactory way. Most of the literature in the area addresses the null hypothesis of elliptical symmetry with specified location, and actually targets location rather than non-elliptical alternatives. In this paper, we propose new classes of testing procedures, both for specified and unspecified location. The backbone of our construction is Le Cam’s asymptotic theory of statistical experiments, and optimality is to be understood locally and asymptotically within the family of generalized skew-elliptical distributions. The tests we propose meet all the desired properties of a “good” test of elliptical symmetry: they have a simple asymptotic distribution under the entire null hypothesis of elliptical symmetry with unspecified radial density and shape parameter; they are affine-invariant, computationally fast, intuitively understandable, and not too demanding in terms of moments. While achieving optimality against generalized skew-elliptical alternatives, they remain quite powerful under a much broader class of non-elliptical distributions and significantly outperform the available competitors. (A classical baseline skewness test is sketched after this entry.) |
Keywords: | Elliptical symmetry; local asymptotic normality; maximin tests; multivariate skewness; semiparametric inference; skew-elliptical densities |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:eca:wpaper:2013/295909&r=all |
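As a point of reference for the entry above, here is Mardia's (1970) classical affine-invariant multivariate skewness test, coded as a baseline. It is not the authors' procedure: its chi-squared calibration is exact only under multivariate normality rather than under the full elliptical null, which is one reason purpose-built ellipticity tests are needed.

```python
import numpy as np
from scipy.stats import chi2

def mardia_skewness_test(x):
    """x: (n, p) data. Returns (test statistic, asymptotic p-value)."""
    n, p = x.shape
    xc = x - x.mean(axis=0)
    s_inv = np.linalg.inv(np.cov(x, rowvar=False, bias=True))
    g = xc @ s_inv @ xc.T              # Mahalanobis cross-products g_ij
    b1p = (g**3).sum() / n**2          # sample multivariate skewness b_{1,p}
    stat = n * b1p / 6.0               # chi2 with p(p+1)(p+2)/6 df under normality
    df = p * (p + 1) * (p + 2) / 6.0
    return stat, chi2.sf(stat, df)

rng = np.random.default_rng(1)
print(mardia_skewness_test(rng.normal(size=(500, 3))))     # elliptical: large p
print(mardia_skewness_test(rng.lognormal(size=(500, 3))))  # skewed: tiny p
```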
By: | Tauchmann, Harald |
Abstract: | This paper shows that popular linear fixed-effects panel-data estimators (first-differences, within-transformation) are biased and inconsistent when applied in a discrete-time hazard setting, that is, one with the outcome variable being a binary dummy indicating an absorbing state, even if the data-generating process is fully consistent with the linear discrete-time hazard model. Besides conventional survival bias, these estimators suffer from another, potentially severe, source of bias that originates from the data transformation itself and is present even in the absence of any unobserved heterogeneity (a small simulation illustrating this follows this entry). We suggest an alternative, computationally very simple, adjusted first-differences estimator that cures the data-transformation-driven bias of the classical estimators. The theoretical line of argument is supported by evidence from Monte Carlo simulations and is illustrated by an empirical application. |
Keywords: | linear probability model, individual fixed effects, short panel, discrete-time hazard, duration analysis, survival analysis, non-repeated event, absorbing state, survival bias, mis-scaling bias |
JEL: | C23 C25 C41 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:zbw:iwqwdp:092019&r=all |
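An illustrative Monte Carlo in the spirit of the entry above: data are generated from a linear discrete-time hazard with no unobserved heterogeneity, yet the pooled first-differences slope on the risk set drifts away from the true hazard coefficient. The DGP and parameter values are made up, and the paper's adjusted estimator is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, a, b = 100_000, 8, 0.1, 0.2         # hazard P(event) = a + b * x in (0, 1)
x = rng.uniform(0.0, 1.0, size=(N, T))
y = np.zeros((N, T))
y[:, 0] = rng.uniform(size=N) < a + b * x[:, 0]
for t in range(1, T):
    event = rng.uniform(size=N) < a + b * x[:, t]
    y[:, t] = np.maximum(y[:, t - 1], event)   # absorbing: once 1, always 1

at_risk = y[:, :-1] == 0                       # unit still at risk entering t
dy = np.diff(y, axis=1)[at_risk]               # = 1{event at t} on the risk set
dx = np.diff(x, axis=1)[at_risk]
dxc = dx - dx.mean()
b_fd = dxc @ (dy - dy.mean()) / (dxc @ dxc)    # pooled first-differences slope
print(f"true hazard slope b = {b}, pooled FD estimate = {b_fd:.3f}")
```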
By: | Fabio Franco (University of Rome "Tor Vergata") |
Abstract: | Particle Markov Chain Monte Carlo (PMCMC) is a widely used method for estimation problems in the context of nonlinear structural dynamic models whose likelihood function is analytically intractable. PMCMC can be constructed upon a GMM likelihood representation when one does not want to rely on the structural form of the measurement equation (Gallant et al., 2016); it only requires computing moment conditions available from the structural model. However, the particle filter with GMM may suffer from high degeneracy of the particle weights, which severely affects the accuracy of the Monte Carlo approximations and, in turn, of the Markov Chain Monte Carlo estimates (the degeneracy diagnostic is sketched after this entry). This work revises the particle GMM algorithm proposed in Gallant et al. in order to reduce the depletion problem. Estimation results for stochastic volatility models show that the efficient block sampling strategy proposed in Doucet et al. (2006) can outperform particle GMM and in turn deliver more reliable MCMC estimates. The auxiliary particle filter (Doucet et al., 2011) is also considered as an alternative to the block sampling approach; however, in the intended experiments it does not seem to be very effective. Thus, some of the assumptions needed to estimate structural nonlinear state-space models can be weakened, requiring only available moment conditions, without dramatically affecting the conclusions. |
Keywords: | Particle filter, Kalman filter, MCMC, Generalized Method of Moments, state space, nonlinear structural dynamic model, stochastic volatility |
JEL: | C4 C8 |
Date: | 2019–11–18 |
URL: | http://d.repec.org/n?u=RePEc:rtv:ceisrp:473&r=all |
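A minimal bootstrap particle filter for a basic stochastic-volatility model, tracking the effective sample size (ESS) that the degeneracy discussion above refers to. This is a generic textbook filter, not the particle-GMM or block-sampling algorithms studied in the paper; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)
T, M = 200, 1000                       # time steps, particles
mu, phi, sig = -1.0, 0.95, 0.3         # SV parameters (hypothetical)

# Simulate data: x_t latent log-volatility, y_t observed return.
x = np.empty(T); x[0] = mu
for t in range(1, T):
    x[t] = mu + phi * (x[t - 1] - mu) + sig * rng.normal()
y = np.exp(x / 2) * rng.normal(size=T)

# Bootstrap filter: propagate from the transition, weight by the measurement.
particles = rng.normal(mu, sig / np.sqrt(1 - phi**2), size=M)  # stationary init
loglik, min_ess = 0.0, float(M)
for t in range(T):
    particles = mu + phi * (particles - mu) + sig * rng.normal(size=M)
    # log of the N(0, exp(x_t)) measurement density evaluated at y_t
    logw = -0.5 * (np.log(2 * np.pi) + particles + y[t]**2 * np.exp(-particles))
    shift = logw.max()
    loglik += shift + np.log(np.mean(np.exp(logw - shift)))
    w = np.exp(logw - shift); w /= w.sum()
    ess = 1.0 / np.sum(w**2)           # effective sample size at time t
    min_ess = min(min_ess, ess)
    particles = particles[rng.choice(M, size=M, p=w)]  # multinomial resampling
print(f"log-likelihood approx {loglik:.1f}, worst ESS = {min_ess:.0f} of {M}")
```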
By: | Richard Y. Chen |
Abstract: | This paper presents nonparametric inference for nonlinear volatility functionals of general multivariate Itô semimartingales in a high-frequency and noisy setting. Pre-averaging and truncation enable simultaneous handling of noise and jumps (the truncation ingredient is sketched in isolation after this entry). A second-order expansion reveals explicit biases and a pathway to bias correction. Estimators based on this framework achieve the optimal convergence rate, and a class of stable central limit theorems is attained with estimable asymptotic covariance matrices. This paper forms a basis for infill asymptotic results for, among others, the realized Laplace transform, realized principal component analysis, continuous-time linear regression, and the generalized method of integrated moments, and hence helps extend their application scope to more frequently sampled noisy data. |
Date: | 2018–10 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1810.04725&r=all |
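One ingredient named in the abstract above, shown in isolation: truncation (a Mancini-type threshold) to strip jumps when estimating integrated variance from high-frequency returns. The pre-averaging step for microstructure noise is omitted, and the threshold constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(15)
n = 23400                                 # one day of 1-second returns
dt = 1.0 / n
sigma = 0.2                               # spot vol (pre-estimated in practice)
r = sigma * np.sqrt(dt) * rng.normal(size=n)
jump_times = rng.uniform(size=n) < 5e-4   # a handful of rare jumps
r = r + jump_times * rng.normal(0.0, 0.02, size=n)

u = 4 * sigma * dt**0.47                  # threshold c * dt**w with w < 1/2
rv = np.sum(r**2)                         # plain RV: contaminated by jumps
trv = np.sum(r**2 * (np.abs(r) <= u))     # truncated RV: jumps stripped
print(f"true IV = {sigma**2:.4f}, RV = {rv:.4f}, truncated RV = {trv:.4f}")
```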
By: | Taylor, Marshall A. |
Abstract: | Coefficient plots are a popular tool for visualizing regression estimates. The appeal of these plots is that they visualize confidence intervals around the estimates and generally center the plot around zero, so that any estimate crossing zero is statistically non-significant at least at the alpha-level around which the confidence intervals are constructed. For models whose statistical significance is determined via randomization inference, and for which there is no standard error or confidence interval for the estimate itself, these plots are less useful. In this paper, I illustrate a variant of the coefficient plot for regression models with p-values constructed using permutation tests. These visualizations plot each estimate's p-value and its associated confidence interval in relation to a specified alpha-level (the plotted quantities are sketched after this entry). These plots can help the analyst interpret and report both the statistical and substantive significance of their models. Illustrations are provided using a nonprobability sample of activists and participants at a 1962 anti-Communism school. |
Date: | 2019–11–08 |
URL: | http://d.repec.org/n?u=RePEc:osf:socarx:bsd7g&r=all |
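A sketch of the quantities such a plot displays: a permutation p-value for a regression slope together with a Clopper-Pearson interval for the p-value itself, reflecting Monte Carlo error in the permutation draws. This is a generic construction on simulated data, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import beta

rng = np.random.default_rng(4)
n, B, alpha = 80, 2000, 0.05
x = rng.normal(size=n)
y = 0.25 * x + rng.normal(size=n)            # hypothetical DGP

def slope(x, y):
    xc = x - x.mean()
    return xc @ (y - y.mean()) / (xc @ xc)

t_obs = abs(slope(x, y))
k = sum(abs(slope(x, rng.permutation(y))) >= t_obs for _ in range(B))

p_hat = (k + 1) / (B + 1)                    # standard permutation p-value
lo = beta.ppf(alpha / 2, k, B - k + 1) if k > 0 else 0.0
hi = beta.ppf(1 - alpha / 2, k + 1, B - k) if k < B else 1.0
print(f"p = {p_hat:.4f}, {1 - alpha:.0%} CI for p = [{lo:.4f}, {hi:.4f}]")
```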
By: | Emmanuelle Jay (Fidéas Capital, Quanted & Europlace Institute of Finance); Thibault Soler (Fidéas Capital and Centre d'Economie de la Sorbonne); Jean-Philippe Ovarlez (DEMR, ONERA - Université Paris-Saclay); Philippe De Peretti (Centre d'Economie de la Sorbonne - Université Paris 1 Panthéon-Sorbonne; https://centredeconomiesorbonne.univ-paris1.fr); Christophe Chorro (Centre d'Economie de la Sorbonne - Université Paris 1 Panthéon-Sorbonne; https://centredeconomiesorbonne.univ-paris1.fr) |
Abstract: | This paper presents how recent improvements in covariance matrix estimation and model order selection can be applied to the portfolio optimization problem. Our study is based on the case of the Maximum Variety Portfolio and can be extended to other classical frameworks with analogous results. We emphasize that the assets should preferably be classified into homogeneous groups before applying the proposed methodology, which is to whiten the data before estimating the covariance matrix using the robust Tyler M-estimator and Random Matrix Theory (RMT) (the Tyler fixed-point iteration is sketched after this entry). The proposed procedure is applied and compared to standard techniques on real market data, showing promising improvements. |
Keywords: | Robust Covariance Matrix Estimation; Model Order Selection; Random Matrix Theory; Portfolio Optimization; Elliptical Symmetric Noise |
JEL: | C5 G11 |
Date: | 2019–10 |
URL: | http://d.repec.org/n?u=RePEc:mse:cesdoc:19023&r=all |
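The robust covariance ingredient named above, as a standard fixed-point iteration for Tyler's M-estimator of scatter (trace-normalized). The RMT-based whitening and model-order-selection steps of the paper are not reproduced.

```python
import numpy as np

def tyler_scatter(x, iters=100, tol=1e-8):
    """x: (n, p) centered data. Returns the (p, p) Tyler scatter estimate."""
    n, p = x.shape
    sigma = np.eye(p)
    for _ in range(iters):
        inv = np.linalg.inv(sigma)
        d = np.einsum('ij,jk,ik->i', x, inv, x)   # x_i' Sigma^{-1} x_i
        new = (p / n) * (x.T * (1.0 / d)) @ x     # reweighted outer products
        new *= p / np.trace(new)                  # fix the (arbitrary) scale
        if np.max(np.abs(new - sigma)) < tol:
            return new
        sigma = new
    return sigma

rng = np.random.default_rng(5)
z = rng.standard_t(df=3, size=(1000, 4)) @ np.diag([1.0, 2.0, 1.0, 0.5])
print(np.round(tyler_scatter(z - z.mean(axis=0)), 2))   # heavy tails, no problem
```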
By: | Giuseppe Brandi; Ruggero Gramatica; Tiziana Di Matteo |
Abstract: | Portfolio allocation and risk management make use of correlation matrices and heavily rely on the choice of a proper correlation matrix. In this regard, one important question concerns the choice of the sample period over which to estimate a stable correlation matrix. This paper addresses this question and proposes a new methodology, based on tensor factorization techniques, for estimating a correlation matrix that does not depend on the chosen sample period. In particular, by combining and normalizing factor components, we build a correlation matrix which shows emerging structural dependency properties not affected by the sample period (this normalization step is sketched after this entry). To retrieve the factor components, we propose a new tensor decomposition, which we name Slice-Diagonal Tensor (SDT) factorization, and compare it to the two most widely used tensor decompositions, the Tucker and the PARAFAC. The new factorization is more parsimonious than the Tucker decomposition and more flexible than the PARAFAC. Moreover, applied to both simulated and empirical data, the methodology yields results that are robust to two non-parametric tests, namely the Kruskal-Wallis and Kolmogorov-Smirnov tests. Since the resulting correlation matrix features stability and emerging structural dependency properties, it can be used as an alternative to other correlation measures, including the Pearson correlation. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.06126&r=all |
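A sketch of the final step the abstract describes, combining factor components into a correlation matrix by normalizing to unit diagonal. Loadings from an ordinary truncated SVD stand in for the paper's Slice-Diagonal Tensor factorization, which is not reproduced here.

```python
import numpy as np

def loadings_to_correlation(A):
    """A: (p, k) factor loadings. Normalize A @ A.T to unit diagonal."""
    c = A @ A.T
    d = np.sqrt(np.diag(c))
    return c / np.outer(d, d)

rng = np.random.default_rng(6)
returns = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10))  # toy panel
u, s, vt = np.linalg.svd(returns, full_matrices=False)
A = vt[:3].T * s[:3]          # k = 3 components as stand-in loadings
print(np.round(loadings_to_correlation(A), 2))
```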
By: | Ruoxuan Xiong; Susan Athey; Mohsen Bayati; Guido Imbens |
Abstract: | Experimentation has become an increasingly prevalent tool for guiding policy choices, firm decisions, and product innovation. A common hurdle in designing experiments is the lack of statistical power. In this paper, we study optimal multi-period experimental design under the constraint that the treatment cannot be easily removed once implemented; for example, a government or firm might implement treatment in different geographies at different times, where the treatment cannot be easily removed due to practical constraints. The design problem is to select which units to treat at which time, intending to test hypotheses about the effect of the treatment. When the potential outcome is a linear function of a unit effect, a time effect, and observed discrete covariates, we provide an analytically feasible solution to the design problem where the variance of the estimator for the treatment effect is at most 1+O(1/N^2) times the variance of the optimal design, where N is the number of units. This solution assigns units in a staggered treatment adoption pattern, where the proportion treated is a linear function of time (see the sketch after this entry). In the general setting where outcomes depend on latent covariates, we show that historical data can be utilized in the optimal design. We propose a data-driven local search algorithm with the minimax decision criterion to assign units to treatment times. We demonstrate that our approach improves upon benchmark experimental designs through synthetic experiments on real-world data sets from several domains, including healthcare, finance, and retail. Finally, we consider the case where the treatment effect changes with the time of treatment, showing that the optimal design treats a smaller fraction of units at the beginning and a greater share at the end. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.03764&r=all |
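A minimal constructor for the staggered assignment pattern described above, with the treated proportion linear in time and treatment irreversible. This is purely illustrative: the normalization (full treatment by the last period) and the randomized adoption order are our choices, not the authors' full algorithm.

```python
import numpy as np

def staggered_linear_design(N, T, seed=7):
    """(N, T) 0/1 assignment; once treated (1), a unit stays treated."""
    share = np.arange(1, T + 1) / T              # treated share, linear in time
    n_treated = np.round(share * N).astype(int)
    order = np.random.default_rng(seed).permutation(N)  # random adoption order
    adopt = np.zeros((N, T), dtype=int)
    for t in range(T):
        adopt[order[:n_treated[t]], t] = 1       # monotone, hence irreversible
    return adopt

D = staggered_linear_design(N=10, T=5)
print(D)                  # rows: units; columns: periods
print(D.mean(axis=0))     # treated share rises linearly: 0.2, 0.4, ..., 1.0
```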
By: | Yixiao Sun; Xuexin Wang |
Abstract: | This study proposes a simple, trustworthy Chow test in the presence of heteroscedasticity and autocorrelation. The test is based on a series heteroscedasticity and autocorrelation robust variance estimator with judiciously crafted basis functions. Like the Chow test in a classical normal linear regression (sketched after this entry), the proposed test employs the standard F distribution as the reference distribution, which is justified under fixed-smoothing asymptotics. Monte Carlo simulations show that the null rejection probability of the asymptotic F test is closer to the nominal level than that of the chi-square test. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.03771&r=all |
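For reference, the classical Chow test that the proposal generalizes: an F test for a break at a known date under homoskedastic, serially uncorrelated errors. The paper's HAR variance estimator and fixed-smoothing calibration are not reproduced in this baseline.

```python
import numpy as np
from scipy.stats import f

def chow_test(y, X, split):
    """F test of equal coefficients before/after a known break date `split`."""
    k = X.shape[1]
    ssr = lambda y, X: float(y @ y - y @ X @ np.linalg.solve(X.T @ X, X.T @ y))
    ssr_pooled = ssr(y, X)                                  # restricted
    ssr_split = ssr(y[:split], X[:split]) + ssr(y[split:], X[split:])
    n = len(y)
    stat = ((ssr_pooled - ssr_split) / k) / (ssr_split / (n - 2 * k))
    return stat, f.sf(stat, k, n - 2 * k)

rng = np.random.default_rng(8)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)
y[n // 2:] += 0.8 * X[n // 2:, 1]        # slope shifts at mid-sample
print(chow_test(y, X, n // 2))           # (F statistic, p-value)
```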
By: | Cristina Gualdani; Shruti Sinha |
Abstract: | In this paper we study identification and inference of preference parameters in a single-agent, static, discrete choice model where the decision maker may face attentional limits precluding her from exhaustively processing information about the payoffs of the available alternatives. By leveraging the notion of one-player Bayesian Correlated Equilibrium in Bergemann and Morris (2016), we provide a tractable characterisation of the sharp identified set and discuss inference under minimal assumptions on the amount of information processed by the decision maker and under no assumptions on the rule with which the decision maker resolves ties. Simulations reveal that the obtained bounds on the preference parameters can be tight in several settings of empirical interest. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.04529&r=all |
By: | Mirela Miescu; Haroon Mumtaz |
Abstract: | We show that the contemporaneous and longer-horizon impulse responses estimated using small-scale Proxy structural vector autoregressions (SVARs) can be severely biased in the presence of information insufficiency. Instead, we recommend the use of a Proxy Factor Augmented VAR (FAVAR) model that remains robust in the presence of this problem (the shared external-instrument identification step is sketched after this entry). In an empirical exercise, we demonstrate that this issue has important consequences for the estimated impact of monetary policy shocks in the US. We find that the impulse responses of real activity and prices estimated using a Proxy FAVAR are substantially larger and more persistent than those suggested by a small-scale Proxy SVAR. |
Keywords: | information sufficiency, dynamic factor models, instrumental variables, monetary policy, structural VAR |
JEL: | C36 C38 E52 |
Date: | 2019 |
URL: | http://d.repec.org/n?u=RePEc:lan:wpaper:280730188&r=all |
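The external-instrument step shared by Proxy SVARs and the Proxy FAVAR, in a stylized form: the impact response of each variable, relative to the instrumented shock, is recovered from the covariance of reduced-form residuals with the proxy. Residuals and instrument are simulated here; the FAVAR estimation itself is not attempted.

```python
import numpy as np

rng = np.random.default_rng(9)
T = 400
eps = rng.normal(size=(T, 3))                # structural shocks (unobserved)
B0 = np.array([[ 1.0, 0.0, 0.0],
               [ 0.5, 1.0, 0.0],
               [-0.3, 0.2, 1.0]])
u = eps @ B0.T                               # reduced-form residuals u_t = B0 eps_t
z = eps[:, 0] + 0.5 * rng.normal(size=T)     # proxy: relevant and exogenous

# E[u_t z_t] is proportional to the first column of B0 (the impact vector),
# so normalizing on the instrumented variable identifies relative responses.
cov_uz = u.T @ z / T
impact = cov_uz / cov_uz[0]
print("estimated impact column:", np.round(impact, 2))
print("true normalized column: ", B0[:, 0] / B0[0, 0])
```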
By: | Brendan K. Beare; Juwon Seo |
Abstract: | New nonparametric tests of copula exchangeability and radial symmetry are proposed. The novel aspect of the tests is a resampling procedure that exploits group invariance conditions associated with the relevant symmetry hypothesis (a naive version of this idea is sketched after this entry). They may be viewed as feasible versions of randomization tests of symmetry, the latter being inapplicable due to the unobservability of margins. Our tests are simple to compute, control size asymptotically, consistently detect arbitrary forms of asymmetry, and do not require the specification of a tuning parameter. Simulations indicate excellent small-sample properties compared to existing procedures involving the multiplier bootstrap. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.05307&r=all |
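A naive version of the group-invariance resampling idea described above: under exchangeability, swapping the two coordinates of each observation leaves the copula unchanged. This sketch ignores the rank (margin-estimation) effects that the paper's feasible procedure is designed to handle, so it conveys the intuition only.

```python
import numpy as np

def exch_stat(u):
    """Sup distance between C_n(s, t) and C_n(t, s) over the sample grid."""
    a = (u[:, None, 0] <= u[None, :, 0]) & (u[:, None, 1] <= u[None, :, 1])
    b = (u[:, None, 1] <= u[None, :, 0]) & (u[:, None, 0] <= u[None, :, 1])
    return np.max(np.abs(a.mean(axis=0) - b.mean(axis=0)))

rng = np.random.default_rng(10)
n, B = 300, 200
x = rng.normal(size=n)
y = 0.7 * x + np.abs(rng.normal(size=n))           # asymmetric toy dependence
rank = lambda v: np.argsort(np.argsort(v)) + 1
u = np.column_stack([rank(x), rank(y)]) / (n + 1)  # pseudo-observations

t_obs = exch_stat(u)
count = 0
for _ in range(B):
    swap = rng.random(n) < 0.5
    v = u.copy()
    v[swap] = v[swap][:, ::-1]       # coordinate swap: the null invariance
    count += exch_stat(v) >= t_obs
print(f"stat = {t_obs:.3f}, resampling p-value = {(count + 1) / (B + 1):.3f}")
```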
By: | Martin Bruns (University of East Anglia) |
Abstract: | Structural VAR models require two ingredients: (i) informational sufficiency, and (ii) a valid identification strategy. These conditions are unlikely to be met by small-scale recursively identified VAR models. I propose a Bayesian Proxy Factor-Augmented VAR (BP-FAVAR) to combine a large information set with an identification scheme based on an external instrument (the factor-extraction step is sketched after this entry). In an application to monetary policy shocks I find that augmenting a standard small-scale Proxy VAR by factors from a large set of financial variables changes the model dynamics and delivers price responses which are more in line with economic theory. A second application shows that an exogenous increase in uncertainty affects disaggregated investment series more negatively than consumption series. |
Keywords: | Dynamic factor models, external instruments, monetary policy, uncertainty shocks |
JEL: | C38 E60 |
Date: | 2019–08–16 |
URL: | http://d.repec.org/n?u=RePEc:uea:ueaeco:2019_03&r=all |
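The factor-augmentation step common to FAVAR-type models, sketched on simulated data: extract a few principal-component factors from a large standardized panel, which would then enter the VAR alongside observed variables. The paper's Bayesian estimation and proxy identification are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(14)
T, n, r = 300, 120, 3
F = rng.normal(size=(T, r))               # latent factors
L = rng.normal(size=(n, r))               # loadings
X = F @ L.T + rng.normal(size=(T, n))     # large information set

Xs = (X - X.mean(axis=0)) / X.std(axis=0) # standardize each series
u, s, vt = np.linalg.svd(Xs, full_matrices=False)
factors = u[:, :r] * np.sqrt(T)           # principal-component factor estimates
explained = (s[:r]**2).sum() / (s**2).sum()
print(f"share of panel variance explained by {r} factors: {explained:.2f}")
```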
By: | Zeng-Hua Lu |
Abstract: | Much empirical research in economics and finance involves simultaneously testing multiple hypotheses. This paper proposes extended MinP (EMinP) tests, formed by expanding the minimand set of the MinP test statistic to include the p-value of a global test such as a likelihood ratio test (a toy version is sketched after this entry). We show that, compared with MinP tests, EMinP tests may considerably improve the global power in rejecting the intersection of all individual hypotheses. Compared with closed tests, EMinP tests have a computational advantage by sharing the benefit of the stepdown procedure of MinP tests, and can have better global power than the tests used to construct closed tests. Furthermore, we argue that EMinP tests may be viewed as a tool to prevent data snooping when two competing tests with distinct global powers are exploited. Finally, the proposed tests are applied in an empirical application testing the effects of exercise. |
Date: | 2019–11 |
URL: | http://d.repec.org/n?u=RePEc:arx:papers:1911.04696&r=all |
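A toy version of the EMinP construction in a hypothetical setting of K normal means: the statistic is the minimum over the individual p-values and a global chi-squared (LR-type) p-value, calibrated by Monte Carlo under the joint null. The paper's stepdown refinements are omitted.

```python
import numpy as np
from scipy.stats import norm, chi2

rng = np.random.default_rng(11)
K, R = 5, 5000

def eminp_stat(z):
    p_indiv = 2 * norm.sf(np.abs(z))             # per-hypothesis p-values
    p_global = chi2.sf(np.sum(z**2), df=len(z))  # global chi-squared p-value
    return min(p_indiv.min(), p_global)          # expanded minimand set

# Calibrate: null distribution of the statistic when all K means are zero.
null_draws = np.array([eminp_stat(rng.normal(size=K)) for _ in range(R)])
crit = np.quantile(null_draws, 0.05)             # reject when statistic < crit

z_obs = rng.normal(size=K) + 0.9                 # all means shifted a little
print(f"EMinP statistic = {eminp_stat(z_obs):.4f}, 5% critical value = {crit:.4f}")
```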
By: | Sanz, Carlos; Gonzalo Muñoz, Jesus; Alloza, Mario |
Abstract: | We show that several shocks identified without restrictions from a model, and frequently used in the empirical literature, display some persistence. We demonstrate that the two leading methods to recover impulse responses to shocks (moving average representations and local projections) treat persistence differently, hence identifying different objects. In particular, standard local projections identify responses that include an effect due to the persistence of the shock, while moving average representations implicitly account for it. We propose methods to re-establish the equivalence between local projections and moving average representations. In particular, the inclusion of leads of the shock in local projections allows one to control for its persistence and renders the resulting responses equivalent to those associated with counterfactual non-serially correlated shocks (this fix is sketched after this entry). We apply this method to well-known empirical work on fiscal and monetary policy and find that accounting for persistence has a sizable impact on the estimates of dynamic effects. |
Keywords: | Monetary Policy; Shock; Fiscal Policy; Local Projection; Impulse Response Function |
JEL: | E62 E52 E32 C32 |
Date: | 2019–11–18 |
URL: | http://d.repec.org/n?u=RePEc:cte:werepe:29187&r=all |
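The paper's fix, sketched on a simulated AR(1) shock and an outcome that responds only on impact: augmenting the horizon-h local projection with leads of the shock removes the component of the estimated response that merely reflects the shock's own persistence. Coefficients and sample sizes are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(12)
T, H = 500, 8
e = rng.normal(size=T + H)
s = np.empty(T + H); s[0] = e[0]
for t in range(1, T + H):
    s[t] = 0.5 * s[t - 1] + e[t]         # persistent "identified shock" series
y = 1.0 * s + rng.normal(size=T + H)     # outcome responds only on impact

for h in range(H):
    y_h = y[h:h + T]                     # outcome at t + h
    base = np.column_stack([np.ones(T), s[:T]])
    if h > 0:                            # add leads s_{t+1}, ..., s_{t+h}
        leads = np.column_stack([s[j:j + T] for j in range(1, h + 1)])
        X = np.column_stack([base, leads])
    else:
        X = base
    b_plain = np.linalg.lstsq(base, y_h, rcond=None)[0][1]
    b_leads = np.linalg.lstsq(X, y_h, rcond=None)[0][1]
    # plain LP tracks 0.5**h (the shock's persistence); with leads, ~0 for h >= 1
    print(f"h={h}: LP = {b_plain: .2f}, LP with leads = {b_leads: .2f}")
```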
By: | Adrian, Tobias (International Monetary Fund); Boyarchenko, Nina (Federal Reserve Bank of New York); Giannone, Domenico (Amazon.com, Inc.) |
Abstract: | We estimate the evolution of the conditional joint distribution of economic and financial conditions in the United States, documenting a novel empirical fact: while the joint distribution is approximately Gaussian during normal periods, sharp tightenings of financial conditions lead to the emergence of additional modes, that is, multiple economic equilibria (a minimal mode-counting sketch follows this entry). Although the U.S. economy has historically reverted quickly to a “good” equilibrium after a tightening of financial conditions, we conjecture that poor policy choices under these circumstances could also open a pathway to a “bad” equilibrium for a prolonged period. We argue that such multimodality arises naturally in a macro-financial intermediary model with occasionally binding intermediary constraints. |
Keywords: | density impulse response; multimodality; nonparametric density estimator |
JEL: | C14 E17 E37 G01 |
Date: | 2019–11–01 |
URL: | http://d.repec.org/n?u=RePEc:fip:fednsr:903&r=all |
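A minimal check in the same flavor as the headline fact above: fit a kernel density and count local maxima. The data are a synthetic one-dimensional mixture; the paper's estimator is a conditional, multivariate density, which this sketch does not attempt.

```python
import numpy as np
from scipy.stats import gaussian_kde

def count_modes(sample, gridsize=512):
    grid = np.linspace(sample.min(), sample.max(), gridsize)
    dens = gaussian_kde(sample)(grid)
    peaks = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
    return int(peaks.sum())

rng = np.random.default_rng(13)
calm = rng.normal(0.0, 1.0, size=1000)                    # "normal periods"
stress = np.concatenate([rng.normal(-3.0, 0.7, size=300), # "bad" equilibrium
                         rng.normal(1.0, 0.7, size=700)]) # "good" equilibrium
print("modes (calm):", count_modes(calm))        # typically 1
print("modes (stress):", count_modes(stress))    # typically 2
```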
By: | R Verbelen; K Antonio; Gerda Claeskens; J Crevecoeur |
Date: | 2018 |
URL: | http://d.repec.org/n?u=RePEc:ete:afiper:623951&r=all |