
on Econometrics 
By:  Jerry A. Hausman; Haoyang Liu; Ye Luo; Christopher Palmer 
Abstract:  The popular quantile regression estimator of Koenker and Bassett (1978) is biased if there is an additive error term. Approaching this problem as an errors-in-variables problem where the dependent variable suffers from classical measurement error, we present a sieve maximum-likelihood approach that is robust to left-hand-side measurement error. After providing sufficient conditions for identification, we demonstrate that when the number of knots in the quantile grid is chosen to grow at an adequate speed, the sieve maximum-likelihood estimator is consistent and asymptotically normal, permitting inference via bootstrapping. We verify our theoretical results with Monte Carlo simulations and illustrate our estimator with an application to the returns to education, highlighting changes over time in the returns to education that have previously been masked by measurement-error bias. 
JEL:  C19 C21 C31 I24 J30 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:25819&r=all 
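The mechanism behind this bias can be illustrated without any estimation machinery: classical measurement error on the dependent variable leaves the median roughly intact but spreads out the tail quantiles. A minimal numpy sketch (the normal distributions and variances below are illustrative assumptions, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
u = rng.normal(0.0, 1.0, n)     # latent outcome around its conditional mean
e = rng.normal(0.0, 1.0, n)     # classical measurement error on the LHS
y = u + e                       # observed, error-contaminated outcome

q90_true = np.quantile(u, 0.9)  # upper quantile of the true outcome
q90_obs = np.quantile(y, 0.9)   # inflated by the noise: quantile bias away from the median
print(q90_true, q90_obs)
```

The 0.9 quantile of the contaminated outcome is roughly sqrt(2) times that of the latent one, so a quantile regression fit to `y` would overstate dispersion in the conditional distribution.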
By:  Joel Horowitz; Sokbae Lee 
Abstract:  This paper describes a method for carrying out non-asymptotic inference on partially identified parameters that are solutions to a class of optimization problems. The optimization problems arise in applications in which grouped data are used for estimation of a model's structural parameters. The parameters are characterized by restrictions that involve the population means of observed random variables in addition to the structural parameters of interest. Inference consists of finding confidence intervals for the structural parameters. Our method is non-asymptotic in the sense that it provides a finite-sample bound on the difference between the true and nominal probabilities with which a confidence interval contains the true but unknown value of a parameter. We contrast our method with an alternative non-asymptotic method based on the median-of-means estimator of Minsker (2015). The results of Monte Carlo experiments and an empirical example illustrate the usefulness of our method. 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1905.06491&r=all 
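The flavor of "non-asymptotic" inference can be conveyed with the simplest finite-sample bound: a Hoeffding confidence interval for a bounded mean, whose coverage guarantee holds exactly at every sample size rather than only in the limit. This is a toy stand-in for the idea, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(11)
x = rng.uniform(0, 1, 500)      # bounded data on [0, 1]; true mean is 0.5
alpha = 0.05

# Hoeffding: P(|x_bar - mu| >= t) <= 2 exp(-2 n t^2)
# => half-width t giving exact finite-sample coverage of at least 1 - alpha
t = np.sqrt(np.log(2 / alpha) / (2 * len(x)))
ci = (x.mean() - t, x.mean() + t)
print(ci)
```

The resulting interval is wider than an asymptotic normal interval, which is the usual price for a guarantee that holds at every finite n.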
By:  Takayuki Toda; Ayako Wakano; Takahiro Hoshino 
Abstract:  We propose a new estimation method for heterogeneous causal effects which utilizes a regression discontinuity (RD) design for multiple datasets with different thresholds. The standard RD design is frequently used in applied research, but its result is limited in that the average treatment effect is estimable only at the threshold on the running variable. In applied studies it is often the case that thresholds differ among databases from different regions or firms; for example, scholarship thresholds differ across states. The proposed estimator, based on the augmented inverse probability weighted local linear estimator, can estimate the average effect at an arbitrary point on the running variable between the thresholds under mild conditions, while the method adjusts for differences in the distributions of covariates among datasets. We perform simulations to investigate the performance of the proposed estimator in finite samples. 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1905.04443&r=all 
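At a single threshold, the building block is a local linear fit on each side of the cutoff. A numpy sketch of that building block with a triangular kernel on simulated data (the DGP and bandwidth are illustrative assumptions, and this omits the paper's AIPW reweighting across datasets):

```python
import numpy as np

rng = np.random.default_rng(1)
n, cutoff, tau, h = 20_000, 0.0, 2.0, 0.3
x = rng.uniform(-1, 1, n)                       # running variable
d = (x >= cutoff).astype(float)                 # treatment assigned at the cutoff
y = 1.0 + 0.5 * x + tau * d + rng.normal(0, 0.5, n)

def side_intercept(xs, ys, h):
    """Triangular-kernel weighted local linear fit; returns the intercept at 0."""
    w = np.clip(1 - np.abs(xs) / h, 0, None)    # triangular kernel weights
    X = np.column_stack([np.ones_like(xs), xs])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], ys * sw, rcond=None)
    return beta[0]

left = x < cutoff
tau_hat = (side_intercept(x[~left] - cutoff, y[~left], h)
           - side_intercept(x[left] - cutoff, y[left], h))
print(tau_hat)   # close to the true jump of 2.0
```

The jump in the two boundary intercepts recovers the average effect at the cutoff; the paper's contribution is to move beyond this single point by pooling designs with different cutoffs.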
By:  Hauber, Philipp; Schumacher, Christian; Zhang, Jiachun 
Abstract:  We provide a simulation smoother for a flexible state-space model with lagged states and lagged dependent variables. Qian (2014) introduced this state-space model and proposed a fast Kalman filter with time-varying state dimension in the presence of missing observations in the data. In this paper, we derive the corresponding Kalman smoother moments and propose an efficient simulation smoother, which relies on mean corrections for unconditional vectors. When applied to a factor model, the proposed simulation smoother for the states is efficient in terms of computing time compared to smoothers for state-space models without lagged states and/or lagged dependent variables. 
Keywords:  state-space model, missing observations, Kalman filter and smoother, simulation smoothing, factor model 
JEL:  C11 C32 C38 C63 
Date:  2019 
URL:  http://d.repec.org/n?u=RePEc:zbw:bubdps:152019&r=all 
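The key mechanic when observations are missing — skipping the measurement update and letting the prediction step carry the state forward — can be sketched for the simplest scalar local level model (this is a toy, not Qian's (2014) filter with lagged states and time-varying state dimension):

```python
import numpy as np

rng = np.random.default_rng(2)
T, q, r = 300, 0.1, 1.0                          # state and observation noise variances
alpha = np.cumsum(rng.normal(0, np.sqrt(q), T))  # latent local level (random walk)
y = alpha + rng.normal(0, np.sqrt(r), T)
y[50:80] = np.nan                                # a block of missing observations

a, P = 0.0, 10.0                                 # vague initialisation
filt = np.empty(T)
for t in range(T):
    P = P + q                                    # prediction: random-walk transition
    if not np.isnan(y[t]):                       # update only when y_t is observed
        K = P / (P + r)                          # Kalman gain
        a = a + K * (y[t] - a)
        P = (1 - K) * P
    filt[t] = a                                  # during the gap, a is carried forward

rmse = np.sqrt(np.mean((filt - alpha) ** 2))
print(rmse)
```

The filter's uncertainty `P` grows through the missing block and contracts once data resume; the simulation smoother in the paper builds on the same moments run backwards.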
By:  Masaaki Fukasawa; Tetsuya Takabatake; Rebecca Westphal 
Abstract:  Rough volatility models are continuous time stochastic volatility models where the volatility process is driven by a fractional Brownian motion with the Hurst parameter less than half, and have attracted much attention since a seminal paper titled "Volatility is rough" was posted on SSRN in 2014 claiming that they explain a scaling property of realized variance time series. From our point of view, the analysis is not satisfactory because the estimation error of the latent volatility was not taken into account; we show by simulations that it in fact results in a fake scaling property. Motivated by this preliminary finding, we construct a quasi-likelihood estimator for a fractional stochastic volatility model and apply it to realized variance time series to examine whether the volatility is really rough. Our quasi-likelihood is based on a central limit theorem for the realized volatility estimation error and a Whittle-type approximation to the autocovariance of the log-volatility process. We prove the consistency of our estimator under high frequency asymptotics, and examine by simulations the finite sample performance of our estimator. Our empirical study suggests that the volatility is indeed rough; actually it is even rougher than considered in the literature. 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1905.04852&r=all 
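The scaling analysis at issue, and the "fake roughness" induced by measurement error, can be reproduced in a few lines: regress log mean absolute increments on log lag to estimate H, then repeat after contaminating the path with independent noise (standing in for realized-variance estimation error). The distributions here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
logv = np.cumsum(rng.normal(0, 1, n))   # toy log-volatility path: Brownian motion, true H = 0.5
lags = np.array([1, 2, 4, 8, 16, 32])

def scaling_H(path):
    """Estimate H from the log-log slope of mean absolute increments against the lag."""
    m = [np.mean(np.abs(path[l:] - path[:-l])) for l in lags]
    return np.polyfit(np.log(lags), np.log(m), 1)[0]

H_true = scaling_H(logv)                          # recovers ~0.5 on the clean path
H_noisy = scaling_H(logv + rng.normal(0, 2, n))   # additive noise fakes roughness: H drops
print(H_true, H_noisy)
```

The noise dominates short-lag increments, flattening the short end of the log-log line and dragging the estimated H well below its true value — exactly the artifact the abstract warns about.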
By:  Perron, Pierre; Yamamoto, Yohei 
Abstract:  In empirical applications based on linear regression models, structural changes often occur in both the error variance and regression coefficients possibly at different dates. A commonly applied method is to first test for changes in the coefficients (or in the error variance) and, conditional on the break dates found, test for changes in the variance (or in the coefficients). In this note, we provide evidence that such procedures have poor finite sample properties when the changes in the first step are not correctly accounted for. In doing so, we show that the test for changes in the coefficients (or in the variance) ignoring changes in the variance (or in the coefficients) induces size distortions and loss of power. Our results illustrate a need for a joint approach to test for structural changes in both the coefficients and the variance of the errors. We provide some evidence that the procedures suggested by Perron, Yamamoto and Zhou (2019) provide tests with good size and power. 
Keywords:  structural change, variance shifts, CUSUM of squares tests, hypothesis testing, SupLR test 
JEL:  C12 C38 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:hit:econdp:201901&r=all 
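The tests discussed here include statistics like the CUSUM of squares. A minimal version of that statistic (scaled as in Brown, Durbin, and Evans (1975), assuming Gaussian errors; the DGP is an illustrative assumption) shows how a variance break pushes it far outside its null range:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 1000

def cusum_sq(e):
    """Scaled max deviation of the CUSUM-of-squares path from its null 45-degree line."""
    s = np.cumsum(e ** 2) / np.sum(e ** 2)
    t = np.arange(1, len(e) + 1) / len(e)
    return np.sqrt(len(e) / 2) * np.max(np.abs(s - t))

stable = rng.normal(0, 1, T)                                   # no break: statistic stays small
broken = np.concatenate([rng.normal(0, 1, T // 2),
                         rng.normal(0, 2, T // 2)])            # variance doubles mid-sample
print(cusum_sq(stable), cusum_sq(broken))
```

The note's point is about the converse situation: applying such a variance test after an unaccounted coefficient break (or vice versa) distorts its size and power, motivating the joint procedures of Perron, Yamamoto and Zhou (2019).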
By:  Perron, Pierre; Yamamoto, Yohei; Zhou, Jing 
Abstract:  We provide a comprehensive treatment of the problem of testing jointly for structural changes in both the regression coefficients and the variance of the errors in a single equation system involving stationary regressors. Our framework is quite general in that we allow for general mixing-type regressors, and the assumptions on the errors are quite mild. Their distribution can be non-Normal and conditional heteroskedasticity is permitted. Extensions to the case with serially correlated errors are also treated. We provide the required tools to address the following testing problems, among others: a) testing for given numbers of changes in the regression coefficients and the variance of the errors; b) testing for some unknown number of changes within some pre-specified maximum; c) testing for changes in the variance (regression coefficients) allowing for a given number of changes in the regression coefficients (variance); d) a sequential procedure to estimate the number of changes present. These testing problems are important for practical applications, as witnessed by interest in macroeconomics and finance, where documenting structural changes in the variability of shocks to simple autoregressions or Vector Autoregressive Models has been a concern. 
Keywords:  Changepoint, Variance shift, Conditional heteroskedasticity, Likelihood ratio tests 
JEL:  C22 
Date:  2019–04 
URL:  http://d.repec.org/n?u=RePEc:hit:hiasdp:hiase85&r=all 
By:  Fischer, Christoph 
Abstract:  Equilibrium real exchange rate and corresponding misalignment estimates differ tremendously depending on the panel estimation method used to derive them. Essentially, these methods differ in their treatment of the time-series (time) and the cross-section (space) variation in the panel. The study shows that conventional panel estimation methods (pooled OLS, fixed, random, and between effects) can be interpreted as restricted versions of a correlated random effects (CRE) model. It formally derives the distortion that arises if these restrictions are violated and uses two empirical applications from the literature to show that the distortion is generally very large. This suggests the use of the CRE model for the panel estimation of equilibrium real exchange rates and misalignments. 
Keywords:  equilibrium real exchange rate, panel estimation method, correlated random effects model, productivity approach, BEER, price competitiveness 
JEL:  F31 C23 
Date:  2019 
URL:  http://d.repec.org/n?u=RePEc:zbw:bubdps:142019&r=all 
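In its simplest (Mundlak) form, the CRE model amounts to adding unit-level means of the regressors, which removes the pooled-OLS distortion when unit effects are correlated with the regressors. A numpy sketch under an assumed DGP (the coefficient values and correlation structure are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(5)
G, T, beta = 200, 10, 1.0
g = np.repeat(np.arange(G), T)             # unit index for each observation
c = rng.normal(0, 1, G)                    # unit effect
x = 0.8 * c[g] + rng.normal(0, 1, G * T)   # regressor correlated with the unit effect
y = beta * x + c[g] + rng.normal(0, 1, G * T)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

xbar = np.bincount(g, weights=x) / T       # unit-level means of x (Mundlak device)
b_pooled = ols(np.column_stack([np.ones_like(x), x]), y)[1]          # distorted upward
b_cre = ols(np.column_stack([np.ones_like(x), x, xbar[g]]), y)[1]    # recovers beta
print(b_pooled, b_cre)
```

Pooled OLS absorbs the correlation between `x` and the unit effect into its slope; the CRE/Mundlak regression separates the within and between variation, which is the distortion the study quantifies for exchange-rate panels.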
By:  Régis Barnichon; Geert Mesters 
Abstract:  Despite decades of research, the consistent estimation of structural forward-looking macroeconomic equations remains a formidable empirical challenge because of pervasive endogeneity issues. Prominent cases, such as the estimation of Phillips curves, of Euler equations for consumption or output, or of monetary policy rules, have typically relied on using predetermined variables as instruments, with mixed success. In this work, we propose a new approach that consists in using sequences of independently identified structural shocks as instrumental variables. Our approach is robust to weak instruments and is valid regardless of the shocks' variance contribution. We estimate a Phillips curve using monetary shocks as instruments and find that conventional methods (i) substantially underestimate the slope of the Phillips curve and (ii) overestimate the role of forward-looking inflation expectations. 
Keywords:  Structural equations, instrumental variables, impulse responses, robust inference. 
JEL:  C14 C32 E32 E52 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:upf:upfgen:1659&r=all 
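The core idea — instrumenting with an independently identified structural shock rather than predetermined variables — reduces, in the simplest static case, to textbook IV. A toy numpy sketch (the linear DGP and noise levels are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
n, beta = 50_000, 0.5
z = rng.normal(0, 1, n)                 # independently identified structural shock (instrument)
u = rng.normal(0, 1, n)                 # structural error
x = z + 0.8 * u + rng.normal(0, 1, n)   # endogenous regressor, correlated with u
y = beta * x + u

b_ols = (x @ y) / (x @ x)               # biased: picks up cov(x, u)
b_iv = (z @ y) / (z @ x)                # simple IV with the shock as instrument
print(b_ols, b_iv)
```

Because the shock is orthogonal to the structural error by construction, the IV ratio recovers the true slope while OLS does not — the same logic the paper extends to sequences of shocks with weak-instrument-robust inference.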
By:  Muhummad Amjad; Vishal Misra; Devavrat Shah; Dennis Shen 
Abstract:  When evaluating the impact of a policy on a metric of interest, it may not be possible to conduct a randomized control trial. In settings where only observational data is available, Synthetic Control (SC) methods provide a popular data-driven approach to estimate a "synthetic" control by combining measurements of "similar" units (donors). Recently, Robust SC (RSC) was proposed as a generalization of SC to overcome the challenges of missing data and high levels of noise, while removing the reliance on domain knowledge for selecting donors. However, SC, RSC, and their variants suffer from poor estimation when the pre-intervention period is too short. As the main contribution, we propose a generalization of unidimensional RSC to multidimensional RSC, mRSC. Our proposed mechanism incorporates multiple metrics to estimate a synthetic control, thus overcoming the challenge of poor inference from limited pre-intervention data. We show that the mRSC algorithm with $K$ metrics leads to a consistent estimator of the synthetic control for the target unit under any metric. Our finite-sample analysis suggests that the prediction error decays to zero at a rate faster than the RSC algorithm by a factor of $K$ and $\sqrt{K}$ for the training and testing periods (pre- and post-intervention), respectively. Additionally, we provide a diagnostic test that evaluates the utility of including additional metrics. Moreover, we introduce a mechanism to validate the performance of mRSC: time series prediction. That is, we propose a method to predict the future evolution of a time series based on limited data when the notion of time is relative and not absolute, i.e., we have access to a donor pool that has undergone the desired future evolution. Finally, we conduct experiments to establish the efficacy of mRSC on synthetic data and two real-world case studies (retail and Cricket). 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1905.06400&r=all 
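The single-metric building block works as follows: learn donor weights on the pre-intervention window, then extrapolate the weighted donor combination into the post period as the counterfactual; mRSC stacks $K$ such metric blocks. A least-squares sketch of one block on simulated donors (this omits RSC's denoising step, and the DGP is an illustrative assumption):

```python
import numpy as np

rng = np.random.default_rng(7)
T0, T1, n_donors = 40, 20, 8
w_true = np.array([0.5, 0.3, 0.2] + [0.0] * 5)

donors = rng.normal(0, 1, (T0 + T1, n_donors)).cumsum(axis=0)  # donor trajectories
target = donors @ w_true + rng.normal(0, 0.1, T0 + T1)         # target: noisy donor combination

# fit weights on the pre-intervention window only
w_hat, *_ = np.linalg.lstsq(donors[:T0], target[:T0], rcond=None)
synth = donors[T0:] @ w_hat                                    # post-period counterfactual
gap = target[T0:] - synth                                      # estimated effect path (~0 here)
print(np.abs(gap).max())
```

With no treatment applied, the post-period gap should hover near zero; a short `T0` makes `w_hat` noisy and the extrapolation unreliable, which is the failure mode mRSC addresses by pooling metrics.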
By:  Ferman, Bruno 
Abstract:  We analyze the conditions in which ignoring spatial correlation is problematic for inference in differences-in-differences (DID) models. Assuming that the spatial correlation structure follows a linear factor model, we show that inference ignoring such correlation remains reliable when either (i) the second moment of the difference between the pre- and post-treatment averages of the common factors is low, or (ii) the distribution of factor loadings has the same expected values for treated and control groups and does not exhibit significant spatial correlation. We present simulation results with real datasets that corroborate these conclusions. Our results provide important guidelines on how to minimize inference problems due to spatial correlation in DID applications. 
Keywords:  inference, differences-in-differences, spatial correlation 
JEL:  C12 C21 C23 C33 
Date:  2019–05–08 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93746&r=all 
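Condition (ii) can be seen directly in a simulated factor DGP: when loadings have the same mean in the treated and control groups, the DID point estimate stays centered on the true effect, but the factor term adds variation that naive standard errors ignore. The DGP below is an illustrative assumption, not the paper's simulation design:

```python
import numpy as np

rng = np.random.default_rng(10)
n_groups, T, effect = 40, 10, 1.0
treated = np.arange(n_groups) < n_groups // 2
post = np.arange(T) >= T // 2

def one_did():
    f = rng.normal(0, 1, T)              # common factors over time
    lam = rng.normal(0, 1, n_groups)     # loadings: identical mean across groups
    y = (lam[:, None] * f[None, :]       # factor structure (the "spatial" correlation)
         + effect * (treated[:, None] & post[None, :])
         + rng.normal(0, 1, (n_groups, T)))
    return ((y[treated][:, post].mean() - y[treated][:, ~post].mean())
            - (y[~treated][:, post].mean() - y[~treated][:, ~post].mean()))

estimates = np.array([one_did() for _ in range(200)])
print(estimates.mean(), estimates.std())
```

The mean of the estimates sits at the true effect, while the dispersion exceeds what the idiosyncratic noise alone would generate — the component that inference ignoring the factor structure misses.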
By:  Naimoli, Antonio; Storti, Giuseppe 
Abstract:  We propose a novel approach to modelling and forecasting high frequency trading volumes. The new model extends the Component Multiplicative Error Model of Brownlees et al. (2011) by introducing a more flexible specification of the long-run component. This uses an additive cascade of MIDAS polynomial filters, moving at different frequencies, in order to reproduce the changing long-run level and the persistent autocorrelation structure of high frequency trading volumes. After investigating its statistical properties, the merits of the proposed approach are illustrated by means of an application to six stocks traded on the XETRA market in the German Stock Exchange. 
Keywords:  Intradaily trading volume, dynamic component models, long-range dependence, forecasting. 
JEL:  C22 C53 C58 
Date:  2019–05–09 
URL:  http://d.repec.org/n?u=RePEc:pra:mprapa:93802&r=all 
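The long-run component is built from MIDAS lag polynomials, which compress many high-frequency lags into a few parameters. A sketch of one such filter applied to a toy volume series — the Beta weighting scheme below is a standard MIDAS choice, not necessarily the paper's exact specification:

```python
import numpy as np

def beta_midas_weights(K, theta1=1.0, theta2=5.0):
    """Normalised Beta-polynomial MIDAS weights over K high-frequency lags."""
    k = np.arange(1, K + 1) / (K + 1)
    w = k ** (theta1 - 1) * (1 - k) ** (theta2 - 1)
    return w / w.sum()

w = beta_midas_weights(22)                              # e.g. ~one month of daily lags
x = np.abs(np.random.default_rng(8).normal(size=500))   # toy high-frequency volume series
tau = np.convolve(x, w, mode="valid")                   # slow-moving long-run component
print(w[:3].round(3))
```

With `theta1 = 1` the weights decline monotonically in the lag, so `tau` is a smooth weighted history of recent volumes; the paper's cascade sums several such filters moving at different frequencies.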
By:  Pietro Tebaldi; Alexander Torgovitsky; Hanbin Yang 
Abstract:  We estimate the demand for health insurance in the California Affordable Care Act marketplace (Covered California) without using parametric assumptions about the unobserved components of utility. To do this, we develop a computational method for constructing sharp identified sets in a nonparametric discrete choice model. The model allows for endogeneity in prices (premiums) and for the use of instrumental variables to address this endogeneity. We use the method to estimate bounds on the effects of changing premium subsidies on coverage choices, consumer surplus, and government spending. We find that a $10 decrease in monthly premium subsidies would cause between a 1.6% and 7.0% decline in the proportion of low-income adults with coverage. The reduction in total annual consumer surplus would be between $63 and $78 million, while the savings in yearly subsidy outlays would be between $238 and $604 million. Comparable logit models yield price sensitivity estimates towards the lower end of the bounds. 
JEL:  C14 C3 C5 I13 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:25827&r=all 
By:  Takashi Yamashita; Ryozo Miura 
Abstract:  This paper introduces a new correction scheme for a conventional regression-based event study method: a topological machine-learning approach with a self-organizing map (SOM). We use this new scheme to analyze a major market event in Japan and find that the factors of abnormal stock returns can be easily identified and the event cluster can be depicted. We also find that a conventional event study method involves an empirical analysis mechanism that tends to produce bias, typically in an event-clustered market situation. We explain our new correction scheme and apply it to an event in the Japanese market: the holding disclosure of the Government Pension Investment Fund (GPIF) on July 31, 2015. 
Date:  2019–05 
URL:  http://d.repec.org/n?u=RePEc:arx:papers:1905.06536&r=all 
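At the heart of the scheme is a standard SOM update: find the best-matching unit for each observation and pull it and its grid neighbours toward the data point. A minimal 1-D SOM on toy two-cluster data (grid size, decay schedules, and the data are all illustrative assumptions, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(9)
data = np.concatenate([rng.normal(-3, 0.3, (100, 2)),
                       rng.normal(3, 0.3, (100, 2))], axis=0)   # two tight clusters
rng.shuffle(data)

n_units = 5
codebook = rng.normal(0, 1, (n_units, 2))                # codebook vectors on a 1-D grid
for epoch in range(20):
    lr = 0.5 * (1 - epoch / 20)                          # decaying learning rate
    sigma = max(1.0 * (1 - epoch / 20), 0.3)             # decaying neighbourhood width
    for v in data:
        bmu = np.argmin(np.linalg.norm(codebook - v, axis=1))   # best-matching unit
        dist = np.abs(np.arange(n_units) - bmu)                 # grid distance to the BMU
        h = np.exp(-(dist ** 2) / (2 * sigma ** 2))             # neighbourhood function
        codebook += lr * h[:, None] * (v - codebook)            # pull BMU and neighbours

print(codebook.round(2))
```

After training, the codebook spreads along the grid so that units at the two ends sit on the two clusters — the topological ordering the paper exploits to depict event clusters in abnormal returns.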