
on Econometrics 
By:  Lorenzo Fattorini; Caterina Pisani; Francesco Riga; Marco Zaccaroni 
Abstract:  The analysis of habitat use in radio-tagged animals is approached by comparing the proportions of use versus the proportions of availability observed for each habitat type. Since the data are linearly dependent with singular variance-covariance matrices, standard multivariate statistical tests cannot be applied. To overcome the problem, compositional data analysis is customarily performed via a log-ratio transform of the sample observations. This procedure is criticized in the paper, which emphasizes the many drawbacks that may arise from the use of compositional analysis. An alternative nonparametric solution is proposed in the framework of multiple testing. Habitat use is assessed separately for each habitat type by means of the sign test performed on the original observations. The resulting p-values are combined in an overall test statistic whose significance is determined by permuting the sample observations. The theoretical findings of the paper are checked by simulation studies. Applications to some case studies are considered. 
Keywords:  compositional data analysis, Johnson’s second-order selection, Johnson’s third-order selection, Monte Carlo studies, multiple testing, random habitat use. 
JEL:  C12 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:usi:wpaper:622&r=ecm 
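The multiple-testing scheme the abstract describes can be sketched numerically: a sign test per habitat on use-minus-availability differences, the p-values combined, and significance obtained by permutation. The simulated data, Fisher's combining statistic, and the sign-flipping permutation scheme below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_animals, n_habitats = 25, 4
# Hypothetical use-minus-availability differences, one row per animal.
diffs = rng.normal(0.05, 0.2, size=(n_animals, n_habitats))

def combined_stat(d):
    # Sign test p-value for each habitat, then Fisher's combination -2*sum(log p).
    n = d.shape[0]
    pos = (d > 0).sum(axis=0)
    pvals = np.array([stats.binomtest(int(k), n, 0.5).pvalue for k in pos])
    return -2.0 * np.log(pvals).sum()

observed = combined_stat(diffs)
perm = np.empty(999)
for b in range(999):
    flips = rng.choice([-1.0, 1.0], size=(n_animals, 1))  # random sign flips per animal
    perm[b] = combined_stat(diffs * flips)
p_value = (1 + (perm >= observed).sum()) / (1 + len(perm))
print(f"combined statistic = {observed:.2f}, permutation p = {p_value:.3f}")
```

Flipping signs within animals preserves the dependence across habitat types, which is the point of calibrating the combined statistic by permutation rather than by a chi-squared reference.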
By:  Frank A. Cowell (STICERD - London School of Economics); Russell Davidson (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales (EHESS) - CNRS : UMR 6579); Emmanuel Flachaire (GREQAM - Groupement de Recherche en Économie Quantitative d'Aix-Marseille - Université de la Méditerranée - Aix-Marseille II - Université Paul Cézanne - Aix-Marseille III - École des Hautes Études en Sciences Sociales (EHESS) - CNRS : UMR 6579) 
Abstract:  An axiomatic approach is used to develop a one-parameter family of measures of divergence between distributions. These measures can be used to perform goodness-of-fit tests with good statistical properties. Asymptotic theory shows that the test statistics have well-defined limiting distributions which are, however, analytically intractable. A parametric bootstrap procedure is proposed for implementation of the tests. The procedure is shown to work very well in a set of simulation experiments, and to compare favourably with other commonly used goodness-of-fit tests. By varying the parameter of the statistic, one can obtain information on how the distribution that generated a sample diverges from the target family of distributions when the true distribution does not belong to that family. An empirical application analyses a UK income data set. 
Keywords:  Goodness of fit; axiomatic approach; measures of divergence; parametric bootstrap 
Date:  2011–11–08 
URL:  http://d.repec.org/n?u=RePEc:hal:wpaper:halshs00639075&r=ecm 
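The parametric bootstrap logic is generic and easy to sketch: fit the target family, compute a divergence statistic, then recompute it on samples simulated from the fitted model, refitting each time. The lognormal target (a common income-distribution choice) and the Kolmogorov-Smirnov statistic below are stand-ins for the paper's one-parameter divergence family, not the authors' measures.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.lognormal(mean=3.0, sigma=0.5, size=300)  # stand-in for income data

def fit_and_stat(sample):
    # Fit the target family (lognormal, location fixed at 0) and measure
    # divergence from it; KS stands in for the paper's divergence measures.
    shape, loc, scale = stats.lognorm.fit(sample, floc=0)
    stat = stats.kstest(sample, "lognorm", args=(shape, 0, scale)).statistic
    return (shape, scale), stat

params, t_obs = fit_and_stat(x)
boot = []
for _ in range(199):
    sim = stats.lognorm.rvs(params[0], scale=params[1], size=len(x), random_state=rng)
    boot.append(fit_and_stat(sim)[1])  # refit on each bootstrap sample
p_value = (1 + sum(t >= t_obs for t in boot)) / (1 + len(boot))
print(f"statistic = {t_obs:.4f}, bootstrap p = {p_value:.3f}")
```

Refitting the parameters inside each bootstrap replication is what accounts for estimation effects in the null distribution of the statistic.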
By:  Xibin Zhang; Maxwell L. King 
Abstract:  This paper investigates a Bayesian sampling approach to parameter estimation in the semiparametric GARCH model with an unknown conditional error density, which we approximate by a mixture of Gaussian densities centered at the individual errors and scaled by a common standard deviation. This mixture density has the form of a kernel density estimator of the errors, with its bandwidth being the standard deviation. The investigation is motivated by the lack of robustness of GARCH models under any parametric assumption on the error density for the purpose of error-density-based inference such as value-at-risk (VaR) estimation. The contribution of the paper is to construct the likelihood and posterior of the model and bandwidth parameters under the proposed mixture error density, and to forecast the one-step out-of-sample density of asset returns. The resulting VaR measure would therefore be distribution-free. Applying the semiparametric GARCH(1,1) model to daily stock-index returns in eight stock markets, we find that this semiparametric GARCH model is favoured over the GARCH(1,1) model with Student-t errors for five indices, and that the GARCH model underestimates VaR compared to its semiparametric counterpart. We also investigate the use and benefit of localized bandwidths in the proposed mixture density of the errors. 
Keywords:  Bayes factors, kernel-form error density, localized bandwidths, Markov chain Monte Carlo, value-at-risk 
JEL:  C11 C14 C15 G15 
Date:  2011–11–03 
URL:  http://d.repec.org/n?u=RePEc:msh:ebswps:201124&r=ecm 
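The kernel-form error density at the core of this model is simple to write down: each error's density is a mixture of Gaussians centered at the other errors, with the common bandwidth acting as the mixture standard deviation. The sketch below evaluates that likelihood with a leave-one-out convention (my assumption, to avoid the degenerate bandwidth-to-zero solution); it does not implement the paper's MCMC sampler or GARCH filtering.

```python
import numpy as np

def kernel_error_loglik(eps, h):
    # Leave-one-out log-likelihood of errors under the kernel-form density:
    # f(eps_t) = (1/(n-1)) * sum_{j != t} phi((eps_t - eps_j)/h) / h
    n = len(eps)
    z = (eps[:, None] - eps[None, :]) / h          # pairwise standardized gaps
    k = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)   # Gaussian kernel
    np.fill_diagonal(k, 0.0)                       # leave own error out
    dens = k.sum(axis=1) / ((n - 1) * h)
    return np.log(dens).sum()

rng = np.random.default_rng(2)
eps = rng.standard_t(df=5, size=200)               # hypothetical GARCH errors
for h in (0.2, 0.5, 1.0):
    print(f"h = {h}: log-likelihood = {kernel_error_loglik(eps, h):.2f}")
```

In the paper the bandwidth is treated as a parameter with its own posterior; here it is just profiled over a small grid to show how the likelihood responds to it.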
By:  Catherine Dehon; Marjorie Gassner; Vincenzo Verardi 
Abstract:  In this paper, we follow the same logic as in Hausman (1978) to create a testing procedure that checks for the presence of outliers by comparing a regression estimator that is robust to outliers (an S-estimator) with another that is more efficient but affected by them. Some simulations are presented to illustrate the good behavior of the test with respect to both its size and its power. 
Keywords:  S-estimators; MM-estimators; Outliers; Linear regression; Generalized Method of Moments; Robustness 
JEL:  C12 C21 H11 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/102578&r=ecm 
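The Hausman-style idea is to contrast an efficient estimator with a robust one: under no contamination the two should agree, while outliers pull them apart. The sketch below contrasts OLS with a Huber M-estimator fitted by iteratively reweighted least squares; the Huber estimator is a stand-in for the S-estimator the paper uses, and the contaminated data are simulated for illustration.

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=50):
    # Huber M-estimator via iteratively reweighted least squares
    # (a stand-in for the paper's S-estimator).
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12          # MAD scale
        w = np.clip(c / (np.abs(r / s) + 1e-12), None, 1.0)  # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
y[:10] += 15.0                                             # plant a few outliers

b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
b_rob = huber_irls(X, y)
print("OLS:", b_ols, "robust:", b_rob)
```

The test statistic itself would be a quadratic form in `b_ols - b_rob`, weighted by the difference of the two estimators' covariance matrices, exactly as in Hausman's specification test.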
By:  Rothe, Christoph (Toulouse School of Economics) 
Abstract:  In this paper, we propose a method to evaluate the effect of a counterfactual change in the unconditional distribution of a single covariate on the unconditional distribution of an outcome variable of interest. Both fixed and infinitesimal changes are considered. We show that such effects are point identified under general conditions if the covariate affected by the counterfactual change is continuously distributed, but are typically only partially identified if its distribution is discrete. For the latter case, we derive informative bounds making use of the available information. We also discuss estimation and inference. 
Keywords:  counterfactual distribution, partial identification, nonseparable model 
JEL:  C14 C31 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6076&r=ecm 
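For a continuously distributed covariate, the counterfactual outcome distribution can be sketched directly from the identification idea: average the conditional CDF of the outcome over the shifted covariate distribution. Everything below (the data-generating process, the Gaussian-kernel conditional CDF estimator, the fixed location shift) is an illustrative assumption, not the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
x = rng.normal(size=n)
y = 1.0 + 0.8 * x + rng.normal(scale=0.5, size=n)

def cond_cdf(y0, x0, h=0.25):
    # Nadaraya-Watson estimate of the conditional CDF P(Y <= y0 | X = x0).
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * (y <= y0)) / np.sum(w)

def counterfactual_cdf(y0, shift):
    # Average the conditional CDF over the counterfactual (location-shifted)
    # covariate distribution; a subsample keeps the loop cheap.
    return float(np.mean([cond_cdf(y0, xi + shift) for xi in x[:300]]))

f_actual = counterfactual_cdf(1.0, 0.0)
f_shifted = counterfactual_cdf(1.0, 0.5)
print(f"F(1.0): actual {f_actual:.3f}, after shifting X by +0.5 {f_shifted:.3f}")
```

With a discrete covariate the conditional CDF is only identified at the support points, which is where the paper's partial-identification bounds come in.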
By:  Mammen, Enno (University of Mannheim); Rothe, Christoph (Toulouse School of Economics); Schienle, Melanie (Humboldt University, Berlin) 
Abstract:  In this paper, we study a general class of semiparametric optimization estimators of a vector-valued parameter. The criterion function depends on two types of infinite-dimensional nuisance parameters: a conditional expectation function that has been estimated nonparametrically using generated covariates, and another estimated function that is used to compute the generated covariates in the first place. We study the asymptotic properties of estimators in this class, which is a nonstandard problem due to the presence of generated covariates. We give conditions under which estimators are root-n consistent and asymptotically normal, and derive a general formula for the asymptotic variance. 
Keywords:  semiparametric estimation, generated covariates, profiling, propensity score 
JEL:  C14 C31 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6084&r=ecm 
By:  Picchio, Matteo (Tilburg University) 
Abstract:  We study the nonparametric identification of a mixed proportional hazard model with lagged duration dependence when data provide multiple outcomes per individual or stratum. We show that the information conveyed by the within strata variation can be exploited to nonparametrically identify lagged duration dependence in more general models than in the literature. 
Keywords:  lagged duration dependence, mixed proportional hazard models, identification, multiple spells, parallel data 
JEL:  C14 C41 
Date:  2011–10 
URL:  http://d.repec.org/n?u=RePEc:iza:izadps:dp6089&r=ecm 
By:  John Mullahy 
Abstract:  Estimation of marginal or partial effects of covariates x on various conditional parameters or functionals is often the main target of applied microeconometric analysis. In the specific context of probit models, such estimation is straightforward in univariate models, and Greene (1996, 1998) has extended these results to cover the case of quadrant probability marginal effects in bivariate probit models. The purpose of this paper is to extend these results to the general multivariate probit context for arbitrary orthant probabilities and to demonstrate the applicability of such extensions in contexts of interest in health economics applications. The baseline results are extended to models that condition on subvectors of y, to count data structures that derive from the probability structure of y, to multivariate ordered probit data structures, and to multinomial probit models, whose marginal effects turn out to be a special case of those of the multivariate probit model. Simulations reveal that analytical formulae, compared with fully numerical derivatives, reduce computational time and increase accuracy. 
JEL:  C35 I1 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:17588&r=ecm 
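In the bivariate case the quadrant probability is P(y1=1, y2=1 | x) = Φ₂(x'β₁, x'β₂; ρ), and Greene's analytic marginal effect follows from differentiating the bivariate normal CDF. The sketch below checks that formula against a central finite difference; the coefficient values are arbitrary illustrations, and the general multivariate orthant case in the paper is not implemented.

```python
import numpy as np
from scipy import stats

b1, b2, rho = np.array([0.5, 1.0]), np.array([-0.2, 0.7]), 0.4
x = np.array([1.0, 0.3])                       # regressors, incl. a constant
cov = np.array([[1.0, rho], [rho, 1.0]])

def quadrant_prob(xv):
    # P(y1 = 1, y2 = 1 | x) = Phi_2(x'b1, x'b2; rho)
    return stats.multivariate_normal(np.zeros(2), cov).cdf([xv @ b1, xv @ b2])

def analytic_me(xv, k):
    # d Phi_2(a1, a2; rho) / d x_k via the chain rule:
    # phi(a1) * Phi((a2 - rho*a1)/s) * b1_k + phi(a2) * Phi((a1 - rho*a2)/s) * b2_k
    a1, a2 = xv @ b1, xv @ b2
    s = np.sqrt(1.0 - rho**2)
    return (stats.norm.pdf(a1) * stats.norm.cdf((a2 - rho * a1) / s) * b1[k]
            + stats.norm.pdf(a2) * stats.norm.cdf((a1 - rho * a2) / s) * b2[k])

eps = 1e-3
xp, xm = x.copy(), x.copy()
xp[1] += eps
xm[1] -= eps
numeric = (quadrant_prob(xp) - quadrant_prob(xm)) / (2 * eps)
print("analytic:", analytic_me(x, 1), "numeric:", numeric)
```

The small residual gap between the two values comes from the numerical integration tolerance inside the bivariate normal CDF, which is one of the accuracy arguments for the analytic route.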
By:  Christophe Ley; Mitia Duerinckx 
Abstract:  A classical characterization result, which can be traced back to Gauss, states that the maximum likelihood estimator (MLE) of the location parameter equals the sample mean for all univariate samples of all sizes n if and only if the samples are drawn from a Gaussian population. A similar result, in the two-dimensional case, is given in von Mises (1929) for the Fisher-von Mises-Langevin (FVML) distribution, the equivalent of the Gaussian law on the unit circle. Half a century later, Bingham and Mardia (1975) extended the result to FVML distributions on the unit sphere S<sup>k-1</sup> := {v ∈ R<sup>k</sup> : v'v = 1}, k ≥ 2. In this paper, we present a general MLE characterization theorem for a large subclass of rotationally symmetric distributions on S<sup>k-1</sup>, k ≥ 2, including the FVML distribution. 
Keywords:  Cauchy's functional equation; characterization theorem; Fisher-von Mises-Langevin distribution; Maximum likelihood estimator; Rotationally symmetric distributions on the sphere 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/102302&r=ecm 
By:  Yacine Aït-Sahalia; Jianqing Fan; Yingying Li 
Abstract:  The leverage effect refers to the generally negative correlation between an asset's return and changes in its volatility. A natural estimate consists in using the empirical correlation between daily returns and the changes in daily volatility estimated from high-frequency data. The puzzle lies in the fact that such an intuitively natural estimate yields nearly zero correlation for most assets tested, despite the many economic reasons for expecting the estimated correlation to be negative. To better understand the sources of the puzzle, we analyze the different asymptotic biases involved in high-frequency estimation of the leverage effect, including biases due to discretization errors, to smoothing errors in estimating spot volatilities, to estimation error, and to market microstructure noise. This decomposition enables us to propose novel bias correction methods for estimating the leverage effect. 
JEL:  C22 G12 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberwo:17592&r=ecm 
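The "natural" estimator the abstract describes is easy to write down: build daily realized volatility from intraday returns, then correlate daily returns with its changes. The simulated prices below have no leverage built in, so this sketch only illustrates the estimator's construction; it does not reproduce the paper's bias decomposition or corrections.

```python
import numpy as np

rng = np.random.default_rng(5)
n_days, n_intraday = 250, 78                               # e.g. 5-minute bars
vol = 0.01 * np.exp(rng.normal(0, 0.3, size=n_days))       # hypothetical daily spot vol
shocks = rng.normal(size=(n_days, n_intraday))
intraday = vol[:, None] * shocks / np.sqrt(n_intraday)     # intraday log returns

daily_ret = intraday.sum(axis=1)
rv = np.sqrt((intraday**2).sum(axis=1))                    # realized volatility
# Naive leverage-effect estimate: corr(return_t, vol change from t to t+1).
naive = np.corrcoef(daily_ret[:-1], np.diff(rv))[0, 1]
print(f"naive leverage-effect estimate: {naive:.3f}")
```

On real data this statistic tends toward zero even when true leverage is negative, which is exactly the puzzle the paper traces to discretization, smoothing, estimation, and microstructure biases.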
By:  Blasques, Francisco Albergaria Amaral (Maastricht University) 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:ner:maastr:urn:nbn:nl:ui:2727549&r=ecm 
By:  H. Evans; A. Basu 
Abstract:  We highlight the role of local instrumental variable (LIV) methods in exploring treatment effect heterogeneity, using an empirical example that evaluates the effect of use versus non-use of prehospital intubation (PHI) on inpatient mortality in patients with traumatic injury. We find evidence that the effect of PHI on inpatient mortality varies over levels of unobserved confounders, giving rise to a phenomenon known as essential heterogeneity. Under essential heterogeneity, the traditional instrumental variable (IV) method, when using a continuous IV, estimates an effect that is an arbitrary weighted average of the causal effects for the marginal groups of patients whose PHI receipt is directly influenced by the IV levels. Instead, LIV methods estimate the distribution of treatment effects for every margin identified by the data and allow for predictable aggregation to recover estimates of meaningful treatment effect parameters such as the Average Treatment Effect (ATE) and the Effect on the Treated (TT). LIV methods also allow exploring heterogeneity in treatment effects over levels of observed confounders. In the PHI analysis, we estimate an ATE of 0.074. We find strong evidence of positive self-selection in practice based on observed and unobserved characteristics, whereby patients who were most likely to be harmed by PHI were also less likely to receive PHI. However, the degree of positive self-selection lessens in regions with higher rates of PHI use. We also explore factors associated with the prediction of significant harm by PHI. We provide clinical interpretation of the results and discuss the importance of these methods in the context of comparative effectiveness research. 
Keywords:  Instrumental variables; local IV methods; heterogeneity; prehospital intubation; mortality 
Date:  2011–08 
URL:  http://d.repec.org/n?u=RePEc:yor:hectdg:11/26&r=ecm 
By:  George Tauchen 
Abstract:  In this paper we present parametric estimation of models for stock returns, describing the price dynamics as the sum of two independent Lévy components. The increments (moves) are viewed as discrete-time log-price changes that follow an infinitely divisible distribution, i.e. stationary and independent price changes (zero drift) that follow a Lévy-type distribution. We explore the empirical plausibility of two parametric models: a jump-diffusion model (CJ) and a pure-jump model (TSJ). The first component describes the dynamics of small frequent moves and is modeled by a Brownian motion in the CJ model and by a tempered stable Lévy process in the TSJ model. The second component is responsible for big rare moves in asset prices and is modeled by a compound Poisson process in both models. The estimation is performed via continuously updated GMM by matching the characteristic function implied by the model with the observed characteristic function. Using high-frequency data on 13 stocks of different market capitalization over the 2006–2008 sample period, we find that the CJ model performs well only for large-cap stocks, while the dynamics of medium-cap stocks are captured by the TSJ model. We also report evidence of a positive relation between the activity index of the process for stock returns and its frequency of trading. 
JEL:  C51 C52 G12 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:duk:dukeec:1122&r=ecm 
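Characteristic-function matching can be sketched in a few lines for the CJ-type case: simulate increments from Brownian motion plus a compound Poisson process with centered normal jumps, then fit the parameters by matching the model characteristic function to the empirical one on a frequency grid. The simple least-squares objective below stands in for continuously updated GMM, and the tempered stable (TSJ) component is not implemented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
n, dt = 5000, 1.0
sigma0, lam0, s0 = 0.01, 0.1, 0.03                 # hypothetical true parameters
jumps = rng.poisson(lam0 * dt, n)                  # jump counts per interval
r = (rng.normal(0.0, sigma0 * np.sqrt(dt), n)      # Brownian component
     + rng.normal(0.0, np.sqrt(jumps) * s0))       # compound-Poisson component

u = np.linspace(1.0, 60.0, 25)                     # frequency grid
ecf = np.array([np.mean(np.cos(ui * r)) for ui in u])  # real part of empirical CF

def model_cf(u, sigma, lam, s_j):
    # CF of zero-drift Brownian motion plus compound Poisson with centered
    # normal jumps, over an interval of length dt (Lévy-Khintchine form).
    return np.exp(dt * (-0.5 * sigma**2 * u**2
                        + lam * (np.exp(-0.5 * s_j**2 * u**2) - 1.0)))

def objective(theta):
    return np.sum((ecf - model_cf(u, *np.abs(theta)))**2)

res = minimize(objective, x0=[0.02, 0.2, 0.02], method="Nelder-Mead")
print("estimated (sigma, lam, s_j):", np.abs(res.x))
```

The symmetric setup makes the characteristic function real, so matching the cosine part suffices; an asymmetric jump distribution would require the imaginary part too.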
By:  Christophe Ley; Yves-Caoimhin Swan 
Abstract:  We generalize the so-called density approach to Stein characterizations of probability distributions. We prove an elementary factorization property of the resulting Stein operator in terms of a generalized (standardized) score function. We use this result to connect Stein characterizations with information distances such as the generalized (standardized) Fisher information. 
Keywords:  density approach; generalized (standardized) Fisher information; generalized (standardized) score functions; information functionals; probability metrics 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:eca:wpaper:2013/102561&r=ecm 
By:  Charles A. Fleischman; John M. Roberts 
Abstract:  We construct new estimates of potential output and the output gap using a multivariate approach that allows an explicit role for measurement errors in the decomposition of real output. Because we include data on hours, output, employment, and the labor force, we are able to decompose our estimate of potential output into separate trends in labor productivity, labor-force participation, weekly hours, and the NAIRU. We find that labor-market variables, especially the unemployment rate, are the most informative individual indicators of the state of the business cycle. Conditional on including these measures, inflation is also very informative. Among measures of output, we find that although they add little to the identification of the cycle, the income-side measures of output are about as informative as the traditional product-side measures about the level of structural productivity and potential output. We also find that the output gap resulting from the recent financial crisis was very large, reaching 7 percent of output in the second half of 2009. 
Date:  2011 
URL:  http://d.repec.org/n?u=RePEc:fip:fedgfe:201146&r=ecm 
By:  Jean-Sébastien Pentecôte, University of Rennes 1 - CREM-CNRS 
Abstract:  Bayoumi and Eichengreen’s (BE, 1994) article has been very influential in the empirics of the core-periphery view of fixed exchange rate agreements. They rely on the basic AS-AD macroeconomic model to identify supply and demand shocks through long-run restrictions in vector autoregressions. Doing this should enable one to assess the size of such disturbances and the asymmetry between countries. While reference is usually made to Blanchard and Quah (BQ, 1989), it is shown here how this factorization has been modified by BE and how the two resulting decomposition schemes can be linked. Contrary to BE’s premise, relaxing the assumption of shocks of equal size is not just a matter of scale. The empirical properties of the exchange regime are modified, especially as regards the correlation of shocks. Given the VAR setting used in the related studies, it is also established that zero constraints on either instantaneous or long-run impulse responses provide identical results. An empirical assessment of the euro currency area over 1996–2008 illustrates these points. The recorded evidence suggests that non-zero restrictions imply slope coefficients of the AS and AD curves close to the values derived from New Keynesian models. 
Keywords:  Fixed exchange rates, core-periphery, long-run restrictions, structural VARs 
JEL:  C32 E13 F33 
Date:  2011–11 
URL:  http://d.repec.org/n?u=RePEc:tut:cremwp:201125&r=ecm 
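The Blanchard-Quah long-run identification that BE adapt can be sketched in a bivariate VAR: take the structural long-run impact matrix to be the Cholesky factor of the long-run covariance, so that one shock (demand) has no permanent effect on output. The VAR coefficients below are arbitrary illustrations, not estimates from the paper's euro-area application.

```python
import numpy as np

A1 = np.array([[0.5, 0.1], [0.0, 0.3]])            # hypothetical VAR(1) coefficients
Sigma = np.array([[1.0, 0.3], [0.3, 0.8]])         # reduced-form error covariance

IA_inv = np.linalg.inv(np.eye(2) - A1)             # cumulative (long-run) multiplier
long_run_cov = IA_inv @ Sigma @ IA_inv.T
C = np.linalg.cholesky(long_run_cov)               # lower triangular: C[0,1] = 0 imposes
                                                   # no long-run effect of shock 2 on var 1
B0 = (np.eye(2) - A1) @ C                          # implied impact responses
print("long-run impact matrix:\n", C)
print("B0 @ B0' reproduces Sigma:", np.allclose(B0 @ B0.T, Sigma))
```

With unit-variance structural shocks, `B0 @ B0.T` recovers the reduced-form covariance exactly, which is the check that the zero long-run constraint and the instantaneous decomposition are mutually consistent.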