
on Econometrics 
By:  Michael W. Brandt (Department of Finance, University of Pennsylvania, and NBER); Francis X. Diebold (Departments of Economics, Finance and Statistics, University of Pennsylvania, and NBER) 
Abstract:  We extend the important idea of range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator that is motivated by financial economic considerations (the absence of arbitrage), in addition to statistical considerations. We show that, unlike other univariate and multivariate volatility estimators, the range-based estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading. Finally, we provide an empirical example illustrating the value of the high-frequency sample path information contained in the range-based estimates in a multivariate GARCH framework. 
Keywords:  Range-based estimation, volatility, covariance, correlation, absence of arbitrage, exchange rates, stock returns, bond returns, bid-ask bounce, asynchronous trading 
Date:  2004–01–07 
URL:  http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200407&r=ecm 
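The univariate starting point for this line of work is the classic Parkinson (1980) range-based variance estimator, and a range-based covariance can be motivated by the polarization identity Cov(x, y) = [Var(x+y) - Var(x) - Var(y)] / 2, where no-arbitrage supplies a traded "sum" series whose range is observed (for exchange rates, the cross rate). A minimal sketch, assuming the Parkinson form rather than the paper's exact estimator:

```python
import numpy as np

def parkinson_variance(high, low):
    """Parkinson (1980) range-based variance estimate per period:
    sigma^2 = E[(ln(H/L))^2] / (4 ln 2), averaged over the sample."""
    log_range = np.log(np.asarray(high, float) / np.asarray(low, float))
    return np.mean(log_range ** 2) / (4.0 * np.log(2.0))

def range_covariance(high_sum, low_sum, high_x, low_x, high_y, low_y):
    """Polarization identity Cov(x, y) = [Var(x+y) - Var(x) - Var(y)] / 2,
    with each variance estimated from its own observed range. For exchange
    rates, no-arbitrage implies the log cross rate equals x + y, so its
    high/low range identifies Var(x+y) without any extra data on the sum."""
    return 0.5 * (parkinson_variance(high_sum, low_sum)
                  - parkinson_variance(high_x, low_x)
                  - parkinson_variance(high_y, low_y))
```

The range enters only through ln(H/L), which is why the estimator is insensitive to bid-ask bounce at the level of individual trades.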
By:  Peter F. Christoffersen (McGill University and CIRANO); Francis X. Diebold (University of Pennsylvania and NBER) 
Abstract:  We consider three sets of phenomena that feature prominently – and separately – in the financial economics literature: conditional mean dependence (or lack thereof) in asset returns, dependence (and hence forecastability) in asset return signs, and dependence (and hence forecastability) in asset return volatilities. We show that they are very much interrelated, and we explore the relationships in detail. Among other things, we show that: (a) Volatility dependence produces sign dependence, so long as expected returns are nonzero, so that one should expect sign dependence, given the overwhelming evidence of volatility dependence; (b) The standard finding of little or no conditional mean dependence is entirely consistent with a significant degree of sign dependence and volatility dependence; (c) Sign dependence is not likely to be found via analysis of sign autocorrelations, runs tests, or traditional market timing tests, because of the special nonlinear nature of sign dependence; (d) Sign dependence is not likely to be found in very high-frequency (e.g., daily) or very low-frequency (e.g., annual) returns; instead, it is more likely to be found at intermediate return horizons; (e) Sign dependence is very much present in actual U.S. equity returns, and its properties match closely our theoretical predictions; (f) The link between volatility forecastability and sign forecastability remains intact in conditionally non-Gaussian environments, as for example with time-varying conditional skewness and/or kurtosis. 
Date:  2004–01–08 
URL:  http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200408&r=ecm 
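Point (a) has a simple illustration in the conditionally Gaussian case: if r ~ N(mu, sigma^2) given today's information, then Pr(r > 0) = Phi(mu / sigma), so a forecastable sigma makes the sign forecastable whenever mu is nonzero. A minimal numerical sketch (Gaussian case only; the abstract notes the link also survives non-Gaussian settings):

```python
from scipy.stats import norm

def sign_prob(mu, sigma):
    """Pr(r > 0) for conditionally Gaussian returns r ~ N(mu, sigma^2)."""
    return norm.cdf(mu / sigma)

# With mu = 0 the sign probability is 0.5 regardless of volatility, so
# there is no sign dependence without a nonzero expected return. With
# mu > 0, higher volatility pulls Pr(r > 0) back toward 0.5, so a
# time-varying sigma induces time-varying (forecastable) sign probabilities.
mu = 0.05  # illustrative nonzero expected return
probs = {sigma: sign_prob(mu, sigma) for sigma in (0.5, 1.0, 2.0)}
```

The mechanism also explains point (d): at very short horizons mu/sigma is tiny and at very long horizons volatility dynamics die out, so sign forecastability peaks at intermediate horizons.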
By:  Anat Bracha; Jeremy Gray (Dept. of Psychology, Yale University); Rustam Ibragimov; Boaz Nadler (Dept. of Mathematics, Yale University); Dmitry Shapiro; Glena Ames (Cowles Foundation, Yale University); Donald J. Brown (Cowles Foundation, Yale University) 
Abstract:  This paper proposes nonparametric statistical procedures for analyzing discrete choice models of affective decision making. We make two contributions to the literature on behavioral economics. Namely, we propose a procedure for eliciting the existence of a Nash equilibrium in an intrapersonal potential game, as well as randomized sign tests for dependent observations on game-theoretic models of affective decision making. This methodology is illustrated in the context of a hypothetical experiment: the Casino Game. 
Keywords:  Behavioral economics, Affective decision making, Intrapersonal potential games, Randomized sign tests, Dependent observations, Adapted sequences, Martingale-difference sequences 
JEL:  C12 C32 C35 C72 C91 D11 D81 
Date:  2005–06 
URL:  http://d.repec.org/n?u=RePEc:cwl:cwldpp:1526&r=ecm 
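One classical building block behind sign tests for dependent observations: if the data form a conditionally symmetric martingale-difference sequence, the signs are i.i.d. Bernoulli(1/2) even though the levels are serially dependent, so an exact binomial sign test remains valid. A minimal sketch of that fact on a hypothetical GARCH-type sequence (not the paper's Casino Game design):

```python
import numpy as np
from scipy.stats import binomtest

rng = np.random.default_rng(0)

# A dependent but conditionally symmetric sequence: each term is a
# symmetric shock scaled by past information, hence a martingale difference.
n = 500
sigma2, x = np.empty(n), np.empty(n)
sigma2[0] = 1.0
for t in range(n):
    if t > 0:
        sigma2[t] = 0.2 + 0.7 * sigma2[t - 1] + 0.1 * x[t - 1] ** 2
    x[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Under conditional symmetry the signs are i.i.d. Bernoulli(1/2) despite
# the serial dependence in the levels, so an exact binomial test applies.
n_pos = int(np.sum(x > 0))
p_value = binomtest(n_pos, n, 0.5).pvalue
```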
By:  Daiji Kawaguchi; Hisahiro Naito 
Abstract:  We propose an efficient moment estimator for the probit model with a continuous endogenous regressor. The estimation can be readily implemented using a standard statistical package that can estimate a nonlinear system two-stage least squares (instrumental variable) estimator. 
Keywords:  Probit, Continuous endogenous regressor, Moment estimation 
JEL:  C25 
Date:  2005–06 
URL:  http://d.repec.org/n?u=RePEc:hst:hstdps:d05106&r=ecm 
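The setting can be illustrated with the standard two-step control-function approach of Rivers and Vuong (1988); this is not the paper's moment estimator, but it targets the same model: a probit whose continuous regressor is endogenous, with an instrument available. A simulated sketch (all numbers illustrative):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)

# Simulated model: y = 1{0.5 + 1.0*x + u > 0}, with x endogenous because
# u = 0.6*v + 0.8*e while x = z + v; z is a valid instrument.
n = 5000
z = rng.normal(size=n)
v = rng.normal(size=n)
u = 0.6 * v + 0.8 * rng.normal(size=n)
x = z + v
y = (0.5 + 1.0 * x + u > 0).astype(float)

# Step 1: first-stage OLS of x on (1, z); keep the residual vhat.
Z = np.column_stack([np.ones(n), z])
vhat = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Step 2: probit of y on (1, x, vhat). Conditioning on vhat absorbs the
# endogeneity; the coefficients are the structural ones rescaled by 1/0.8
# (the remaining error scale), i.e. roughly (0.625, 1.25, 0.75).
X = np.column_stack([np.ones(n), x, vhat])

def neg_loglik(beta):
    q = 2.0 * y - 1.0  # +1/-1 coding of the outcome for the probit likelihood
    return -np.sum(norm.logcdf(q * (X @ beta)))

beta_hat = minimize(neg_loglik, np.zeros(3), method="BFGS").x
```

A naive probit of y on (1, x) alone would be inconsistent here; the significance of the vhat coefficient is itself a test of exogeneity.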
By:  Roy Cerqueti and Mauro Costantini 
Abstract:  This paper provides a further generalization of cointegration tests in a nonparametric setting. We adopt Bierens' approach in order to give an extension for I(d) processes, with fixed integer d. A generalized eigenvalue problem is solved, and the test statistics involved are obtained from two matrices that are independent of the data-generating process. The mathematical tools we adopt are related to the asymptotic theory of stochastic processes. The key point of our work is distinguishing between the stationary and nonstationary parts of an integrated process. 
Keywords:  Multivariate analysis, Nonparametric methods, Cointegration, Asymptotic properties. 
JEL:  C14 C32 
Date:  2005–07–12 
URL:  http://d.repec.org/n?u=RePEc:mol:ecsdps:esdp05026&r=ecm 
By:  Timothy Conley; Christopher Taber 
Abstract:  Difference-in-differences methods have become very popular in applied work. This paper provides a new method for inference in these models when there are a small number of policy changes. This situation occurs in many implementations of these estimators. Identification of the key parameter typically arises when a group "changes" some particular policy. The asymptotic approximations that are typically employed assume that the number of cross-sectional groups, N, times the number of time periods, T, is large. However, even when N or T is large, the number of actual policy changes observed in the data is often very small. In this case, we argue that point estimators of treatment effects should not be thought of as being consistent and that the standard methods that researchers use to perform inference in these models are not appropriate. We develop an alternative approach to inference under the assumption that there are a finite number of policy changes in the data, using asymptotic approximations as the number of non-changing groups gets large. In this situation we cannot obtain a consistent point estimator for the key treatment effect parameter. However, we can consistently estimate the finite-sample distribution of the treatment effect estimator, up to the unknown parameter itself. This allows us to perform hypothesis tests and construct confidence intervals. For expositional and motivational purposes, we focus on the difference-in-differences case, but our approach should be appropriate more generally in treatment effect models which employ a large number of controls, but a small number of treatments. We demonstrate the use of the approach by analyzing the effect of college merit aid programs on college attendance. We show that in some cases the standard approach can give misleading results. 
Date:  2005–07 
URL:  http://d.repec.org/n?u=RePEc:nbr:nberte:0312&r=ecm 
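The flavor of the proposed inference can be sketched in the simplest case: one treated group, many never-treated groups, two periods. The point estimate never becomes consistent, because the treated group's own shock does not average out, but the distribution of that shock can be estimated from the placebo statistics of the control groups. A stylized simulation (much simplified from the paper's general setup):

```python
import numpy as np

rng = np.random.default_rng(1)

# Stylized setting: one treated group, many never-treated groups,
# two periods; each group contributes one pre/post change.
n_controls, alpha = 50, 2.0
control_changes = rng.normal(size=n_controls)   # changes for control groups
treated_change = alpha + rng.normal()           # treated group's change

# Difference-in-differences point estimate with a single policy change.
dd_hat = treated_change - control_changes.mean()

# The treated group's own shock never averages out, so dd_hat is not
# consistent as n_controls grows; but the shock's distribution can be
# estimated from the controls' centered (placebo) changes, giving a
# confidence interval for alpha despite the inconsistent point estimate.
placebo = control_changes - control_changes.mean()
lower = dd_hat - np.quantile(placebo, 0.975)
upper = dd_hat - np.quantile(placebo, 0.025)
```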
By:  Christopher L. Gilbert (Università degli Studi di Trento); Duo Qin (Queen Mary, University of London) 
Abstract:  We characterize modern econometrics in terms of the emergence of a widely accepted analytical framework. A major theme which dominated much of the debate through the century was whether and how econometric models can reflect theory-generated economic structures. In the period prior to the Second World War, economists adopted a wide variety of analytical methods, some ad hoc but others reflecting advances in statistical methodology. Business cycle analysis and demand analysis were the two major areas in which statistical theory was employed. Methods became increasingly formalized, but problems of data adequacy, estimation and identification were not always well distinguished. During and immediately after the war, Cowles Commission research sought to base econometrics on autonomous probabilistic models specified in terms of underlying structural parameters. Least squares would not normally be consistent in such models, and maximum likelihood estimation was to be preferred. Subsequently, however, the pendulum swung back towards least-squares-based methods, and this was reflected in the textbook expositions of what was accepted as standard econometrics in the late sixties and early seventies. That paradigm was then undermined by the challenges posed by rational expectations modelling, which called standard identification assumptions into question, and by the poor forecasting performance of many macroeconomic models by comparison with black box time series competitors. The result was a revival of nonstructural modelling, particularly in the analysis of macroeconomic data. 
Keywords:  Econometrics, History, Estimation, Identification. 
JEL:  B23 C10 
Date:  2005–07 
URL:  http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp544&r=ecm 
By:  Yves Atchade (Department of Mathematics and Statistics, University of Ottawa and LRSP) 
Abstract:  This paper proposes an adaptive version of the Metropolis-adjusted Langevin algorithm with a truncated drift (T-MALA). The scale parameter and the covariance matrix of the proposal kernel of the algorithm are simultaneously and recursively updated in order to reach the optimal acceptance rate of 0.574 (see Roberts and Rosenthal (2001)) and to estimate and use the correlation structure of the target distribution. We develop some convergence results for the algorithm. A simulation example is presented. 
Keywords:  Markov Chain Monte Carlo, Stochastic approximation algorithms, Metropolis-adjusted Langevin algorithm, Geometric rate of convergence. 
JEL:  C10 C40 
Date:  2005–03–01 
URL:  http://d.repec.org/n?u=RePEc:pqs:wpaper:0272005&r=ecm 
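A stripped-down sketch of the scale-adaptation half of the algorithm (the covariance adaptation is omitted): truncate the Langevin drift so the proposal mean stays bounded, then nudge the log step size by stochastic approximation toward the 0.574 target acceptance rate. All tuning constants here are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: standard 2-d Gaussian, via its log-density and gradient.
def log_pi(x):
    return -0.5 * x @ x

def grad_log_pi(x):
    return -x

def truncated_drift(g, delta=100.0):
    """Truncate the Langevin drift so the proposal mean stays bounded."""
    return delta * g / max(delta, np.linalg.norm(g))

d, n_iter, target = 2, 20000, 0.574
x, log_step = np.zeros(d), 0.0
accepts = 0
for n in range(1, n_iter + 1):
    step = np.exp(log_step)
    # Langevin proposal with truncated drift.
    mu_x = x + 0.5 * step * truncated_drift(grad_log_pi(x))
    y = mu_x + np.sqrt(step) * rng.normal(size=d)
    mu_y = y + 0.5 * step * truncated_drift(grad_log_pi(y))
    # Metropolis-Hastings ratio; Gaussian proposal normalizers cancel.
    log_q_xy = -np.sum((y - mu_x) ** 2) / (2.0 * step)
    log_q_yx = -np.sum((x - mu_y) ** 2) / (2.0 * step)
    log_alpha = min(0.0, log_pi(y) - log_pi(x) + log_q_yx - log_q_xy)
    if np.log(rng.uniform()) < log_alpha:
        x, accepts = y, accepts + 1
    # Robbins-Monro update of the log scale toward the target rate.
    log_step += (np.exp(log_alpha) - target) / n
acc_rate = accepts / n_iter
```

Adapting log(step) rather than step keeps the scale positive, and the 1/n gain satisfies the usual stochastic-approximation conditions.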