nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒07‒18
eight papers chosen by
Sune Karlsson
Orebro University

  1. A No-Arbitrage Approach to Range-Based Estimation of Return Covariances and Correlations By Michael W. Brandt; Francis X. Diebold
  2. Financial Asset Returns, Direction-of-Change Forecasting, and Volatility Dynamics By Peter F. Christoffersen; Francis X. Diebold
  3. Randomized Sign Test for Dependent Observations on Discrete Choice under Risk By Anat Bracha; Jeremy Gray; Rustam Ibragimov; Boaz Nadler; Dmitry Shapiro; Glena Ames; Donald J. Brown
  4. The efficient moment estimation of the probit model with an endogenous continuous regressor By Daiji Kawaguchi; Hisahiro Naito
  5. Generalization of a nonparametric co-integration analysis for multivariate integrated processes of an integer order By Roy Cerqueti; Mauro Costantini
  6. Inference with "Difference in Differences" with a Small Number of Policy Changes By Timothy Conley; Christopher Taber
  7. The First Fifty Years of Modern Econometrics By Christopher L. Gilbert; Duo Qin
  8. An Adaptive Version for the Metropolis Adjusted Langevin Algorithm with a Truncated Drift By Yves Atchade

  1. By: Michael W. Brandt (Department of Finance, University of Pennsylvania, and NBER); Francis X. Diebold (Departments of Economics, Finance and Statistics, University of Pennsylvania, and NBER)
    Abstract: We extend the important idea of range-based volatility estimation to the multivariate case. In particular, we propose a range-based covariance estimator that is motivated by financial economic considerations (the absence of arbitrage), in addition to statistical considerations. We show that, unlike other univariate and multivariate volatility estimators, the range-based estimator is highly efficient yet robust to market microstructure noise arising from bid-ask bounce and asynchronous trading. Finally, we provide an empirical example illustrating the value of the high-frequency sample path information contained in the range-based estimates in a multivariate GARCH framework.
    Keywords: Range-based estimation, volatility, covariance, correlation, absence of arbitrage, exchange rates, stock returns, bond returns, bid-ask bounce, asynchronous trading
    Date: 2004–01–07
    URL: http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200407&r=ecm
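    A minimal sketch of the mechanics in item 1, in Python (our illustration, not the authors' code: the Parkinson range estimator is standard, and the covariance step combines the polarization identity with the no-arbitrage link between a cross rate and the sum of two log exchange rates; function names are ours):

      import numpy as np

      def parkinson_var(high, low):
          # Parkinson (1980) range-based variance per period:
          # (log(high/low))^2 / (4 log 2).
          return np.log(high / low) ** 2 / (4.0 * np.log(2.0))

      def range_based_cov(hx, lx, hy, ly, hxy, lxy):
          # If no-arbitrage makes the cross rate the sum of two log
          # rates, the polarization identity recovers the covariance:
          # Cov(x, y) = [Var(x + y) - Var(x) - Var(y)] / 2.
          return 0.5 * (parkinson_var(hxy, lxy)
                        - parkinson_var(hx, lx)
                        - parkinson_var(hy, ly))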
  2. By: Peter F. Christoffersen (McGill University and CIRANO); Francis X. Diebold (University of Pennsylvania and NBER)
    Abstract: We consider three sets of phenomena that feature prominently – and separately – in the financial economics literature: conditional mean dependence (or lack thereof) in asset returns, dependence (and hence forecastability) in asset return signs, and dependence (and hence forecastability) in asset return volatilities. We show that they are very much interrelated, and we explore the relationships in detail. Among other things, we show that: (a) Volatility dependence produces sign dependence, so long as expected returns are nonzero, so that one should expect sign dependence, given the overwhelming evidence of volatility dependence; (b) The standard finding of little or no conditional mean dependence is entirely consistent with a significant degree of sign dependence and volatility dependence; (c) Sign dependence is not likely to be found via analysis of sign autocorrelations, runs tests, or traditional market timing tests, because of the special nonlinear nature of sign dependence; (d) Sign dependence is not likely to be found in very high-frequency (e.g., daily) or very low-frequency (e.g., annual) returns; instead, it is more likely to be found at intermediate return horizons; (e) Sign dependence is very much present in actual U.S. equity returns, and its properties match closely our theoretical predictions; (f) The link between volatility forecastability and sign forecastability remains intact in conditionally non-Gaussian environments, as for example with time-varying conditional skewness and/or kurtosis.
    Date: 2004–01–08
    URL: http://d.repec.org/n?u=RePEc:cfs:cfswop:wp200408&r=ecm
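    Point (a) of item 2 is easy to see in a conditionally Gaussian setting; a small sketch (our illustration, assuming returns are conditionally N(mu, sigma^2)):

      import numpy as np
      from scipy.stats import norm

      def prob_positive(mu, sigma):
          # If r_{t+1} | today ~ N(mu, sigma^2), then
          # Pr(r_{t+1} > 0) = 1 - Phi(-mu / sigma).  With mu != 0
          # fixed, time variation in sigma moves this probability,
          # so volatility dependence induces sign dependence.
          return 1.0 - norm.cdf(-mu / np.asarray(sigma, float))

      print(prob_positive(0.01, [0.05, 0.10, 0.20]))
      # [0.579 0.540 0.520] -- drifts toward 1/2 as volatility rises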
  3. By: Anat Bracha; Jeremy Gray (Dept. of Psychology, Yale University); Rustam Ibragimov; Boaz Nadler (Dept. of Mathematics, Yale University); Dmitry Shapiro; Glena Ames (Cowles Foundation, Yale University); Donald J. Brown (Cowles Foundation, Yale University)
    Abstract: This paper proposes nonparametric statistical procedures for analyzing discrete choice models of affective decision making. We make two contributions to the literature on behavioral economics. Namely, we propose a procedure for eliciting the existence of a Nash equilibrium in an intrapersonal, potential game as well as randomized sign tests for dependent observations on game-theoretic models of affective decision making. This methodology is illustrated in the context of a hypothetical experiment -- the Casino Game.
    Keywords: Behavioral economics, Affective decision making, Intrapersonal potential games, Randomized sign tests, Dependent observations, Adapted sequences, Martingale-difference sequences
    JEL: C12 C32 C35 C72 C91 D11 D81
    Date: 2005–06
    URL: http://d.repec.org/n?u=RePEc:cwl:cwldpp:1526&r=ecm
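    As background to item 3, here is the classical sign test that the paper's randomized, dependent-observations procedure generalizes (a sketch under i.i.d. sampling, not the paper's test):

      import numpy as np
      from scipy.stats import binom

      def sign_test_pvalue(diffs):
          # H0: differences are symmetric about zero, so each nonzero
          # sign is +1 with probability 1/2; compare the count of
          # positive signs to Binomial(n, 1/2), two-sided.
          d = np.asarray(diffs, float)
          d = d[d != 0]
          n, k = d.size, int((d > 0).sum())
          p = 2 * min(binom.cdf(k, n, 0.5),
                      1 - binom.cdf(k - 1, n, 0.5))
          return min(1.0, p)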
  4. By: Daiji Kawaguchi; Hisahiro Naito
    Abstract: We propose an efficient moment estimator for the probit model with a continuous endogenous regressor. The estimator can be implemented readily in any standard statistical package that supports nonlinear two-stage least squares (instrumental variables) estimation.
    Keywords: Probit, Continuous endogenous regressor, Moment estimation
    JEL: C25
    Date: 2005–06
    URL: http://d.repec.org/n?u=RePEc:hst:hstdps:d05-106&r=ecm
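    Schematically, a moment estimator of the kind in item 4 minimizes a quadratic form in sample moments. The sketch below shows only the generic nonlinear-IV mechanics, with an identity weight and illustrative moments Z'(y - Phi(X beta)); the paper's contribution is the efficient moment conditions and weighting, which we do not reproduce:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def moment_probit(y, X, Z, beta0):
          # Pick beta to set the sample moments
          # g(beta) = Z'(y - Phi(X beta)) / n close to zero.
          def objective(beta):
              g = Z.T @ (y - norm.cdf(X @ beta)) / len(y)
              return g @ g
          return minimize(objective, beta0, method="BFGS").x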
  5. By: Roy Cerqueti; Mauro Costantini
    Abstract: This paper provides a further generalization of co-integration tests in a nonparametric setting. We adopt Bierens' approach and extend it to I(d) processes, with d a fixed integer. A generalized eigenvalue problem is solved, and the test statistics involved are obtained from two matrices that are independent of the data generating process. The mathematical tools we adopt are drawn from the asymptotic theory of stochastic processes. The key point of our work is the distinction between the stationary and non-stationary parts of an integrated process.
    Keywords: Multivariate analysis, Nonparametric methods, Co-integration, Asymptotic properties.
    JEL: C14 C32
    Date: 2005–07–12
    URL: http://d.repec.org/n?u=RePEc:mol:ecsdps:esdp05026&r=ecm
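    The computational core of item 5 is a generalized eigenvalue problem; schematically (placeholder matrices only, since the paper's construction of the two matrices is the substance of the method):

      import numpy as np
      from scipy.linalg import eigh

      # Test statistics are functions of the lambdas solving
      # det(A - lambda * B) = 0 for two symmetric matrices built
      # from the data.
      A = np.array([[2.0, 0.5], [0.5, 1.0]])  # placeholder
      B = np.array([[1.0, 0.2], [0.2, 1.0]])  # placeholder
      print(eigh(A, B, eigvals_only=True))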
  6. By: Timothy Conley; Christopher Taber
    Abstract: Difference in differences methods have become very popular in applied work. This paper provides a new method for inference in these models when there are a small number of policy changes. This situation occurs in many implementations of these estimators. Identification of the key parameter typically arises when a group "changes" some particular policy. The asymptotic approximations that are typically employed assume that the number of cross sectional groups, N, times the number of time periods, T, is large. However, even when N or T is large, the number of actual policy changes observed in the data is often very small. In this case, we argue that point estimators of treatment effects should not be thought of as being consistent and that the standard methods that researchers use to perform inference in these models are not appropriate. We develop an alternative approach to inference under the assumption that there are a finite number of policy changes in the data, using asymptotic approximations as the number of non-changing groups gets large. In this situation we cannot obtain a consistent point estimator for the key treatment effect parameter. However, we can consistently estimate the finite-sample distribution of the treatment effect estimator, up to the unknown parameter itself. This allows us to perform hypothesis tests and construct confidence intervals. For expositional and motivational purposes, we focus on the difference in differences case, but our approach should be appropriate more generally in treatment effect models which employ a large number of controls, but a small number of treatments. We demonstrate the use of the approach by analyzing the effect of college merit aid programs on college attendance. We show that in some cases the standard approach can give misleading results.
    Date: 2005–07
    URL: http://d.repec.org/n?u=RePEc:nbr:nberte:0312&r=ecm
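    For reference, the textbook two-group, two-period point estimator that item 6's inference results concern (a sketch, not the authors' code):

      import numpy as np

      def did_estimate(y, treated, post):
          # (treated post - treated pre) - (control post - control pre).
          # The paper's point: with few actual policy changes this
          # point estimator is not consistent, and inference must be
          # built on its finite-sample distribution instead.
          y = np.asarray(y, float)
          g = np.asarray(treated, bool)
          p = np.asarray(post, bool)
          return ((y[g & p].mean() - y[g & ~p].mean())
                  - (y[~g & p].mean() - y[~g & ~p].mean()))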
  7. By: Christopher L. Gilbert (Università degli Studi di Trento); Duo Qin (Queen Mary, University of London)
    Abstract: We characterize modern econometrics in terms of the emergence of a widely accepted analytical framework. A major theme which dominated much of the debate through the century was whether and how econometric models can reflect theory-generated economic structures. In the period prior to the Second World War, economists adopted a wide variety of analytical methods, some ad hoc but others reflecting advances in statistical methodology. Business cycle analysis and demand analysis were the two major areas in which statistical theory was employed. Methods became increasingly formalized but problems of data adequacy, estimation and identification were not always well distinguished. During and immediately after the war, Cowles Commission research sought to base econometrics on autonomous probabilistic models specified in terms of underlying structural parameters. Least squares would not normally be consistent in such models and maximum likelihood estimation was to be preferred. Subsequently, however, the pendulum swung back towards least squares-based methods and this was reflected in the textbook expositions of what was accepted as standard econometrics in the late sixties and early seventies. That paradigm was then undermined by rational expectations modelling, which called standard identification assumptions into question, and by the poor forecasting performance of many macroeconomic models by comparison with black box time series competitors. The result was a revival of non-structural modelling, particularly in the analysis of macroeconomic data.
    Keywords: Econometrics, History, Estimation, Identification.
    JEL: B23 C10
    Date: 2005–07
    URL: http://d.repec.org/n?u=RePEc:qmw:qmwecw:wp544&r=ecm
  8. By: Yves Atchade (Department of Mathematics and Statistics, University of Ottawa and LRSP)
    Abstract: This paper proposes an adaptive version of the Metropolis adjusted Langevin algorithm with a truncated drift (T-MALA). The scale parameter and the covariance matrix of the proposal kernel of the algorithm are simultaneously and recursively updated in order to reach the optimal acceptance rate of 0.574 (see Roberts and Rosenthal (2001)) and to estimate and use the correlation structure of the target distribution. We develop some convergence results for the algorithm. A simulation example is presented.
    Keywords: Markov Chain Monte Carlo, Stochastic approximation algorithms, Metropolis Adjusted Langevin algorithm, geometric rate of convergence.
    JEL: C10 C40
    Date: 2005–03–01
    URL: http://d.repec.org/n?u=RePEc:pqs:wpaper:0272005&r=ecm
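    A compact sketch of item 8's main loop, reconstructed from the abstract (truncated-drift MALA with a Robbins-Monro update of the scale toward the 0.574 target; the paper also adapts the full proposal covariance, which we omit here):

      import numpy as np

      def adaptive_tmala(logpi, grad_logpi, x0, n_iter=5000, delta=10.0):
          rng = np.random.default_rng(0)
          x = np.asarray(x0, float)
          sigma = 1.0
          samples = []
          def drift(z):
              # Truncated drift: cap the norm of grad log pi at delta.
              g = grad_logpi(z)
              return delta * g / max(delta, np.linalg.norm(g))
          def logq(b, a):
              # Log Gaussian proposal density (up to a constant):
              # b ~ N(a + (sigma^2 / 2) drift(a), sigma^2 I).
              diff = b - a - 0.5 * sigma ** 2 * drift(a)
              return -(diff @ diff) / (2.0 * sigma ** 2)
          for t in range(1, n_iter + 1):
              prop = (x + 0.5 * sigma ** 2 * drift(x)
                      + sigma * rng.standard_normal(x.size))
              log_a = logpi(prop) - logpi(x) + logq(x, prop) - logq(prop, x)
              alpha = np.exp(min(0.0, log_a))  # acceptance probability
              if rng.random() < alpha:
                  x = prop
              # Robbins-Monro step pushing acceptance toward 0.574.
              sigma *= np.exp((alpha - 0.574) / t ** 0.6)
              samples.append(x.copy())
          return np.array(samples)

    For a standard Gaussian target, for instance, one would pass logpi = lambda z: -0.5 * z @ z and grad_logpi = lambda z: -z.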

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.