nep-ecm New Economics Papers
on Econometrics
Issue of 2007‒08‒27
nine papers chosen by
Sune Karlsson
Orebro University

  1. A New Approach to Drawing States in State Space Models By William J. McCausland; Shirley Miller; Denis Pelletier
  2. Estimation of Tobit Type Censored Demand Systems: A Comparison of Estimators By Mikkel Barslund
  3. Wild Bootstrap Tests for IV Regression By Russell Davidson; James G. MacKinnon
  4. A review of instrumental variables estimation in the applied health sciences By Paul Grootendorst
  5. Forecasting realized volatility models: the benefits of bagging and nonlinear specifications By Eric Hillebrand; Marcelo Cunha Medeiros
  6. Financial contagion and tests using instrumental variables By Andreas Pick
  7. A Short Note on the Theme of Too Many Instruments By David Roodman
  8. Bounding ATE with ITT By Ito, Seiro
  9. The Anatomy of the Firm Size Distribution: The Evolution of its Variance and Skewness By Peter Huber; Michael Pfaffermayr

  1. By: William J. McCausland (Département de sciences économiques, Université de Montréal); Shirley Miller (Département de sciences économiques, Université de Montréal); Denis Pelletier (Department of Economics, North Carolina State University)
    Abstract: We introduce a new method for drawing state variables in Gaussian state space models from their conditional distribution given parameters and observations. Unlike standard methods, our method does not involve Kalman filtering. We show that for some important cases, our method is computationally more efficient than standard methods in the literature. We consider two applications of our method.
    Keywords: State space models, Stochastic volatility, Count data
    JEL: C11 C13 C15 C32 C63
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:ncs:wpaper:014&r=ecm
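The paper draws the whole state vector at once rather than running a Kalman-filter-based simulation smoother. As background only (this is not the authors' algorithm, and the names are my own), a standard way to draw a Gaussian vector given its precision matrix, which is block-banded in state space models, is via a Cholesky factor of that precision:

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve, solve_triangular

def draw_gaussian_from_precision(b, Omega, rng):
    """Draw x ~ N(mu, Omega^{-1}), where mu solves Omega @ mu = b.

    In state space models Omega is block-banded, so the factorization
    can be done in O(T) time; a dense Cholesky is used here for clarity.
    """
    c, low = cho_factor(Omega, lower=True)   # Omega = L @ L.T
    mu = cho_solve((c, low), b)              # conditional mean
    z = rng.standard_normal(len(b))
    # x = mu + L^{-T} z has covariance (L @ L.T)^{-1} = Omega^{-1}
    return mu + solve_triangular(c, z, lower=True, trans='T')
```

A sketch under stated assumptions, not the paper's method: the paper's contribution concerns how to exploit this structure efficiently for stochastic volatility and count-data models.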
  2. By: Mikkel Barslund (Department of Economics, University of Copenhagen)
    Abstract: Recently a number of authors have suggested estimating censored demand systems as a system of multivariate Tobit equations, employing a Quasi Maximum Likelihood (QML) estimator based on bivariate Tobit models. In this paper I study the efficiency of this QML estimator relative to the asymptotically more efficient Simulated ML (SML) estimator in the context of a censored Almost Ideal demand system. Further, a simpler QML estimator based on the sum of univariate Tobit models is introduced. A Monte Carlo simulation comparing the three estimators is performed on three different sample sizes. The QML estimators perform well in the presence of the moderate-sized error correlation coefficients often found in empirical studies. With larger absolute correlation coefficients, the SML estimator is found to be superior. The paper lends support to the general use of the QML estimators and points towards the use of simple estimators for more general censored systems of equations.
    Keywords: censored demand system; Monte Carlo; quasi maximum likelihood; simulated maximum likelihood
    JEL: D12 C15 C34
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:kud:kuiedp:0716&r=ecm
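The simpler QML estimator mentioned in the abstract sums univariate Tobit log-likelihoods across equations. A minimal sketch of the univariate building block, censored at zero (my own illustration, not the paper's code):

```python
import numpy as np
from scipy.stats import norm

def tobit_loglik(beta, sigma, X, y):
    """Log-likelihood of a univariate Tobit model censored at zero.

    Censored observations (y == 0) contribute P(latent <= 0) = Phi(-x'beta/sigma);
    uncensored observations contribute the normal density of the residual.
    """
    xb = X @ beta
    censored = y <= 0
    ll = np.where(
        censored,
        norm.logcdf(-xb / sigma),
        norm.logpdf(y, loc=xb, scale=sigma),
    )
    return float(ll.sum())
```

Summing this objective over the equations of a demand system, and maximizing, gives the univariate-Tobit QML estimator the paper compares against the bivariate QML and SML alternatives.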
  3. By: Russell Davidson (McGill University); James G. MacKinnon (Queen's University)
    Abstract: We propose a wild bootstrap procedure for linear regression models estimated by instrumental variables. Like other bootstrap procedures that we have proposed elsewhere, it uses efficient estimates of the reduced-form equation(s). Unlike them, it takes account of possible heteroskedasticity of unknown form. We apply this procedure to t tests, including heteroskedasticity-robust t tests, and provide simulation evidence that it works far better than older methods, such as the pairs bootstrap. We also show how to obtain reliable confidence intervals by inverting bootstrap tests. An empirical example illustrates the utility of these procedures.
    Keywords: Instrumental variables, two-stage least squares, wild bootstrap, pairs bootstrap, residual bootstrap, weak instruments, confidence intervals
    JEL: C12 C15 C30
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:qed:wpaper:1135&r=ecm
  4. By: Paul Grootendorst
    Abstract: Health scientists often use observational data to estimate treatment effects when controlled experiments are not feasible. A limitation of observational research is non-random selection of subjects into different treatments, potentially leading to selection bias. The two commonly used solutions to this problem – covariate adjustment and fully parametric models – are limited by strong and untestable assumptions. Instrumental variables (IV) estimation can be a viable alternative. In this paper, I review examples of the application of IV in the health and social sciences, show how the IV estimator works, discuss the factors that affect its performance, review how the interpretation of the IV estimator changes when treatment effects vary by individual, and consider the application of IV to nonlinear models.
    Keywords: instrumental variables, treatment effects, health outcomes
    JEL: C31 I12
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:mcm:sedapp:215&r=ecm
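As background for how the IV estimator works, the textbook two-stage least squares (2SLS) computation can be sketched as follows (an illustration of the standard formula, not code from the review): regress each endogenous column of X on the instruments Z, then regress y on the fitted values, which yields beta = (X'P_Z X)^{-1} X'P_Z y with P_Z the projection onto the column space of Z.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS: project X on the instruments Z, then regress y on the fit.

    Equivalent to beta = (X' P_Z X)^{-1} X' P_Z y.
    """
    # first stage: fitted values of each column of X from the instruments
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # second stage: OLS of y on the fitted values
    return np.linalg.lstsq(X_hat, y, rcond=None)[0]
```

In a simulation with an endogenous regressor, OLS is biased while 2SLS with a valid instrument recovers the true coefficient, which is the point the review develops.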
  5. By: Eric Hillebrand (Department of Economics, Louisiana State University); Marcelo Cunha Medeiros (Department of Economics, PUC-Rio)
    Abstract: We forecast daily realized volatilities with linear and nonlinear models and evaluate the benefits of bootstrap aggregation (bagging) in producing more precise forecasts. We consider the linear autoregressive (AR) model, the Heterogeneous Autoregressive model (HAR), and a non-linear HAR model based on a neural network specification that allows for logistic transition effects (NNHAR). The models and the bagging schemes are applied to the realized volatility time series of the S&P500 index from 3-Jan-2000 through 30-Dec-2005. Our main findings are: (1) For the HAR model, bagging successfully averages over the randomness of variable selection; however, when the NN model is considered, there is no clear benefit from using bagging; (2) including past returns in the models improves the forecast precision; and (3) the NNHAR model outperforms the linear alternatives.
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:rio:texdis:547&r=ecm
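Bagging here means refitting the model on bootstrap resamples, applying the same variable-selection rule on each resample, and averaging the resulting forecasts, so the randomness of selection is averaged out. A simplified pretest-AR illustration (my own sketch loosely in the spirit of bagged pretest forecasts, not the paper's HAR/NNHAR setup):

```python
import numpy as np

def make_lags(y, p):
    """Target y_t with design columns [1, y_{t-1}, ..., y_{t-p}]."""
    T = len(y)
    target = y[p:]
    X = np.column_stack([np.ones(T - p)] +
                        [y[p - k:T - k] for k in range(1, p + 1)])
    return target, X

def bagged_forecast(y, p=3, B=100, t_crit=1.96, seed=0):
    """Average one-step forecasts of a pretest AR(p) over bootstrap resamples."""
    rng = np.random.default_rng(seed)
    target, X = make_lags(y, p)
    n, k = X.shape
    x_new = np.concatenate([[1.0], y[-1:-p - 1:-1]])  # [1, y_T, ..., y_{T-p+1}]
    fcasts = []
    for _ in range(B):
        idx = rng.integers(0, n, size=n)              # pairs bootstrap resample
        Xb, yb = X[idx], target[idx]
        beta = np.linalg.lstsq(Xb, yb, rcond=None)[0]
        resid = yb - Xb @ beta
        sigma2 = resid @ resid / (n - k)
        se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xb.T @ Xb)))
        keep = np.abs(beta / se) > t_crit             # pretest variable selection
        keep[0] = True                                # always keep the intercept
        beta_sel = np.zeros(k)
        beta_sel[keep] = np.linalg.lstsq(Xb[:, keep], yb, rcond=None)[0]
        fcasts.append(float(x_new @ beta_sel))
    return float(np.mean(fcasts))
```

Without bagging, the pretest makes the forecast a discontinuous function of the data; averaging over resamples smooths that discontinuity, which is the mechanism the paper finds beneficial for the HAR model.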
  6. By: Andreas Pick
    Abstract: This paper considers empirical tests for the contagion of financial crises that address the endogeneity of contagion by using instrumental variable estimation techniques. Two complications in the application to contagion are that the regression model is potentially incoherent and that it contains a parameter that is not identified under the null of no contagion. Monte Carlo experiments suggest that their influence is small in practice, with the notable exception of similar tests, where both size and power are affected. An application to stock market data for the UK, USA, and Japan shows that ignoring the endogeneity of contagion leads to highly significant contagion coefficients. However, tests for contagion that take the endogeneity into account yield mixed evidence of financial contagion.
    Keywords: Financial crises; contagion; non-linear simultaneous equation models
    JEL: C33 G3
    Date: 2007–06
    URL: http://d.repec.org/n?u=RePEc:dnb:dnbwpp:139&r=ecm
  7. By: David Roodman
    Abstract: The “difference” and “system” generalized method of moments (GMM) estimators for dynamic panel models are growing steadily in popularity. The estimators are designed for panels with short time dimensions (T), and by default they generate instrument sets whose number grows quadratically in T. The dangers associated with having many instruments relative to observations are documented in the applied literature. The instruments can overfit endogenous variables, failing to expunge their endogenous components and biasing coefficient estimates. Meanwhile they can vitiate the Hansen J test for joint validity of those instruments, as well as the difference-in-Sargan/Hansen test for subsets of instruments. The weakness of these specification tests is a particular concern for system GMM, whose distinctive instruments are only valid under a non-trivial assumption. Judging by current practice, many researchers do not fully appreciate that popular implementations of these estimators can by default generate results that are invalid yet appear valid. The potential for type I errors (false positives) is therefore substantial, especially after amplification by publication bias. This paper explains the risks and illustrates them with reference to two early applications of the estimators to economic growth, Forbes (2000) on income inequality and Levine, Loayza, and Beck (LLB, 2000) on financial sector development. Endogenous causation proves hard to rule out in both papers. Going forward, for results from these GMM estimators to be credible, researchers must report the instrument count and aggressively test estimates and specification test results for robustness to reductions in that count.
    Keywords: dynamic panel estimation, difference GMM, system GMM, Stata, Arellano-Bond, Blundell-Bond, generalized method of moments, autocorrelation, finance and growth, inequality and growth
    JEL: C23 G0 O40
    Date: 2007–08
    URL: http://d.repec.org/n?u=RePEc:cgd:wpaper:125&r=ecm
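The quadratic growth of the default instrument set, and the effect of collapsing it, can be made concrete with a small count. This helper is hypothetical (not from the paper) and assumes the standard setup in which levels dated t-2 and earlier instrument the first-differenced equation for a single lagged dependent variable:

```python
def diff_gmm_instrument_count(T, collapse=False):
    """Instrument count for difference GMM with one lagged dependent variable.

    Full set: one column per (period, lag) pair, so the count is
    (T-1)(T-2)/2 and grows quadratically in T.
    Collapsed set: one column per lag distance, so the count is T-2
    and grows only linearly.
    """
    if collapse:
        return T - 2
    return (T - 1) * (T - 2) // 2
```

For example, at T = 10 the default set already contains 36 instruments against 8 when collapsed, which illustrates why the paper urges reporting the instrument count and testing robustness to reducing it.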
  8. By: Ito, Seiro
    Abstract: We propose bounds on the average treatment effect (ATE) based on the intention-to-treat (ITT) estimator when randomized trials include noncompliers. The bounds are given as ITT < ATE < CACE, where the compliers’ average treatment effect (CACE) can be computed from the ITT and the complier ratio. We show that these bounds can be derived from two assumptions: (1) the average treatment effect is greater for compliers than for noncompliers, i.e., CACE > NACE, where NACE is the noncompliers’ average treatment effect; and (2) NACE is nonnegative. We give an example concerning the poverty impacts of health insurance and the effects of adverse selection and moral hazard in health insurance.
    Keywords: Intention-to-treat (ITT) estimator, Compliers’ average treatment effect (CACE) estimator, ATE estimator, Bounds, Econometrics, India
    JEL: C13 C93 D82 I11 O15
    Date: 2007–05
    URL: http://d.repec.org/n?u=RePEc:jet:dpaper:dpaper106&r=ecm
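The bound arithmetic is simple: under the abstract's assumptions, CACE equals the ITT effect divided by the complier share, so for a nonnegative ITT the interval ITT <= ATE <= CACE follows directly. A hypothetical helper for illustration (not the paper's code):

```python
def ate_bounds_from_itt(itt, complier_share):
    """Bounds ITT <= ATE <= CACE, assuming CACE >= NACE >= 0 and ITT >= 0.

    CACE is the ITT effect scaled up by the complier share, since
    noncompliers' outcomes are unaffected by assignment in expectation.
    """
    if not 0.0 < complier_share <= 1.0:
        raise ValueError("complier_share must lie in (0, 1]")
    cace = itt / complier_share
    return itt, cace
```

For example, an ITT effect of 0.10 with a 40 percent complier share brackets the ATE between 0.10 and 0.25.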
  9. By: Peter Huber (WIFO); Michael Pfaffermayr (WIFO)
    Abstract: The evolution of the higher moments of the firm size distribution has so far been neglected in the empirical firm growth literature. Based on GMM estimates, this paper introduces simple Wald tests to investigate whether the firm size distribution converges in both the second and third central moment. Using a comprehensive sample of Austrian firms, the estimation results indicate a substantial reduction in both the second and third central moment for the younger age cohorts. This effect is much less pronounced for older firms. Across age cohorts one observes an increase in variance, while the third central moment tends to vanish.
    Keywords: Growth of firms, market concentration, moments of the firm size distribution, GMM estimation, Wald test
    Date: 2007–06–20
    URL: http://d.repec.org/n?u=RePEc:wfo:wpaper:y:2007:i:295&r=ecm

This nep-ecm issue is ©2007 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.