nep-ecm New Economics Papers
on Econometrics
Issue of 2012‒12‒22
eleven papers chosen by
Sune Karlsson
Orebro University

  1. Estimation and Inference for Distribution Functions and Quantile Functions in Treatment Effect Models By Stephen G. Donald; Yu-Chin Hsu
  2. An Exponential Class of Dynamic Binary Choice Panel Data Models with Fixed Effects By Al-Sadoon, Majid M.; Li, Tong; Pesaran, M. Hashem
  3. Select the Valid and Relevant Moments: A One-Step Procedure for GMM with Many Moments By Xu Cheng; Zhipeng Liao
  4. The Generalized Lognormal Distribution and the Stieltjes Moment Problem By Christian Kleiber
  5. Copula Based Factorization in Bayesian Multivariate Infinite Mixture Models By Martin Burda; Artem Prokhorov
  6. Forecasting with Factor-Augmented Regression: A Frequentist Model Averaging Approach By Xu Cheng; Bruce E. Hansen
  7. Improving the Power of Tests of Stochastic Dominance By Stephen G. Donald; Yu-Chin Hsu
  8. Estimating Revenue Under Collusion-Proof Auctions By Gaurab Aryal; Maria F. Gabrielli
  9. Finding relevant variables in sparse Bayesian factor models: Economic applications and simulation results By Kaufmann, Sylvia; Schumacher, Christian
  10. Unconditional and Conditional Quantile Treatment Effect: Identification Strategies and Interpretations By M. Fort
  11. An improved theoretical ground for the linear feedback model and a new indicator By Yoshitsugu Kitazawa

  1. By: Stephen G. Donald (Department of Economics, University of Texas at Austin); Yu-Chin Hsu (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: We propose inverse probability weighted estimators for the distribution functions of the potential outcomes of a binary treatment under the unconfoundedness assumption. We also apply the inverse mapping on the distribution functions to obtain the quantile functions. We show that the proposed estimators converge weakly to zero mean Gaussian processes. A simulation method based on the multiplier central limit theorem is proposed to approximate these limiting Gaussian processes. The estimators in the treated subpopulation are shown to share the same properties. To demonstrate the usefulness of our results, we construct Kolmogorov-Smirnov type tests for stochastic dominance relations between the distributions of potential outcomes. We examine the finite sample properties of our tests in a set of Monte Carlo simulations and use our tests in an empirical example, which shows that a job training program had a positive effect on incomes.
    Keywords: Hypothesis testing, stochastic dominance, treatment effects, propensity score
    JEL: C01 C12 C21
    Date: 2012–12
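The inverse-probability-weighting idea in this abstract can be sketched in a few lines of Python. The simulated data, the plug-in of the true propensity score, and the Hajek-style normalisation are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: covariate X, binary treatment D, outcome Y (illustrative).
n = 2000
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-x))          # true propensity score
d = rng.binomial(1, p_true)
y = x + d * 1.0 + rng.normal(size=n)       # treatment shifts outcomes up

# In practice p(x) would be pre-estimated (e.g. by logit); we use the true
# score here for clarity.
def ipw_cdf_treated(y_grid, y, d, p):
    """IPW estimate of F_{Y(1)}(t) = E[D 1{Y<=t} / p(X)], Hajek-normalised."""
    w = d / p
    return np.array([np.mean(w * (y <= t)) for t in y_grid]) / np.mean(w)

grid = np.linspace(-4, 6, 200)
F1 = ipw_cdf_treated(grid, y, d, p_true)

# Quantile function by inverting the estimated CDF on the grid.
def ipw_quantile(tau, grid, F):
    return grid[np.searchsorted(F, tau)]

median_y1 = ipw_quantile(0.5, grid, F1)
```

Inverting the estimated CDF on a grid, as above, is the simplest way to recover the quantile functions; the paper's multiplier-bootstrap inference for the limiting Gaussian processes is omitted.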
  2. By: Al-Sadoon, Majid M. (Universitat Pompeu Fabra); Li, Tong (Vanderbilt University); Pesaran, M. Hashem (University of Cambridge)
    Abstract: This paper develops a model for dynamic binary choice panel data that allows for unobserved heterogeneity to be arbitrarily correlated with covariates. The model is of the exponential type. We derive moment conditions that enable us to eliminate the unobserved heterogeneity term and at the same time to identify the parameters of the model. We then propose GMM estimators that are consistent and asymptotically normally distributed at the root-N rate. We also study the conditional likelihood approach, which can only identify the effect of state dependence in our case. Monte Carlo experiments demonstrate the finite sample performance of our GMM estimators.
    Keywords: dynamic discrete choice, fixed effects, panel data, initial values, GMM, CMLE
    JEL: C23 C25
    Date: 2012–11
  3. By: Xu Cheng (Department of Economics, University of Pennsylvania); Zhipeng Liao (Department of Economics, University of California Los Angeles)
    Abstract: This paper considers the selection of valid and relevant moments for the generalized method of moments (GMM) estimation. For applications with many candidate moments, our asymptotic analysis accommodates a diverging number of moments as the sample size increases. The proposed procedure achieves three objectives in one step: (i) the valid and relevant moments are selected simultaneously rather than sequentially; (ii) all desired moments are selected together instead of in a stepwise manner; (iii) the parameter of interest is automatically estimated with all selected moments as opposed to a post-selection estimation. The new moment selection method is achieved via an information-based adaptive GMM shrinkage estimation, where an appropriate penalty is attached to the standard GMM criterion to link moment selection to shrinkage estimation. The penalty is designed to signal both moment validity and relevance for consistent moment selection and efficient estimation. The asymptotic analysis allows for non-smooth sample moments and weakly dependent observations, making it generally applicable. For practical implementation, this one-step procedure is computationally attractive.
    Keywords: Adaptive Penalty, GMM, Many Moments, Moment Selection, Oracle Properties, Shrinkage Estimation
    JEL: C12 C13 C36
    Date: 2012–11–26
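A toy version of moment selection via shrinkage illustrates the mechanism: a slack parameter is attached to a suspect moment and penalised, so the slack is driven to zero exactly when the moment is valid. The data-generating process, identity weighting, and fixed penalty below are illustrative simplifications, not the paper's adaptive information-based penalty:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n = 5000
u = rng.normal(size=n)                 # structural error
z1 = rng.normal(size=n)                # valid instrument
z2 = u + rng.normal(size=n)            # invalid: correlated with the error
x = z1 + 0.5 * z2 + rng.normal(size=n)
y = 1.0 * x + u                        # true theta = 1

def criterion(params, lam):
    theta, delta = params
    e = y - theta * x
    # The slack delta re-centres the suspect moment; it stays at zero
    # only if that moment is actually valid.
    g = np.array([np.mean(z1 * e), np.mean(z2 * e) - delta])
    return n * g @ g + lam * abs(delta)

# Nelder-Mead copes with the non-smooth |delta| penalty.
res = minimize(criterion, x0=[0.0, 0.0], args=(np.sqrt(n) * 0.1,),
               method="Nelder-Mead")
theta_hat, delta_hat = res.x           # delta_hat far from 0 flags z2 as invalid
```

Here the estimated slack stays away from zero, correctly flagging the contaminated instrument, while the valid moment pins down the parameter of interest in the same minimisation.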
  4. By: Christian Kleiber (University of Basel)
    Abstract: This paper studies a Stieltjes-type moment problem defined by the generalized lognormal distribution, a heavy-tailed distribution with applications in economics, finance and related fields. It arises as the distribution of the exponential of a random variable following a generalized error distribution, and hence figures prominently in the EGARCH model of asset price volatility. Compared to the classical lognormal distribution it has an additional shape parameter. It emerges that moment (in)determinacy depends on the value of this parameter: for some values, the distribution does not have finite moments of all orders, hence the moment problem is not of interest in these cases. For other values, the distribution has moments of all orders, yet it is moment-indeterminate. Finally, a limiting case is supported on a bounded interval, and hence determined by its moments. For those generalized lognormal distributions that are moment-indeterminate Stieltjes classes of moment-equivalent distributions are presented.
    Keywords: Generalized error distribution, generalized lognormal distribution, lognormal distribution, moment problem, size distribution, Stieltjes class, volatility model
    JEL: C46 C02
    Date: 2012
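The moment-existence dichotomy described in the abstract is easy to check numerically. Below, Z follows a unit-scale generalized error distribution with density proportional to exp(-|z|^beta), and X = exp(Z) is the generalized lognormal; the shape values and the closed-form check are illustrative:

```python
import numpy as np
from scipy import integrate
from scipy.special import gamma

def ged_moment_exp(k, beta):
    """E[exp(kZ)] = E[X^k] for Z with density c * exp(-|z|^beta), unit scale."""
    c = beta / (2.0 * gamma(1.0 / beta))
    val, _ = integrate.quad(lambda z: c * np.exp(k * z - abs(z) ** beta),
                            -np.inf, np.inf, limit=200)
    return val

# beta = 2: Z is Gaussian (lognormal case), every moment of X is finite.
# Analytically E[X^2] = e for this unit-scale parameterisation.
m2 = ged_moment_exp(2, 2.0)

# beta < 1: the integrand exp(k*z - z**beta) explodes as z grows, so E[X^k]
# is infinite and the moment problem is moot, as the abstract notes.
z = 50.0
blows_up = 1.0 * z - z ** 0.8 > 0      # linear term dominates the tail
```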
  5. By: Martin Burda (University of Toronto); Artem Prokhorov (Concordia University and CIREQ)
    Date: 2012–12
  6. By: Xu Cheng (Department of Economics, University of Pennsylvania); Bruce E. Hansen (Department of Economics, University of Wisconsin-Madison)
    Abstract: This paper considers forecast combination with factor-augmented regression. In this framework, a large number of forecasting models are available, varying by the choice of factors and the number of lags. We investigate forecast combination using weights that minimize the Mallows and the leave-h-out cross validation criteria. The unobserved factor regressors are estimated by principal components of a large panel with N predictors over T periods. With these generated regressors, we show that the Mallows and leave-h-out cross validation criteria are approximately unbiased estimators of the one-step-ahead and multi-step-ahead mean squared forecast errors, respectively, provided that N, T → ∞. In contrast to well-known results in the literature, the generated-regressor issue can be ignored for forecast combination, without restrictions on the relation between N and T. Simulations show that the Mallows model averaging and leave-h-out cross-validation averaging methods yield lower mean squared forecast errors than alternative model selection and averaging methods such as AIC, BIC, cross validation, and Bayesian model averaging. We apply the proposed methods to the U.S. macroeconomic data set in Stock and Watson (2012) and find that they compare favorably to many popular shrinkage-type forecasting methods.
    Keywords: Cross-validation, factor models, forecast combination, generated regressors, Mallows
    JEL: C52 C53
    Date: 2012–10–01
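The Mallows logic can be sketched for a single lag structure: estimate factors by principal components, fit nested regressions, and penalise each model by 2·m·σ². The simulated panel, and the use of model selection rather than the paper's averaging over a weight simplex, are simplifications:

```python
import numpy as np

rng = np.random.default_rng(2)
T, N, r = 200, 50, 3                    # periods, predictors, true factors

# Factor panel: X = F L' + noise; the target depends on the r true factors.
F = rng.normal(size=(T, r))
L = rng.normal(size=(N, r))
X = F @ L.T + rng.normal(size=(T, N))
y = F @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=T)

# Estimated factors = principal components of the panel.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
Fhat = np.sqrt(T) * U                   # standard PC normalisation

max_m = 8
crit = []
for m in range(1, max_m + 1):
    Z = Fhat[:, :m]
    beta = np.linalg.lstsq(Z, y, rcond=None)[0]
    ssr = np.sum((y - Z @ beta) ** 2)
    if m == max_m:                      # error variance from the largest model
        sigma2 = ssr / (T - max_m)
    crit.append((m, ssr))

# Mallows criterion (one-step-ahead case): SSR_m + 2 * m * sigma^2.
mallows = [ssr + 2 * m * sigma2 for m, ssr in crit]
m_star = int(np.argmin(mallows)) + 1    # selected number of factors
```

Replacing the argmin with weights on the probability simplex that minimise the same criterion gives the averaging version studied in the paper.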
  7. By: Stephen G. Donald (Department of Economics, University of Texas at Austin); Yu-Chin Hsu (Institute of Economics, Academia Sinica, Taipei, Taiwan)
    Abstract: We extend Hansen’s (2005) recentering method to a continuum of inequality constraints to construct new Kolmogorov-Smirnov tests for stochastic dominance of any pre-specified order when we have independent samples from two populations. We show that our tests can control the size asymptotically, are consistent against fixed alternatives and are unbiased against some N^{-1/2} local alternatives. It is shown that by avoiding the least favorable configuration, our tests are less conservative and more powerful than Barrett and Donald’s (2003) and that under some local alternatives, our tests are more powerful than the subsampling tests proposed by Linton, Maasoumi and Whang (2005). We show the uniformity of our test, and extend our tests to the higher order stochastic dominance cases and to the cases where there is dependence between two populations. Monte Carlo simulations support our theoretical findings. We apply our method to test the stochastic dominance relationships between the Canadian income distributions for 1978 and 1986 and find that our tests reject some cases which Barrett and Donald’s (2003) tests do not.
    Keywords: Stochastic dominance, test consistency, simulation, bootstrap, uniform inference
    JEL: G12 E44
    Date: 2012–12
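A minimal version of a recentred bootstrap test conveys where the power gain comes from: only grid points at which the dominance constraint is close to binding feed the bootstrap critical value, instead of the least favorable configuration where it binds everywhere. The resampling scheme and the slackness cut-off c_n below are illustrative choices, not the authors' exact construction:

```python
import numpy as np

rng = np.random.default_rng(3)

def sd1_test(x, y, B=500, rng=rng):
    """One-sided KS test of H0: F_X(t) <= F_Y(t) for all t
    (first-order dominance), with an illustrative recentred bootstrap."""
    n, m = len(x), len(y)
    grid = np.sort(np.concatenate([x, y]))
    Fx = np.searchsorted(np.sort(x), grid, side="right") / n
    Fy = np.searchsorted(np.sort(y), grid, side="right") / m
    scale = np.sqrt(n * m / (n + m))
    stat = scale * np.max(Fx - Fy)

    # Recentering: only grid points where the constraint is nearly binding
    # (Fx - Fy > -c_n) contribute to the bootstrap distribution.
    c_n = np.sqrt(2.0 * np.log(np.log(n)) / n)
    binding = (Fx - Fy) > -c_n

    boot = np.empty(B)
    for b in range(B):
        xb, yb = rng.choice(x, n), rng.choice(y, m)
        Fxb = np.searchsorted(np.sort(xb), grid, side="right") / n
        Fyb = np.searchsorted(np.sort(yb), grid, side="right") / m
        diff = scale * ((Fxb - Fyb) - (Fx - Fy))   # centred bootstrap process
        boot[b] = np.max(np.where(binding, diff, -np.inf))
    return stat, np.mean(boot >= stat)

# Under the null (identical distributions) the test should rarely reject.
stat, pval = sd1_test(rng.normal(size=300), rng.normal(size=300))
```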
  8. By: Gaurab Aryal; Maria F. Gabrielli
    Abstract: We propose a method to nonparametrically estimate the revenue under an auction that is efficient and resilient to collusion [Chen and Micali, 2012]. Efficiency is achieved at the cost of lower revenue, and we propose a method to quantify this efficiency-revenue trade-off, i.e. the extra cost at which efficient allocation can be guaranteed even when bidders collude. We implement a local polynomial estimation method on a sample of California highway procurement data and find that to achieve efficiency the cost of procurement must increase by at least 10.8% and can go up to 48.8%, depending on the size of the bidding ring.
    JEL: C14 C4 C7 D44 L4
    Date: 2012–12
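The local polynomial machinery behind the estimation step can be sketched as a kernel-weighted least squares fit. The Gaussian kernel, bandwidth, and target function here are illustrative, not the authors' choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def local_linear(x0, x, y, h):
    """Local linear (degree-1 local polynomial) estimate of E[y | x = x0]
    with a Gaussian kernel and bandwidth h."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta[0]                       # intercept = fitted value at x0

x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=400)
fit = local_linear(0.25, x, y, h=0.05)   # true regression value is 1 here
```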
  9. By: Kaufmann, Sylvia; Schumacher, Christian
    Abstract: This paper considers factor estimation from heterogeneous data, where some of the variables are noisy and only weakly informative for the factors. To identify the irrelevant variables, we search for zero rows in the loadings matrix of the factor model. To sharply separate these irrelevant variables from the informative ones, we choose a Bayesian framework for factor estimation with sparse priors on the loadings matrix. The choice of a sparse prior is an extension to the existing macroeconomic literature, which predominantly uses normal priors on the loadings. Simulations show that the sparse factor model detects various degrees of sparsity in the data well and identifies the irrelevant variables. Empirical applications to a large multi-country GDP dataset and disaggregated CPI inflation data for the US reveal that sparsity matters a lot, as the majority of the variables in both datasets are irrelevant for factor estimation.
    Keywords: factor models, variable selection, sparse priors
    JEL: C38 C11
    Date: 2012
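The zero-row search with sparse priors can be illustrated with a one-factor spike-and-slab Gibbs sampler: each loading is either drawn from a normal slab or set exactly to zero, and the posterior inclusion frequency flags the relevant variables. Known unit idiosyncratic variance and the prior settings below are simplifying assumptions, not the paper's specification:

```python
import numpy as np

rng = np.random.default_rng(6)
T, N = 300, 12
f_true = rng.normal(size=T)
lam_true = np.array([1.0] * 6 + [0.0] * 6)   # last 6 variables irrelevant
X = np.outer(f_true, lam_true) + rng.normal(size=(T, N))

# Spike-and-slab prior: loading_i ~ pi0 * N(0, v) + (1 - pi0) * delta_0,
# idiosyncratic variance fixed at 1 for simplicity.
pi0, v = 0.5, 1.0
lam = np.zeros(N)
incl_count = np.zeros(N)
for it in range(500):
    # Draw factor: f_t | lam ~ N( (X lam)_t / (1 + lam'lam), 1/(1 + lam'lam) ).
    prec = 1.0 + lam @ lam
    fdraw = (X @ lam) / prec + rng.normal(size=T) / np.sqrt(prec)
    ff = fdraw @ fdraw
    for i in range(N):
        s = fdraw @ X[:, i]
        # Bayes factor of slab vs spike for loading i (conjugate normal).
        post_var = 1.0 / (ff + 1.0 / v)
        log_bf = 0.5 * np.log(post_var / v) + 0.5 * s * s * post_var
        p_incl = pi0 / (pi0 + (1 - pi0) * np.exp(-log_bf))
        if rng.uniform() < p_incl:
            lam[i] = s * post_var + rng.normal() * np.sqrt(post_var)
            if it >= 200:                # count inclusions after burn-in
                incl_count[i] += 1
        else:
            lam[i] = 0.0                 # exact zero: the "spike"
post_incl = incl_count / 300.0           # posterior inclusion frequencies
```

Variables with posterior inclusion frequency near zero correspond to the zero rows of the loadings matrix that the paper uses to flag irrelevant series.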
  10. By: M. Fort
    Abstract: This paper reviews strategies that allow one to identify the effects of policy interventions on the unconditional or conditional distribution of the outcome of interest. This distinction is irrelevant when one focuses on average treatment effects, since identifying assumptions typically do not affect the parameter's interpretation. Conversely, finding the appropriate answer to a research question on the effects over the distribution requires particular attention in the choice of the identification strategy. Indeed, quantiles of the conditional and unconditional distribution of a random variable carry a different meaning, even if identification of both these sets of parameters may require conditioning on observed covariates.
    JEL: C18
    Date: 2012–12
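The distinction the paper stresses is easy to see in a simulated example where the treatment effect varies with a covariate: conditional quantile treatment effects recover the cell-specific shifts, while the unconditional QTE mixes them into a different number. The data-generating process is illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20000
x = rng.binomial(1, 0.5, n)              # a binary covariate
# Treatment shifts outcomes by 1 only in the x = 1 group.
y0 = x + rng.normal(size=n)              # potential outcome without treatment
y1 = y0 + 1.0 * x                        # potential outcome with treatment

tau = 0.5
# Unconditional QTE: quantile of Y(1) minus quantile of Y(0).
uqte = np.quantile(y1, tau) - np.quantile(y0, tau)

# Conditional QTEs: the same contrast within each covariate cell.
cqte = {v: np.quantile(y1[x == v], tau) - np.quantile(y0[x == v], tau)
        for v in (0, 1)}
```

Here the conditional median effects are roughly 0 and 1 in the two cells, while the unconditional median effect lies strictly between them, so the two sets of parameters answer genuinely different questions.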
  11. By: Yoshitsugu Kitazawa (Faculty of Economics, Kyushu Sangyo University)
    Abstract: This paper describes a lucid theoretical ground for the linear feedback model proposed by Blundell et al. (2002) and further proposes an indicator of the initial knowledge storage in the framework of the linear feedback model. The values of the indicator are calculated from the estimation results reported by Blundell et al. (2002). Further, GMM estimations of the linear feedback model are conducted using stationarity moment conditions customized to the needs of count panel data, in order to calculate the values of the indicator.
    Keywords: linear feedback model, knowledge production, initial knowledge storage, patents-R&D relationship, GMM
    JEL: C23 C25 O30
    Date: 2012–12

This nep-ecm issue is ©2012 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at . For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.