nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒12‒20
twenty papers chosen by
Sune Karlsson
Örebro University

  1. Efficient information theoretic inference for conditional moment restrictions By Richard Smith
  2. On the Estimation and Inference of a Panel Cointegration Model with Cross-Sectional Dependence By Jushan Bai; Chihwa Kao
  3. GMM with many weak moment conditions By Whitney Newey; Frank Windmeijer
  4. A Note on Decompositions in Fixed Effects Models in the Presence of Time-Invariant Characteristics By Axel Heitmüller
  5. Local GEL methods for conditional moment restrictions By Richard Smith
  6. Simulation-Based Two-Step Estimation with Endogenous Regressors By Kamhon Kan; Chihwa Kao
  7. Business failure prediction: simple-intuitive models versus statistical models By Ooghe, H.; Spaenjers, C.; Vandermoere, P.
  8. Weak instruments and empirical likelihood: a discussion of the papers by DWK Andrews and JH Stock and Y Kitamura By Richard Smith
  9. Spatial Correlation Robust Inference with Errors in Location or Distance By Timothy Conley; Francesca Molinari
  10. The Myth of Long-Horizon Predictability By Jacob Boudoukh; Matthew Richardson; Robert Whitelaw
  11. Estimating a Semi-Parametric Duration Model without Specifying Heterogeneity By Erich Battistin; Jerry Hausman; Tiemen M. Woutersen
  12. Testing Models with Multiple Equilibria by Quantile Methods By Echenique, Federico; Komunjer, Ivana
  13. Testing the Markov property with ultra-high frequency financial data By Matos, Joao Amaro de; Fernandes, Marcelo
  14. The Derivation of the NPV Probability Distribution of Risky Investments with Autoregressive Cash Flows By Jean-Paul Paquin; Annick Lambert; Alain Charbonneau
  15. Identification of a competing risks model with unknown transformations of latent failure times By Simon Lee
  16. Experimental Designs for Environmental Valuation with Choice-Experiments: A Monte-Carlo Investigation By Silvia Ferrini; Riccardo Scarpa
  17. Reduced-Rank Identification of Structural Shocks in VARs By Yuriy Gorodnichenko
  18. Estimation of Spatial Weights Matrix in a Spatial Error Model, with an Application to Diffusion in Housing Demand By Arnab Bhattacharjee; Chris Jensen-Butler
  19. Best nonparametric bounds on demand responses By Richard Blundell; Martin Browning; Ian Crawford
  20. Probabilistically Sophisticated Multiple Priors. By Simon Grant; Atsushi Kajii

  1. By: Richard Smith (Institute for Fiscal Studies and University of Cambridge)
    Abstract: The generalized method of moments estimator may be substantially biased in finite samples, especially so when there are large numbers of unconditional moment conditions. This paper develops a class of first-order equivalent semi-parametric efficient estimators and tests for conditional moment restriction models, based on a local or kernel-weighted version of the Cressie-Read power divergence family of discrepancies. This approach is similar in spirit to the empirical likelihood methods of Kitamura, Tripathi and Ahn (2004) and Tripathi and Kitamura (2003). These efficient local methods avoid explicit estimation of the conditional Jacobian and variance matrices of the conditional moment restrictions and provide empirical conditional probabilities for the observations.
    Keywords: Conditional Moment Restrictions, Local Cressie-Read Minimum Discrepancy, GMM, Semi-Parametric Efficiency
    JEL: C12 C13 C14 C20 C30
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:14/05&r=ecm
  2. By: Jushan Bai (Department of Economics, New York University, New York, NY 10003, and Department of Economics, Tsinghua University, Beijing 10084, China); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse, NY 13244-1020)
    Abstract: Most of the existing literature on panel data cointegration assumes cross-sectional independence, an assumption that is difficult to satisfy. This paper studies panel cointegration under cross-sectional dependence, which is characterized by a factor structure. We derive the limiting distribution of a fully modified estimator for the panel cointegrating coefficients. We also propose a continuous-updated fully modified (CUP-FM) estimator. Monte Carlo results show that the CUP-FM estimator has better small sample properties than the two-step FM (2S-FM) and OLS estimators.
    Keywords: panel data cointegration, cross-sectional independence, cross-sectional dependence, continuous updated fully modified (CUP-FM) estimator, Monte Carlo results, two-step FM (2S-FM) estimator, OLS estimator
    JEL: C13 C15 C23
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:75&r=ecm
  3. By: Whitney Newey (Institute for Fiscal Studies and Massachusetts Institute of Technology); Frank Windmeijer (Institute for Fiscal Studies and University of Bristol)
    Abstract: Using many moment conditions can improve efficiency but makes the usual GMM inferences inaccurate: two-step GMM is biased, and generalized empirical likelihood (GEL) has smaller bias but standard errors that are too small. In this paper we consider alternative asymptotics, based on many weak moment conditions, that address this problem. These asymptotics lead to improved approximations in overidentified models where the variance of the derivative of the moment conditions is large relative to the squared expected value of the moment conditions and identification is not too weak. We obtain an asymptotic variance for GEL that is larger than the usual one and give a "sandwich" estimator of it. In Monte Carlo examples we find that this variance estimator leads to a better Gaussian approximation to t-ratios in a range of cases. We also show that Kleibergen's (2005) K statistic is valid under these asymptotics. Finally, we compare these results with a jackknife GMM estimator, finding that GEL is asymptotically more efficient under many weak moments.
    Keywords: GMM, Continuous Updating, Many Moments, Variance Adjustment
    JEL: C12 C13 C23
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:18/05&r=ecm
  4. By: Axel Heitmüller (London Business School and IZA Bonn)
    Abstract: Though the Oaxaca (1973) decomposition is theoretically appealing and very popular amongst labour economists, interpreting its unexplained part as discrimination rather than an omitted-variable problem in cross-section data has often been criticised. This note shows that the problem also extends to panel data and, moreover, that in a fixed effects model including time-invariant regressors, omitted variables are an unavoidable consequence. Monte Carlo simulation is used to show the extent of the bias. Special cases and practical implications are discussed.
    Keywords: decomposition, fixed effects, Monte Carlo study
    JEL: C1 C33
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:iza:izadps:dp1886&r=ecm
  5. By: Richard Smith (Institute for Fiscal Studies and University of Cambridge)
    Abstract: The principal purpose of this paper is to adapt to the conditional moment context the GEL unconditional moment methods described in Smith (1997, 2001) and Newey and Smith (2004). In particular, we develop GEL estimators which achieve the semiparametric efficiency lower bound. The requisite GEL criteria are constructed by local smoothing and parallel the local semiparametric efficient EL method formulated by Kitamura, Tripathi and Ahn (2004) for conditional moment restrictions. A particular advantage of these efficient local methods is that they avoid the need to provide explicit estimators for the Jacobian and conditional variance matrices. The class of local GEL estimators admits a number of first-order equivalent alternatives, such as local EL, local ET and local CUE, as in the unconditional moment restrictions case. The paper also provides a local GEL criterion function test statistic for parametric restrictions.
    Keywords: Conditional Moment Restrictions, Local Generalized Empirical Likelihood, GMM, Semi-Parametric Efficiency
    JEL: C12 C13 C14 C20 C30
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:15/05&r=ecm
  6. By: Kamhon Kan (Institute of Economics, Academia Sinica, Taipei, Taiwan); Chihwa Kao (Center for Policy Research, Maxwell School, Syracuse University, Syracuse NY 13244-1020)
    Abstract: This paper considers models with latent/discrete endogenous regressors and presents a simulation-based two-step (STS) estimator. The endogeneity is corrected by adopting a simulation-based control function approach. The first step consists of simulating the residuals of the reduced-form equation for the endogenous regressors. The second step is a regression model (linear, latent or discrete) with the simulated residual as an additional regressor. We develop the asymptotic theory for the STS estimator and derive its rate of convergence.
    Keywords: simulation-based two-step (STS) estimator
    JEL: C13 C15
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:max:cprwps:76&r=ecm
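The two-step control-function idea behind the STS estimator can be sketched with a minimal numpy example. This is an illustrative linear special case with estimated (rather than simulated) first-step residuals; all data and parameter values are hypothetical and this is not the paper's STS estimator itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical data: instrument z, endogenous regressor x, outcome y.
z = rng.normal(size=n)
u = rng.normal(size=n)                 # unobservable driving the endogeneity
x = 0.8 * z + u + rng.normal(size=n)   # x is correlated with u
y = 1.0 + 2.0 * x + u + rng.normal(size=n)

# Step 1: reduced form for x; keep the residual as a control function.
Z = np.column_stack([np.ones(n), z])
v_hat = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]

# Step 2: outcome regression augmented with the step-1 residual.
X = np.column_stack([np.ones(n), x, v_hat])
beta = np.linalg.lstsq(X, y, rcond=None)[0]
print(beta[1])  # close to the true slope of 2.0

# Naive OLS that ignores endogeneity is biased upward in this design.
X0 = np.column_stack([np.ones(n), x])
print(np.linalg.lstsq(X0, y, rcond=None)[0][1])
```

Including the first-step residual absorbs the part of the unobservable that is correlated with the regressor, which is why the augmented regression recovers the structural slope.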
  7. By: Ooghe, H.; Spaenjers, C.; Vandermoere, P.
    Abstract: We give an overview of the shortcomings of the most frequently used statistical techniques in failure prediction modelling. The statistical procedures that underpin the selection of variables and the determination of coefficients often lead to ‘overfitting’. We also see that the ‘expected signs’ of variables are sometimes neglected and that an underlying theoretical framework mostly does not exist. Based on the current knowledge of failing firms, we construct a new type of failure prediction models, namely ‘simple-intuitive models’. In these models, eight variables are first logit-transformed and then equally weighted. These models are tested on two broad validation samples (1 year prior to failure and 3 years prior to failure) of Belgian companies. The performance results of the best simple-intuitive model are comparable to those of less transparent and more complex statistical models.
    Date: 2005–12–15
    URL: http://d.repec.org/n?u=RePEc:vlg:vlgwps:2005-22&r=ecm
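The "logit-transform then equally weight" construction described above can be sketched as follows. The rescaling of the ratios to the unit interval and the example inputs are hypothetical; the paper's eight specific variables are not reproduced here:

```python
import numpy as np

def logit(p, eps=1e-6):
    """Logistic transform mapping a score in [0, 1] to the real line."""
    p = np.clip(p, eps, 1 - eps)
    return np.log(p / (1 - p))

def simple_intuitive_score(ratios):
    """Equal-weight average of logit-transformed ratio scores.

    `ratios` is an n_firms x n_variables array of values rescaled to
    [0, 1] so that higher means healthier; the choice of variables and
    the rescaling are hypothetical, not taken from the paper.
    """
    return logit(np.asarray(ratios, dtype=float)).mean(axis=1)

# Hypothetical example: three firms, eight rescaled ratios each.
rng = np.random.default_rng(1)
scores = simple_intuitive_score(rng.uniform(0.1, 0.9, size=(3, 8)))
print(scores)  # higher score = lower predicted failure risk
```

Because every variable carries the same weight, the model avoids the coefficient-fitting step that the abstract identifies as a source of overfitting in statistical failure-prediction models.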
  8. By: Richard Smith (Institute for Fiscal Studies and University of Cambridge)
    Abstract: This discussion begins with a brief review of the contributions of Andrews and Stock (henceforth AS) and Kitamura (K). Because the breadth of material covered by AS and K is so vast, we concentrate on only a few topics. Generalized empirical likelihood (GEL) provides the focus for the discussion. By defining an appropriate set of nonlinear moment conditions, GEL estimation yields objects which mirror, in an asymptotic sense, those which form the basis of the exact theory in AS, allowing the definition of asymptotically pivotal test statistics appropriate for weakly identified models, the acceptance regions of which may then be inverted to provide asymptotically valid confidence interval estimators for the parameters of interest. The general minimum distance approach of Corcoran (1998), which parallels the information-theoretic development of EL in K, is briefly reviewed. A new class of estimators mirroring Schennach (2004) is suggested which shares the asymptotic bias properties of EL and possesses a well-defined limit distribution under misspecification.
    Keywords: Empirical Likelihood, Generalized Empirical Likelihood, Weak Identification, Minimum Distance, Asymptotic Bias, Higher Order Efficiency, Misspecification
    JEL: C13 C30
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:13/05&r=ecm
  9. By: Timothy Conley (Institute for Fiscal Studies and University of Chicago); Francesca Molinari
    Abstract: This paper presents results from a Monte Carlo study concerning inference with spatially dependent data. It investigates the impact of location/distance measurement errors upon the accuracy of parametric and nonparametric estimators of asymptotic variances.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:10/05&r=ecm
  10. By: Jacob Boudoukh; Matthew Richardson; Robert Whitelaw
    Abstract: The prevailing view in finance is that the evidence for long-horizon stock return predictability is significantly stronger than that for short horizons. We show that for persistent regressors, a characteristic of most of the predictive variables used in the literature, the estimators are almost perfectly correlated across horizons under the null hypothesis of no predictability. For example, for the persistence levels of dividend yields, the analytical correlation is 99% between the 1- and 2-year horizon estimators and 94% between the 1- and 5-year horizons, due to the combined effects of overlapping returns and the persistence of the predictive variable. Common sampling error across equations leads to ordinary least squares coefficient estimates and R2s that are roughly proportional to the horizon under the null hypothesis. This is the precise pattern found in the data. The asymptotic theory is corroborated, and the analysis extended by extensive simulation evidence. We perform joint tests across horizons for a variety of explanatory variables, and provide an alternative view of the existing evidence.
    JEL: G12 G10 C32
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:11841&r=ecm
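The mechanism described in the abstract is easy to reproduce in a small simulation: under the null of no predictability, regressing overlapping multi-period returns on a highly persistent predictor yields average R-squareds that grow roughly in proportion to the horizon. All parameter values below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_sims = 600, 300
phi = 0.98  # persistence of the predictor (e.g. a dividend-yield proxy)

def r2(y, x):
    """R-squared from an OLS regression of y on a constant and x."""
    X = np.column_stack([np.ones(len(x)), x])
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    return 1 - resid.var() / y.var()

avg_r2 = {}
for k in (1, 3, 5):
    vals = []
    for _ in range(n_sims):
        # Null hypothesis: returns are iid noise, unrelated to the predictor.
        ret = rng.normal(size=T)
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = phi * x[t - 1] + rng.normal()
        # Overlapping k-period returns regressed on the lagged predictor.
        y_k = np.convolve(ret, np.ones(k), mode="valid")[1:]
        vals.append(r2(y_k, x[: len(y_k)]))
    avg_r2[k] = float(np.mean(vals))
print(avg_r2)  # average R^2 rises with the horizon k despite no predictability
```

The rising R-squared pattern comes entirely from overlapping observations combined with the persistence of the regressor, which is the paper's point about the long-horizon evidence.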
  11. By: Erich Battistin (Institute for Fiscal Studies); Jerry Hausman (Institute for Fiscal Studies and MIT); Tiemen M. Woutersen (Institute for Fiscal Studies and Johns Hopkins University)
    Abstract: This paper presents a new estimator for the mixed proportional hazard model that allows for a nonparametric baseline hazard and time-varying regressors. In particular, this paper allows for discrete measurement of the durations as happens often in practice.
    Date: 2005–08
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:11/05&r=ecm
  12. By: Echenique, Federico; Komunjer, Ivana
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:clt:sswopa:1244&r=ecm
  13. By: Matos, Joao Amaro de; Fernandes, Marcelo
    Abstract: This paper develops a framework to nonparametrically test whether discrete-valued, irregularly spaced financial transactions data follow a Markov process. For that purpose, we consider a specific optional sampling scheme in which a continuous-time Markov process is observed only when it crosses some discrete level. This framework is convenient because it accommodates not only the irregular spacing of transactions data but also price discreteness. Under such an observation rule, the current price duration is independent of previous price durations given the current price realization. A simple nonparametric test then follows by examining whether this conditional independence property holds. Finally, we investigate whether bid-ask spreads follow Markov processes using transactions data from the New York Stock Exchange. The motivation lies in the fact that asymmetric information models of market microstructure predict that the Markov property does not hold for the bid-ask spread. The results are mixed in the sense that the Markov assumption is rejected for three of the five stocks we analyzed.
    Keywords: Bid-ask spread, nonparametric testing, price durations, Markov property, ultra-high frequency data
    JEL: C14 C52 G10 G19
    Date: 2004
    URL: http://d.repec.org/n?u=RePEc:unl:unlfep:wp462&r=ecm
  14. By: Jean-Paul Paquin (Département des sciences administratives, Université du Québec (Outaouais)); Annick Lambert (Département des sciences administratives, Université du Québec (Outaouais) et LRSP); Alain Charbonneau (Département d'informatique et d'ingénierie, Université du Québec (Outaouais))
    Abstract: This paper deals with the evaluation of risky capital investment projects when cash flows are serially dependent and follow either a first-order or a second-order stationary autoregressive stochastic process. The authors demonstrate that the NPV probability distribution does not conform strictly to the asymptotic Normal distribution implied by the CLT; the only exception occurs when the discount rate is set to zero. Under that condition, it is also demonstrated that the CLT's limit property is not hampered when cash flows are serially dependent and obey a first-order autoregressive process. However, as soon as a positive discount rate is introduced into the NPV equation, the CLT no longer applies in a strict mathematical sense. In fact, the higher the investment project's discount rate, the less applicable the CLT is to the NPV probability distribution. This result obtains even under the most favourable conditions of independent and identically distributed random cash flows. So, for practical purposes, and given that cash flows are generally serially correlated, what maximum discount rate can be statistically supported by the CLT? This question was explored through computer simulations followed by statistical testing. The analysis shows that, in practice and under realistic economic conditions, project managers are justified in invoking the CLT when estimating the project NPV probability distribution and assessing its financial risk.
    Keywords: Financial Engineering.
    JEL: G12 G13 G33
    Date: 2005–12–16
    URL: http://d.repec.org/n?u=RePEc:pqs:wpaper:0342005&r=ecm
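A minimal Monte Carlo sketch of the object under study, the NPV distribution with first-order autoregressive cash flows, can be written as follows. All parameter values are illustrative assumptions, not figures from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_npv(r, phi=0.5, horizon=40, n_draws=20000, mu=100.0, sigma=20.0):
    """Monte Carlo NPV draws for AR(1) cash flows
    C_t = mu + phi * (C_{t-1} - mu) + e_t, discounted at rate r.

    All parameter values are illustrative, not taken from the paper.
    """
    disc = (1 + r) ** -np.arange(1, horizon + 1)
    c = np.full(n_draws, mu)
    npv = np.zeros(n_draws)
    for t in range(horizon):
        c = mu + phi * (c - mu) + rng.normal(0.0, sigma, n_draws)
        npv += disc[t] * c
    return npv

# Discounting weights early cash flows more heavily, so the NPV is an
# unequally weighted sum of dependent terms; the CLT's equal-weighting
# conditions hold exactly only when r = 0, which is the paper's point.
for r in (0.0, 0.10, 0.25):
    npv = simulate_npv(r)
    print(r, round(npv.mean(), 1), round(npv.std(), 1))
```

At r = 0 the mean NPV is simply horizon * mu; raising r shrinks both the mean and the dispersion because distant cash flows receive ever smaller weights.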
  15. By: Simon Lee (Institute for Fiscal Studies and University College London)
    Abstract: This paper is concerned with identification of a competing risks model with unknown transformations of latent failure times. The model in this paper includes, as special cases, competing risks versions of proportional hazards, mixed proportional hazards, and accelerated failure time models. It is shown that covariate effects on latent failure times, cause-specific link functions, and the joint survivor function of the disturbance terms can be identified without modelling the dependence between latent failure times parametrically or imposing an exclusion restriction among covariates. As a result, the paper provides an identification result on the joint survivor function of the latent failure times conditional on covariates.
    Keywords: Competing risks model; Identification; Transformation model
    Date: 2005–11
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:17/05&r=ecm
  16. By: Silvia Ferrini (University of Siena); Riccardo Scarpa (University of Waikato)
    Abstract: We review the practice of experimental design in the environmental economics literature concerned with choice experiments. We then contrast this with advances in the field of experimental design and present a comparison of statistical efficiency across four different experimental designs evaluated by Monte Carlo experiments. Two situations are envisaged: first, correct a priori knowledge of the multinomial logit specification used to derive the design; second, an incorrect specification. The data-generating process is based on estimates from a real choice experiment in which preferences for rural landscape attributes were studied. Results indicate that D-optimal designs are promising, especially those based on Bayesian algorithms with informative priors. However, if good a priori information is lacking and there is strong uncertainty about the real data-generating process - conditions which are quite common in environmental valuation - then practitioners may be better off with conventional fractional designs from linear models. Under misspecification, a design of this type produces less biased estimates than its competitors.
    Keywords: logit experimental design; efficiency; Monte Carlo choice experiments; non-market valuation
    JEL: C13 C15 C25 C99 Q26
    Date: 2005–12–13
    URL: http://d.repec.org/n?u=RePEc:wai:econwp:05/08&r=ecm
  17. By: Yuriy Gorodnichenko (University of Michigan)
    Abstract: This paper integrates imposing a factor structure on residuals in vector autoregressions (VARs) into structural VAR analysis. Identification, estimation and testing procedures are discussed. The paper applies this approach to the well-known problem of studying the effects of monetary policy in open economy VAR models. The use of a factor structure in identifying structural shocks is shown to resolve three long-standing puzzles in the VAR literature. First, the price level does not increase in response to a monetary tightening. Second, the exchange rate appreciates on impact and then gradually depreciates. Hence, no price level or exchange rate puzzles are found. Third, monetary policy shocks are much less volatile than suggested by standard VAR identification schemes. In addition, the paper suggests that the apparent weak contemporaneous cross-variable responses and strong own responses in structural VARs can be an artifact of identifying assumptions, vanishing once a factor structure is imposed on the shocks.
    Keywords: Vector autoregressions, identification, factor structure, monetary policy
    JEL: E52 C32
    Date: 2005–12–15
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpma:0512011&r=ecm
  18. By: Arnab Bhattacharjee; Chris Jensen-Butler
    Abstract: This paper proposes a methodology for estimation of spatial weights matrices which are consistent with a given or estimated pattern of spatial autocovariance. This approach is potentially useful for applications in urban, environmental, development, growth and other areas of economics where there is uncertainty regarding the nature of spatial (or cross-sectional) interaction between regions (or economic agents). The proposed methodology is applied to housing markets in England and Wales, and several new hypotheses are advanced about the social and economic forces that determine spatial diffusion in housing demand.
    Keywords: Spatial econometrics; Spatial autocorrelation; Spatial weights matrix; Spatial error model; Housing demand
    JEL: C14 C15 C30 C31 R21 R31
    Date: 2005–12
    URL: http://d.repec.org/n?u=RePEc:san:crieff:0519&r=ecm
  19. By: Richard Blundell (Institute for Fiscal Studies and University College London); Martin Browning (Institute for Fiscal Studies and University of Copenhagen); Ian Crawford (Institute for Fiscal Studies and University of Surrey)
    Abstract: This paper uses revealed preference inequalities to provide tight nonparametric bounds on consumer responses to price changes. Price responses are allowed to vary nonparametrically across the income distribution by exploiting microdata on consumer expenditures and incomes over a finite set of discrete relative price changes. This is achieved by combining the theory of revealed preference with semiparametric estimation of consumer expansion paths (Engel curves). We label these expansion-path-based bounds 'E-bounds'. Deviations from revealed preference restrictions are measured by preference perturbations, which are shown to usefully characterise taste change.
    Keywords: Demand responses, relative prices, revealed preference, semiparametric regression, changing tastes
    JEL: D12 C14 C43
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:ifs:cemmap:12/05&r=ecm
  20. By: Simon Grant (Department of Economics, Rice University); Atsushi Kajii (Institute of Economic Research, Kyoto University)
    Abstract: We characterize the intersection of the probabilistically sophisticated and multiple prior models. We show this class is strictly larger than the subjective expected utility model and that its elements can be generated from a generalized class of ε-contaminated priors, which we dub the ε-contaminated/ε-truncated prior.
    Keywords: subjective probability, maximin expected utility, epsilon-contaminated priors.
    JEL: D81
    Date: 2005–10
    URL: http://d.repec.org/n?u=RePEc:kyo:wpaper:608&r=ecm
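For reference, the classical ε-contamination class that the paper generalizes can be written as follows. This is the textbook definition, not the paper's truncated variant:

```latex
% Classical epsilon-contaminated class of priors around a reference
% measure \pi; the paper's generalized/truncated variant refines this set.
\[
  \mathcal{C}_{\varepsilon}(\pi)
  = \bigl\{\, (1-\varepsilon)\,\pi + \varepsilon\, q \;:\; q \in \Delta(S) \,\bigr\},
  \qquad \varepsilon \in [0,1],
\]
% where \Delta(S) denotes the set of all probability measures on the
% state space S.
```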

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.