nep-ecm New Economics Papers
on Econometrics
Issue of 2005‒06‒05
thirteen papers chosen by
Sune Karlsson
Orebro University

  1. A note on the asymptotic efficiency of the restricted estimation By José A. Hernández
  2. The endogeneity bias in the relation between cost-of-debt capital and corporate disclosure policy By Nikolaev, Valeri; Lent, Laurence van
  3. Measuring conditional segregation: methods and empirical examples By Åslund, Olof; Nordström Skans, Oskar
  4. On the Bimodality of the Exact Distribution of the TSLS Estimator By Giovanni Forchini
  5. Assessing the Magnitude of the Concentration Parameter in a Simultaneous Equations Model By D. S. Poskitt; C. L. Skeels
  6. Forecasting Accuracy and Estimation Uncertainty Using VAR Models with Short- and Long-Term Economic Restrictions: A Monte-Carlo Study By Osmani Teixeira de Carvalho Guillén; João Victor Issler; George Athanasopoulos
  7. Ultra High Frequency Volatility Estimation with Dependent Microstructure Noise By Yacine Ait-Sahalia; Per A. Mykland; Lan Zhang
  8. The Empirical Trap of Sign Reversals with Equality Restrictions By Stephen E. Haynes
  9. UNBALANCED COINTEGRATION By Javier Hualde
  10. ON THE ASYMPTOTIC POWER PROPERTIES OF SPECIFICATION TESTS FOR DYNAMIC PARAMETRIC REGRESSIONS By J. Carlos Escanciano
  11. Bayesian Methods for Improving Credit Scoring Models By Posch Peter N.; Loeffler Gunter; Schoene Christiane
  12. Control of Generalized Error Rates in Multiple Testing By Joseph P. Romano; Michael Wolf
  13. Improving Willingness to Pay Estimates for Quality Improvements through Joint Estimation with Quality Perceptions By John C. Whitehead

  1. By: José A. Hernández (University of Las Palmas de Gran Canaria; Facultad de CC. EE y EE. Despacho D312; Campus de Tafira; C/ Saulo Torón 4; 35017 Las Palmas de G.C., Spain. Tfno (0034) 928458206; Fax (0034) 928458183)
    Abstract: This paper provides a unified framework for the analysis of stochastic and deterministic constrained estimation. In a general framework it is shown that stochastic-restrictions estimates can be asymptotically more efficient than estimates ignoring prior information, and can attain the efficiency of the restricted estimate if prior information grows faster than sample information in the asymptotics. As an example of the applicability of this result, a stochastically restricted maximum likelihood criterion is provided.
    Keywords: Prior information, stochastic restrictions, efficiency, maximum likelihood.
    Date: 2005–01
    URL: http://d.repec.org/n?u=RePEc:can:series:2005-01&r=ecm
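The asymptotic claim above has a familiar finite-sample counterpart in Theil-Goldberger mixed estimation, where stochastic prior information r = Rb + v is stacked with the sample. A minimal Python sketch, with all parameter values invented for illustration (this is a generic mixed-estimation example, not the paper's own setup):

```python
import numpy as np

rng = np.random.default_rng(5)
n, reps = 50, 2000
beta = np.array([1.0, 2.0])   # true parameters (invented for illustration)
R = np.eye(2)
r_prec = 25.0                 # precision of the prior noise v: Var(v) = I / r_prec

err_ols, err_mix = [], []
for _ in range(reps):
    X = rng.normal(size=(n, 2))
    y = X @ beta + rng.normal(size=n)
    r = R @ beta + rng.normal(size=2) / np.sqrt(r_prec)  # stochastic restriction r = R b + v
    b_ols = np.linalg.solve(X.T @ X, X.T @ y)
    # Theil-Goldberger mixed estimator: GLS on the stacked sample and prior
    b_mix = np.linalg.solve(X.T @ X + r_prec * R.T @ R, X.T @ y + r_prec * R.T @ r)
    err_ols.append(np.sum((b_ols - beta) ** 2))
    err_mix.append(np.sum((b_mix - beta) ** 2))
print(np.mean(err_ols), np.mean(err_mix))  # prior information lowers mean squared error
```

The sharper the prior (larger r_prec relative to the sample information X'X), the closer the mixed estimator gets to the deterministically restricted one, which is the limiting case the abstract describes.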
  2. By: Nikolaev, Valeri; Lent, Laurence van (Tilburg University, Center for Economic Research)
    Abstract: The purpose of this paper is twofold. First, we provide a discussion of the problems associated with endogeneity in empirical accounting research. We emphasize problems arising when endogeneity is caused by (1) unobservable firm-specific factors and (2) omitted variables, and discuss the merits and drawbacks of using panel data techniques to address these causes. Second, we investigate the magnitude of endogeneity bias in Ordinary Least Squares regressions of cost-of-debt capital on firm disclosure policy. We document how including a set of variables that theory suggests are related to both cost-of-debt capital and disclosure, and using fixed-effects estimation in a panel dataset, reduces the endogeneity bias and produces consistent results. This analysis reveals that the effect of disclosure policy on cost-of-debt capital is 200% higher than what is found in Ordinary Least Squares estimation. Finally, we provide direct evidence that disclosure is affected by unobservable firm-specific factors that are also correlated with cost of capital.
    JEL: M41 G3 C23
    Date: 2005
    URL: http://d.repec.org/n?u=RePEc:dgr:kubcen:200567&r=ecm
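The mechanism the abstract describes, an unobservable firm-specific factor biasing OLS while fixed effects absorb it, can be sketched with a small simulation (variable names and magnitudes are invented, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_years = 500, 8
beta = 1.0  # true effect of the regressor (invented magnitude)

alpha = rng.normal(size=(n_firms, 1))            # unobservable firm-specific factor
x = alpha + rng.normal(size=(n_firms, n_years))  # regressor correlated with it
y = beta * x + alpha + rng.normal(size=(n_firms, n_years))

# pooled OLS ignores the firm effect and is biased upward here
b_ols = (x * y).sum() / (x * x).sum()

# within (fixed-effects) estimator: demean every variable by firm
xd = x - x.mean(axis=1, keepdims=True)
yd = y - y.mean(axis=1, keepdims=True)
b_fe = (xd * yd).sum() / (xd * xd).sum()
print(b_ols, b_fe)  # pooled OLS well above 1, fixed effects close to 1
```

Demeaning by firm removes alpha exactly, which is why the within estimator is consistent while pooled OLS is not; this is the panel-data remedy the abstract weighs against its drawbacks.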
  3. By: Åslund, Olof (IFAU - Institute for Labour Market Policy Evaluation); Nordström Skans, Oskar (IFAU - Institute for Labour Market Policy Evaluation)
    Abstract: In empirical studies of segregation it is often desirable to quantify segregation that cannot be explained by underlying characteristics. To this end, we propose a fully non-parametric method for accounting for covariates in any measure of segregation. The basic idea is that given a set of discrete characteristics, there is a certain probability that a person belongs to a particular group, which can be used to compute an expected level of segregation. We also demonstrate that a modified index of exposure has both favorable analytical features and interpretational advantages in such settings. The methods are illustrated by an application to ethnic workplace segregation in Sweden. We also show how one can use a measure of exposure to study the earnings consequences of segregation stemming from different sources.
    Keywords: Exposure; covariates; ethnic workplace segregation
    JEL: C15 J15 J42
    Date: 2005–05–19
    URL: http://d.repec.org/n?u=RePEc:hhs:ifauwp:2005_012&r=ecm
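The expected-segregation benchmark can be sketched by reshuffling group membership within covariate cells and recomputing the index. A toy illustration using the familiar dissimilarity index (the data, cell structure, and parameters are invented; the paper's own index of exposure differs):

```python
import numpy as np

rng = np.random.default_rng(1)

# toy data (invented): workplaces, one discrete covariate, minority indicator
n = 2000
workplace = rng.integers(0, 100, size=n)
cell = rng.integers(0, 4, size=n)            # e.g. education level
p_cell = np.array([0.1, 0.2, 0.3, 0.4])      # minority share differs by cell
minority = rng.random(n) < p_cell[cell]

def dissimilarity(workplace, minority):
    # standard index of dissimilarity across workplaces
    M, N = minority.sum(), (~minority).sum()
    d = 0.0
    for w in np.unique(workplace):
        in_w = workplace == w
        d += abs(minority[in_w].sum() / M - (~minority)[in_w].sum() / N)
    return 0.5 * d

observed = dissimilarity(workplace, minority)

# expected segregation given covariates: reshuffle minority status within cells
sims = []
for _ in range(20):
    m = minority.copy()
    for c in np.unique(cell):
        idx = np.flatnonzero(cell == c)
        m[idx] = m[rng.permutation(idx)]
    sims.append(dissimilarity(workplace, m))
expected = float(np.mean(sims))
print(observed, expected)  # allocation is random here, so observed ~ expected
```

Unexplained segregation would show up as the observed index exceeding the conditional expectation; in this simulated data allocation is random given the covariate, so the two roughly coincide.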
  4. By: Giovanni Forchini
    Abstract: Nelson and Startz (Econometrica, 58, 1990), Maddala and Jeong (Econometrica, 60, 1992) and Woglom (Econometrica, 69, 2001) have shown that the density of the two-stage least squares estimator may be bimodal in a just identified structural equation. This paper further investigates the conditions under which bimodality may arise in an over-identified model.
    Keywords: Bimodality, Identification, Structural equation, Two Stage Least Squares.
    JEL: C30
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2005-14&r=ecm
  5. By: D. S. Poskitt; C. L. Skeels
    Abstract: Poskitt and Skeels (2003) provide a new approximation to the sampling distribution of the IV estimator in a simultaneous equations model. This approximation is appropriate when the concentration parameter associated with the reduced-form model is small. A basic purpose of this paper is therefore to provide the practitioner with a method of ascertaining when the concentration parameter is small, and hence when use of the Poskitt and Skeels (2003) approximation is appropriate. Existing procedures tend to focus on the notion of correlation and hypothesis testing. Approaching the problem from a different perspective leads us to advocate a different statistic for use in this problem. We provide exact and approximate distribution theory for the proposed statistic and show that it satisfies various optimality criteria not satisfied by some of its competitors. Rather than adopting a testing approach we suggest the use of p-values as a calibration device.
    Keywords: Concentration parameter, simultaneous equations model, alienation coefficient, Wilks-lambda distribution, admissible invariant test.
    JEL: C12 C39 C52
    Date: 2004–12
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2004-29&r=ecm
  6. By: Osmani Teixeira de Carvalho Guillén; João Victor Issler; George Athanasopoulos
    Abstract: Using vector autoregressive (VAR) models and Monte-Carlo simulation methods we investigate the potential gains for forecasting accuracy and estimation uncertainty of two commonly used restrictions arising from economic relationships. The first reduces parameter space by imposing long-term restrictions on the behavior of economic variables as discussed by the literature on cointegration, and the second reduces parameter space by imposing short-term restrictions as discussed by the literature on serial-correlation common features (SCCF). Our simulations cover three important issues in model building, estimation, and forecasting. First, we examine the performance of standard and modified information criteria in choosing lag length for cointegrated VARs with SCCF restrictions. Second, we provide a comparison of forecasting accuracy of fitted VARs when only cointegration restrictions are imposed and when cointegration and SCCF restrictions are jointly imposed. Third, we propose a new estimation algorithm where short- and long-term restrictions interact to estimate the cointegrating and the cofeature spaces respectively. We have three basic results. First, ignoring SCCF restrictions has a high cost in terms of model selection, because standard information criteria too frequently choose inconsistent models with too small a lag length. Criteria selecting lag and rank simultaneously have a superior performance in this case. Second, this translates into a superior forecasting performance of the restricted VECM over the VECM, with important improvements in forecasting accuracy reaching more than 100% in extreme cases. Third, the new algorithm proposed here fares very well in terms of parameter estimation, even when we consider the estimation of long-term parameters, opening up the discussion of joint estimation of short- and long-term parameters in VAR models.
    Keywords: Reduced rank models, model selection criteria, forecasting accuracy.
    JEL: C32 C53
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:msh:ebswps:2005-15&r=ecm
  7. By: Yacine Ait-Sahalia; Per A. Mykland; Lan Zhang
    Abstract: We analyze the impact of time series dependence in market microstructure noise on the properties of estimators of the integrated volatility of an asset price based on data sampled at frequencies high enough for that noise to be a dominant consideration. We show that combining two time scales for that purpose will work even when the noise exhibits time series dependence, analyze in that context a refinement of this approach based on multiple time scales, and compare empirically our different estimators to the standard realized volatility.
    JEL: G12 C22
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:nbr:nberwo:11380&r=ecm
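The two-scale construction the paper builds on can be sketched in the simpler i.i.d.-noise case; the paper's contribution, handling time-series dependence in the noise, is omitted here, and all magnitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# efficient log-price with constant volatility, observed with i.i.d. noise
n = 23400                        # one tick per second over a 6.5-hour day
sigma2 = (0.3 ** 2) / 252        # daily integrated variance (invented)
x = np.cumsum(rng.normal(0, np.sqrt(sigma2 / n), n))
y = x + rng.normal(0, 5e-4, n)   # observed price = efficient price + noise

def realized_var(p, k=1):
    # realized variance on grids subsampled every k ticks, averaged over offsets
    return np.mean([np.sum(np.diff(p[j::k]) ** 2) for j in range(k)])

rv_all = realized_var(y)              # all ticks: dominated by noise
K = 300
rv_slow = realized_var(y, K)          # sparse time scale
nbar = (n - K + 1) / K                # average number of sparse increments
tsrv = rv_slow - (nbar / n) * rv_all  # two-scale bias correction
print(rv_all, tsrv, sigma2)
```

The dense-scale realized variance estimates (twice) the noise variance rather than the volatility, which is exactly what the sparse-scale estimator uses to debias itself.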
  8. By: Stephen E. Haynes (Department of Economics, University of Oregon)
    Abstract: This note explores the insidious empirical trap posed by two common equality restrictions in regression analysis. The trap is that restricted coefficients can lie outside the interval of unrestricted coefficients and even reverse sign when negatively correlated regressors are added to one another or when positively correlated regressors are subtracted from one another.
    Keywords: Equality restrictions, Sign reversals, Invalid restrictions
    JEL: C12 C52
    Date: 2005–01–15
    URL: http://d.repec.org/n?u=RePEc:ore:uoecwp:2005-8&r=ecm
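The trap is easy to reproduce: with unequal variances and strong negative correlation, the coefficient on the summed regressor can be negative even though both unrestricted coefficients are positive. A sketch with invented numbers:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
b1, b2 = 1.0, 0.1   # both true coefficients positive (invented values)

x1 = rng.normal(0, 1, n)
# Var(x2) = 4, Corr(x1, x2) = -0.9: strongly negatively correlated regressors
x2 = -1.8 * x1 + rng.normal(0, np.sqrt(4 - 1.8 ** 2), n)
y = b1 * x1 + b2 * x2 + rng.normal(0, 1, n)

# unrestricted regression recovers both positive coefficients
bu = np.linalg.lstsq(np.column_stack([x1, x2]), y, rcond=None)[0]

# equality restriction b1 = b2: regress y on the sum x1 + x2
s = x1 + x2
br = (s @ y) / (s @ s)
print(bu, br)  # br is negative: outside [b2, b1] and sign-reversed
```

In population the restricted slope is Cov(x1+x2, y)/Var(x1+x2) = (-0.8·b1 + 2.2·b2)/1.4 ≈ -0.41 here, so the sign reversal is not a sampling fluke but a property of the invalid restriction.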
  9. By: Javier Hualde (School of Economics and Business Administration, University of Navarra)
    Abstract: Recently, increasing interest in the issue of fractional cointegration has emerged from theoretical and empirical viewpoints. Here, as opposed to the traditional prescription of unit root observables with weakly dependent cointegrating errors, the orders of integration of these series are allowed to take real values, but, as in the traditional framework, equality of the orders of at least two observable series is necessary for cointegration. This assumption, in view of the real-valued nature of these orders, could pose some difficulties, and in the present paper we explore some ideas related to this issue in a simple bivariate framework. First, in a situation of "near-cointegration", where the only difference with respect to the "usual" fractional cointegration is that the orders of the two observable series differ in an asymptotically negligible way, we analyse properties of standard estimates of the cointegrating parameter. Second, we discuss the estimation of the cointegrating parameter in a situation where the orders of integration of the two observables are truly different, but their corresponding balanced versions (with the same order of integration) are cointegrated in the usual sense. A Monte Carlo study of finite-sample performance and simulated series is included.
    JEL: C22
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:una:unccee:wp0605&r=ecm
  10. By: J. Carlos Escanciano (School of Economics and Business Administration, University of Navarra)
    Abstract: Economic theories in dynamic contexts usually impose certain restrictions on the conditional mean of the underlying economic variables. Omnibus specification tests are the primary tools to test such restrictions when there is no information on the possible alternative. In this paper we study in detail the power properties of a large class of omnibus specification tests for parametric conditional means under time series processes. We show that all omnibus specification tests have a preference for a finite-dimensional space of alternatives (usually unknown to the practitioner) and we characterize such space for Cramér-von Mises (CvM) tests. This fact motivates the use of optimal tests against such preferred spaces instead of the omnibus tests. We propose new asymptotically optimal directional and smooth tests that are designed for cases in which a finite-dimensional space of alternatives is in mind. The proposed optimal procedures are asymptotically distribution-free and are valid under weak assumptions on the underlying data generating process. In particular, they are valid under possibly time-varying higher conditional moments of unknown form, e.g., conditional heteroskedasticity. A Monte Carlo experiment shows that the asymptotic results provide good approximations in small samples. Finally, an application of our theory to test the martingale difference hypothesis of some exchange rates provides new information on the rejection of omnibus tests and illustrates the relevance of our results for practitioners.
    JEL: C12 C14 C52
    Date: 2005–05
    URL: http://d.repec.org/n?u=RePEc:una:unccee:wp0705&r=ecm
  11. By: Posch Peter N. (University of Ulm); Loeffler Gunter (University of Ulm); Schoene Christiane (University of Ulm)
    Abstract: We propose a Bayesian methodology that enables banks to improve their credit scoring models by imposing prior information. As prior information, we use coefficients from credit scoring models estimated on other data sets. Through simulations, we explore the default prediction power of three Bayesian estimators in three different scenarios and find that they perform better than standard maximum likelihood estimates. We recommend that banks consider Bayesian estimation for internal and regulatory default prediction models.
    Keywords: Credit Scoring, Bayesian Inference, Bankruptcy Prediction
    JEL: C11 G21 G33
    Date: 2005–05–31
    URL: http://d.repec.org/n?u=RePEc:wpa:wuwpfi:0505024&r=ecm
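The idea of centring a prior on coefficients estimated from other data can be sketched as MAP logistic regression. This is a generic illustration, not the authors' exact estimators; the coefficients, sample sizes, and prior precision are all invented, and everything is simulated:

```python
import numpy as np

rng = np.random.default_rng(4)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

b_true = np.array([-2.0, 1.0, -0.5])   # invented "default" model coefficients

def simulate(n):
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    y = (rng.random(n) < sigmoid(X @ b_true)).astype(float)
    return X, y

def fit_map(X, y, prior_mean, prior_prec, iters=500, lr=1.0):
    # MAP logistic regression by gradient ascent; prior_prec = 0 gives plain ML
    b = prior_mean.copy()
    for _ in range(iters):
        grad = X.T @ (y - sigmoid(X @ b)) - prior_prec * (b - prior_mean)
        b = b + lr * grad / len(y)
    return b

# coefficients estimated on another, larger data set serve as the prior mean
X_big, y_big = simulate(20_000)
prior_mean = fit_map(X_big, y_big, np.zeros(3), 0.0)

# small in-house sample: compare plain ML with the Bayesian (MAP) estimate
X_small, y_small = simulate(100)
b_ml = fit_map(X_small, y_small, np.zeros(3), 0.0)
b_map = fit_map(X_small, y_small, prior_mean, 50.0)  # shrink towards the prior
err_ml = np.linalg.norm(b_ml - b_true)
err_map = np.linalg.norm(b_map - b_true)
print(err_ml, err_map)
```

With only 100 observations the ML estimate is noisy, while shrinking towards externally estimated coefficients stabilises it, the effect the abstract's simulations document.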
  12. By: Joseph P. Romano; Michael Wolf
    Abstract: Consider the problem of testing s hypotheses simultaneously. The usual approach to dealing with the multiplicity problem is to restrict attention to procedures that control the probability of even one false rejection, the familiar familywise error rate (FWER). In many applications, particularly if s is large, one might be willing to tolerate more than one false rejection if the number of such cases is controlled, thereby increasing the ability of the procedure to reject false null hypotheses. One possibility is to replace control of the FWER by control of the probability of k or more false rejections, which is called the k-FWER. We derive both single-step and stepdown procedures that control the k-FWER in finite samples or asymptotically, depending on the situation. Lehmann and Romano (2005a) derive some exact methods for this purpose, which apply whenever p-values are available for individual tests; no assumptions are made on the joint dependence of the p-values. In contrast, we construct methods that implicitly take into account the dependence structure of the individual test statistics in order to further increase the ability to detect false null hypotheses. We also consider the false discovery proportion (FDP), defined as the number of false rejections divided by the total number of rejections (and defined to be 0 if there are no rejections). The false discovery rate proposed by Benjamini and Hochberg (1995) controls E(FDP).
    Keywords: Bootstrap, False Discovery Proportion, False Discovery Rate, Generalized Familywise Error Rates, Multiple Testing, Stepdown Procedure.
    JEL: E43
    URL: http://d.repec.org/n?u=RePEc:zur:iewwpx:245&r=ecm
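The p-value-based k-FWER method of Lehmann and Romano (2005a) mentioned in the abstract can be written in a few lines as a generalized Holm stepdown; the paper's own bootstrap procedures, which exploit the dependence structure, are more involved. The p-values below are invented:

```python
import numpy as np

def k_fwer_stepdown(pvals, k=2, alpha=0.05):
    """Lehmann-Romano stepdown from individual p-values: controls the
    probability of k or more false rejections under arbitrary dependence."""
    s = len(pvals)
    # critical values: k*alpha/s for the first k steps, then k*alpha/(s+k-i)
    crit = np.array([k * alpha / s if i <= k else k * alpha / (s + k - i)
                     for i in range(1, s + 1)])
    reject = np.zeros(s, dtype=bool)
    for step, idx in enumerate(np.argsort(pvals)):
        if pvals[idx] <= crit[step]:
            reject[idx] = True
        else:
            break  # stepdown stops at the first non-rejection
    return reject

# toy example (invented p-values): 10 hypotheses, 3 with strong evidence
p = np.array([0.0001, 0.0002, 0.0005, 0.2, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9])
rej = k_fwer_stepdown(p, k=2, alpha=0.05)
print(rej)
```

Setting k = 1 recovers the ordinary Holm procedure; larger k relaxes the critical values and so rejects more hypotheses, the trade-off the abstract describes.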
  13. By: John C. Whitehead (Department of Economics, Appalachian State University)
    Abstract: Willingness to pay for quality change may depend on heterogeneous perceived quality levels. In these instances, contingent valuation studies should include measures of quality perceptions as covariates in the willingness to pay model in order to avoid omitted variable bias. Variation in quality perceptions across respondents leads to a potential endogeneity of quality perceptions. We address the potential for endogeneity bias using an instrumental variables approach in which a measure of quality perceptions is included as a determinant of willingness to pay and is simultaneously determined by various exogenous factors. The willingness to pay model is estimated jointly with quality perceptions allowing for correlation of the error terms. Using data on willingness to pay for water quality improvements in the Neuse River in North Carolina we reject exogeneity of perceived quality. Correcting for endogeneity improves the measurement of willingness to pay by differentiating willingness to pay among respondents with heterogeneous quality perceptions.
    Keywords: Willingness to Pay, Quality, Perceptions, Endogeneity
    JEL: Q51 Q53
    Date: 2003–12
    URL: http://d.repec.org/n?u=RePEc:apl:wpaper:05-09&r=ecm

This nep-ecm issue is ©2005 by Sune Karlsson. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at http://nep.repec.org. For comments please write to the director of NEP, Marco Novarese at <director@nep.repec.org>. Put “NEP” in the subject, otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.