nep-sog New Economics Papers
on Sociology of Economics
Issue of 2016‒04‒23
three papers chosen by
Jonas Holmström
Axventure AB

  1. Co-Authorship And Research Productivity In Economics: Assessing The Assortative Matching Hypothesis By Damien Besancenot; Kim Huynh; Francisco Serranito
  2. Bias against Novelty in Science: A Cautionary Tale for Users of Bibliometric Indicators By Stephan, Paula E; Veugelers, Reinhilde; Wang, Jian
  3. Perceptions and Practices of Replication by Social and Behavioral Scientists: Making Replications a Mandatory Element of Curricula Would Be Useful By Benedikt Fecher; Mathis Fräßdorf; Gert G. Wagner

  1. By: Damien Besancenot (CEPN and University Paris 13, Sorbonne Paris Cité); Kim Huynh (LEM and University Panthéon-Assas, Paris 2); Francisco Serranito (Université Orléans, CNRS, LEO, UMR 7322 and IRD, LEDa, DIAL UMR 225)
    Abstract: This paper estimates the relation between the size and quality of scientists’ co-author networks and individual characteristics (notably productivity) in the context of institutional changes in French academia in the mid-1980s. The analysis employs the Two-Stage Residual Inclusion (2SRI) framework to handle endogeneity in individual productivity relative to the quality of co-authors. Data is taken from a novel database of French academic economists. The main finding is that the size and quality of authors’ networks are positively related to their productivity; this is understood as evidence of assortative matching. Other effects on coauthor networks (life-cycles, specialities) are also identified.
    Keywords: Co-authorship, Count Data, Zero-Inflated Models, Instrumental Variables, h-index
    JEL: A14 C25 D83 I23 J24
    Date: 2016–03
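    The Two-Stage Residual Inclusion (2SRI) approach mentioned in the abstract can be illustrated with a minimal sketch: regress the endogenous regressor (productivity) on an instrument, then include the first-stage residual as an extra control in the outcome equation. This is a toy illustration with simulated data and linear stages, not the authors' actual specification (the paper uses count-data models and real instruments); all variable names are hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    z = rng.normal(size=n)                        # instrument (hypothetical)
    u = rng.normal(size=n)                        # unobserved confounder
    x = 1.0 + 0.8 * z + u + rng.normal(size=n)    # endogenous productivity
    y = 2.0 + 0.5 * x + u + rng.normal(size=n)    # outcome: co-author network quality

    # Stage 1: regress the endogenous regressor on the instrument, keep residuals
    Z = np.column_stack([np.ones(n), z])
    b1, *_ = np.linalg.lstsq(Z, x, rcond=None)
    resid = x - Z @ b1

    # Stage 2: include the first-stage residual as an additional regressor;
    # it absorbs the endogenous variation, so the coefficient on x is consistent
    X2 = np.column_stack([np.ones(n), x, resid])
    b2, *_ = np.linalg.lstsq(X2, y, rcond=None)
    print(b2[1])  # estimate of the effect of productivity on network quality
    ```

    With linear stages, 2SRI reduces to the standard control-function estimator; its appeal in the paper's setting is that, unlike 2SLS, it extends naturally to nonlinear count models.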
  2. By: Stephan, Paula E; Veugelers, Reinhilde; Wang, Jian
    Abstract: Research which explores uncharted waters has a high potential for major impact but also carries a higher uncertainty of having impact. Such explorative research is often described as taking a novel approach. This study examines the complex relationship between pursuing a novel approach and impact. Viewing scientific research as a combinatorial process, we measure novelty in science by examining whether a published paper makes first-time-ever combinations of referenced journals, taking into account the difficulty of making such combinations. We apply this newly developed measure of novelty to all Web of Science research articles published in 2001 across all scientific disciplines. We find that highly novel papers, defined to be those that make more (distant) new combinations, deliver high gains to science: they are more likely to be a top 1% highly cited paper in the long run, to inspire follow-on highly cited research, and to be cited in a broader set of disciplines. At the same time, novel research is also more risky, reflected by a higher variance in its citation performance. In addition, we find that novel research is significantly more highly cited in "foreign" fields but not in its "home" field. We also find strong evidence of delayed recognition of novel papers and that novel papers are less likely to be top cited when using a short time window. Finally, novel papers typically are published in journals with a lower than expected Impact Factor. These findings suggest that science policy, in particular funding decisions which rely on traditional bibliometric indicators based on short-term direct citation counts and Journal Impact Factors, may be biased against "high risk/high gain" novel research. The findings also caution against a mono-disciplinary approach in peer review to assess the true value of novel research.
    Keywords: novelty; science; impact; evaluation; bibliometrics
    JEL: I23
    Date: 2016–04
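    The core of the novelty measure described in the abstract — flagging first-time-ever combinations of referenced journals — can be sketched as follows. This is a simplified illustration with a made-up three-paper corpus; the paper's full measure additionally weights each new pair by how distant the two journals are in the prior co-citation network, which is omitted here.

    ```python
    from itertools import combinations

    # Hypothetical corpus: (paper id, year, set of journals cited by the paper)
    corpus = [
        ("p1", 1999, {"J.Econ", "Econometrica"}),
        ("p2", 2000, {"J.Econ", "Nature"}),
        ("p3", 2001, {"Econometrica", "Nature", "J.Econ"}),
    ]

    seen = set()    # journal pairs already co-referenced by an earlier paper
    novelty = {}
    for pid, year, journals in sorted(corpus, key=lambda t: t[1]):
        pairs = {frozenset(p) for p in combinations(sorted(journals), 2)}
        new_pairs = pairs - seen           # first-ever combinations made by this paper
        novelty[pid] = len(new_pairs)      # simple (unweighted) novelty score
        seen |= pairs

    print(novelty)
    ```

    Here p3 cites three journals and so makes three pairs, but only the Econometrica–Nature pair is new, giving it a novelty score of 1 under this unweighted variant.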
  3. By: Benedikt Fecher; Mathis Fräßdorf; Gert G. Wagner
    Abstract: We live in a time of increasing publication rates and specialization of scientific disciplines. More and more, the research community is facing the challenge of assuring the quality of research and maintaining trust in the scientific enterprise. Replication studies are necessary to detect erroneous research. Thus, the replicability of research is considered a hallmark of good scientific practice, and it has lately become a key concern for research communities and science policy makers alike. In this case study we analyze perceptions and practices regarding replication studies in the social and behavioral sciences. Our analyses are based on a survey of almost 300 researchers who use data from the German Socio-Economic Panel Study (SOEP), a multidisciplinary longitudinal multicohort study. We find that more than two thirds of respondents disagree with the statement that replications are not worthwhile because major mistakes will be found at some point anyway. Nevertheless, most respondents are not willing to spend their time conducting replication studies. This situation can be characterized as a “tragedy of the commons”: everybody knows that replications are useful, but almost everybody counts on others to conduct them. Our most important finding concerning practical consequences is that among the few replications that are reported, a large majority is conducted in the context of teaching. In our view, this is a promising detail: in order to foster replicability, one avenue may be to make replication studies a mandatory part of curricula as well as of doctoral theses. Furthermore, we argue that replication studies need to be made more attractive for researchers. For example, successful replications could be listed in the publication lists of the replicated authors. Conversely, data sharing needs to receive more recognition, for example by considering data production and subsequent data sharing as scientific output.
    Keywords: Replicability, good science, data sharing, research policy
    JEL: A13 A20 C81 D02 D23
    Date: 2016

This nep-sog issue is ©2016 by Jonas Holmström. It is provided as is without any express or implied warranty. It may be freely redistributed in whole or in part for any purpose. If distributed in part, please include this notice.
General information on the NEP project can be found at <>. For comments, please write to the director of NEP, Marco Novarese, at <>. Put “NEP” in the subject line; otherwise your mail may be rejected.
NEP’s infrastructure is sponsored by the School of Economics and Finance of Massey University in New Zealand.