
The Trouble with Impact and other Stories

Presentation given at ReCon (Research in the 21st Century) on June 19, 2015 in Edinburgh

Martin Fenner

June 19, 2015

Transcript

  1. The Trouble with Impact and other Stories. Martin Fenner, PLOS, Technical Lead Article-Level Metrics. http://orcid.org/0000-0003-1419-2405 https://www.flickr.com/photos/bufivla/3619927485
  2. What makes a winner? Picking music matching the tastes of the audience. Competitiveness of candidate selection. Marketing (e.g. airtime) before the event. Loyalty of neighbouring countries.
  3. RESEARCH ARTICLE: The Oligopoly of Academic Publishers in the Digital Era. Vincent Larivière, Stefanie Haustein, Philippe Mongeon (Université de Montréal / Université du Québec à Montréal). PLoS ONE 10(6): e0127502 (2015). Abstract: The consolidation of the scientific publishing industry has been the topic of much debate within and outside the scientific community, especially in relation to major publishers' high profit margins. However, the share of scientific output published in the journals of these major publishers, as well as its evolution over time and across various disciplines, has not yet been analyzed. This paper provides such analysis, based on 45 million documents indexed in the Web of Science over the period 1973-2013. It shows that in both natural and medical sciences (NMS) and social sciences and humanities (SSH), Reed-Elsevier, Wiley-Blackwell, Springer, and Taylor & Francis increased their share of the published output, especially since the advent of the digital era (mid-1990s). Combined, the top five most prolific publishers account for more than 50% of all papers published in 2013. Disciplines of the social sciences have the highest level of concentration (70% of papers from the top five publishers), while the humanities have remained relatively independent (20% from top five publishers). NMS disciplines are in between, mainly because of the strength of their scientific societies, such as the ACS in chemistry or APS in physics. The paper also examines the … http://doi.org/10.1371/journal.pone.0127502
  4. What makes a winner? Picking mainstream research topics. Recognition of researcher, institution, and journal. Networking in the community, e.g. at conferences.
  5. Problems with the current state of impact assessment: many serious gaps in both theory and practical application; unintended consequences.
  6. Problems of Individual-Level Evaluative Bibliometrics. Don't reduce individual research performance to a single number. Don't use impact factors as a measure of quality for individual researchers. Don't apply (hidden) bibliometric filters for selection, e.g. a minimum impact factor for inclusion in publication lists. Don't apply arbitrary weights to co-authorship; algorithms based on author position might be problematic … http://www.slideshare.net/paulwouters1/issi2013-wg-pw
  7. Unintended consequences. Researchers and other stakeholders adjust their behavior towards proxy indicators. Both direct and indirect costs (time and money) increase. Impact assessment becomes a self-sustaining activity.
  8. Unintended consequences. Tenure and promotion decisions are increasingly made by journal editors and administrators, not peers. The research and researchers that get funded and promoted are heavily distorted by the system trying to assess their impact.
  9. Be honest and responsible. For example: don't claim a false sense of accuracy; be careful with aggregation.
  10. Be open. As a provider of metrics data for research outputs: make your data openly available, and make the code that generates the data openly available. The above doesn't preclude value-added services that are only available to customers. (A sketch of what consuming such an open metrics API looks like follows the transcript.)
  11. Set Principles. The quality of research can't be quantified. Researchers can only be evaluated by their peers. Proxy indicators are an important aid for evaluation by peers, but never a replacement. Proxy indicators should not be used as an excuse for painful funding decisions.
  12. Develop best practices that are agreed upon in the community, across stakeholders. NISO, for example, is working on best practices and standards for alternative assessment metrics.
  13. Set a speed limit: keep a healthy ratio of spending on research to spending on research assessment.
  14. Be aware of the power of proxy indicators. Use them consciously for policy decisions, e.g. a funder wanting to promote research data publication, or an institution wanting to promote Open Access.
  15. More research. For example: better understanding of data quality issues in social media metrics; appropriate indicators for social impact; research workflows instead of research outputs as indicators for impact; reproducibility as a possible metric.
  16. More education. Many researchers take a naive approach to bibliometrics that they would never take with their own research.
  17. This presentation is made available under a CC-BY 4.0 license. http://creativecommons.org/licenses/by/4.0/
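
To make slide 10 concrete: a minimal sketch of fetching article-level metrics for one DOI from an open, Lagotto-style ALM API of the kind PLOS ran at the time. The endpoint URL, query parameter, and response shape here are assumptions for illustration, not a documented contract; check the service's own documentation before relying on them.

import requests

# Hypothetical Lagotto-style endpoint; path and parameters are
# assumptions for illustration only.
API_URL = "http://alm.plos.org/api/v5/articles"

def fetch_alm(doi):
    """Fetch article-level metrics for a single DOI from an open ALM API."""
    response = requests.get(API_URL, params={"ids": doi}, timeout=10)
    response.raise_for_status()
    # Response is assumed to be JSON with per-source metrics for the article.
    return response.json()

if __name__ == "__main__":
    # Example DOI: the paper shown on slide 3.
    metrics = fetch_alm("10.1371/journal.pone.0127502")
    print(metrics)

Because both the metrics data and the software generating it (Lagotto is open source) are open, anyone can retrieve, verify, or rebuild such numbers, which is the point the slide is making.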