The Trouble with Impact and other Stories

Presentation given at ReCon (Research in the 21st Century) on June 19, 2015 in Edinburgh

Martin Fenner

June 19, 2015

Transcript

  1. The Trouble with Impact and
    other Stories
    Martin Fenner
    PLOS Technical Lead Article-Level Metrics
    http://orcid.org/0000-0003-1419-2405
    https://www.flickr.com/photos/bufivla/3619927485

  2. 2
    #ESC2015
    https://www.flickr.com/photos/andersabrahamsson/16487813070

  3. What makes a winner?
    3
    Picking music that matches the tastes of the audience
    Competitiveness of candidate selection
    Marketing (e.g. airtime) before the event
    Loyalty of neighbouring countries

  4. 4
    [Screenshot of the first page of a PLOS ONE research article]
    Larivière V, Haustein S, Mongeon P (2015) The Oligopoly of Academic
    Publishers in the Digital Era. PLoS ONE 10(6): e0127502
    Combined, the top five most prolific publishers account for more than
    50% of all papers published in 2013
    http://doi.org/10.1371/journal.pone.0127502

  5. What makes a winner?
    5
    Picking mainstream research topics
    Recognition of researcher, institution and
    journal
    Networking in the community, e.g. at
    conferences

  6. Quality of research can’t be
    quantified…
    6

  7. … so we are using proxies
    instead
    7

  8. Journal articles instead of all
    research outputs, or the
    research itself
    8

  9. Scientific impact, i.e.
    popularity among peers,
    instead of quality
    9

  10. Citations as a proxy for
    popularity among peers
    10

  11. Citations in selected journals
    as a proxy for citations
    11

  12. Social media metrics as a
    proxy for social impact
    12

  13. Mean citations per journal
    instead of citations for
    individual article
    13

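    The point of this slide can be made concrete with a small sketch. The
    citation counts below are invented purely for illustration, but citation
    distributions are typically this skewed, so a journal-level mean (the
    logic behind the impact factor) says little about any individual article.

    # Illustrative only: invented citation counts for ten articles in one journal.
    from statistics import mean, median

    citations = [0, 0, 1, 1, 2, 2, 3, 5, 8, 120]  # one highly cited outlier

    print("journal mean:  ", mean(citations))    # 14.2, pulled up by the outlier
    print("journal median:", median(citations))  # 2.0
    print("articles at or above the mean:",
          sum(1 for c in citations if c >= mean(citations)))  # only 1 of 10
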
  14. 14
    https://en.wikipedia.org/wiki/File:Snoopy_wwi_ace_lb.jpg
    https://www.flickr.com/photos/alainpicard/4175215815
    In the end we describe
    something very different

  15. Problems with the current
    state of impact assessment
    15
    Many serious gaps in both theory and
    practical application
    Unintended consequences

  16. Problems of Individual-Level
    Evaluative Bibliometrics
    16
    Don't reduce individual research performance to a
    single number
    Don't use impact factors as a measure of quality for
    individual researchers
    Don't apply (hidden) bibliometric filters for selection,
    e.g. minimum IF for inclusion in publication lists
    Don't apply arbitrary weights to co-authorship.
    Algorithms based on author position might be
    problematic

    http://www.slideshare.net/paulwouters1/issi2013-wg-pw

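    To illustrate why position-based weighting is problematic, the sketch
    below compares two common credit-allocation schemes, equal fractional
    counting and harmonic counting. The five-author paper is hypothetical;
    the point is only that the credit assigned to an individual depends
    heavily on which arbitrary algorithm is chosen.

    # Illustrative sketch: two ways of splitting credit for a five-author paper.
    def fractional(n_authors):
        # Equal split: every author gets 1/n of the credit.
        return [1 / n_authors] * n_authors

    def harmonic(n_authors):
        # Position-based: the i-th author gets credit proportional to 1/i.
        raw = [1 / (i + 1) for i in range(n_authors)]
        total = sum(raw)
        return [w / total for w in raw]

    for name, scheme in [("fractional", fractional), ("harmonic", harmonic)]:
        print(name, [round(w, 2) for w in scheme(5)])
    # fractional [0.2, 0.2, 0.2, 0.2, 0.2]
    # harmonic [0.44, 0.22, 0.15, 0.11, 0.09]
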
  17. Unintended consequences
    17
    Researchers and other stakeholders
    adjust their behavior towards proxy
    indicators
    Both direct and indirect costs (time and
    money) increase
    Impact assessment becomes a self-
    sustaining activity

  18. Unintended consequences
    18
    Tenure and promotion decisions are
    increasingly made by journal editors
    and administrators not peers
    The research and researchers that get
    funded and promoted are heavily
    distorted by the very system trying to
    assess their impact

  19. 19
    https://www.flickr.com/photos/splorp/5402551844

  20. What can be done then?
    20

  21. Be honest and responsible
    21
    For example:
    Don’t convey a false sense of accuracy
    Be careful with aggregation

  22. Be open
    22
    As a provider of metrics data for research
    outputs
    Make your data openly available
    Make the code used to generate the data
    openly available
    The above doesn’t preclude value-added
    services only available to customers

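    As an aside, one way metrics data is already made openly available is
    through public APIs. The minimal sketch below queries the Crossref REST
    API for the citation count of the article shown on slide 4; the endpoint
    and the "is-referenced-by-count" field reflect my understanding of that
    API and should be treated as an assumption, not part of the presentation.

    # Minimal sketch, assuming the public Crossref REST API at api.crossref.org
    # exposes a per-work citation count as "is-referenced-by-count".
    import json
    import urllib.request

    doi = "10.1371/journal.pone.0127502"  # Larivière et al. 2015 (slide 4)
    url = "https://api.crossref.org/works/" + doi

    with urllib.request.urlopen(url) as response:
        work = json.load(response)["message"]

    print(work.get("title"))
    print("citations (Crossref):", work.get("is-referenced-by-count"))
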
  23. Set Principles
    23
    The quality of research can’t be quantified
    Researchers can only be evaluated by
    their peers
    Proxy indicators are an important aid for
    evaluation by peers, but never a
    replacement
    Proxy indicators should not be used as an
    excuse for painful funding decisions

  24. Develop Best Practices
    24
    That are agreed upon in the community
    across stakeholders
    NISO, for example, is working on best practices
    and standards for alternative assessment
    metrics

  25. Set a speed limit
    25
    Keep a healthy ratio of spending for
    research and spending for research
    assessment

  26. Be aware of the power of
    proxy indicators
    26
    Use them consciously for policy
    decisions, e.g.
    A funder wanting to promote research
    data publication
    An institution wanting to promote Open
    Access

  27. More research
    27
    For example:
    Better understanding of social media
    metrics data quality issues
    Appropriate indicators for social impact
    Research workflows instead of research
    outputs as indicators for impact
    Reproducibility as a possible metric

  28. More education
    28
    Many researchers take a naive approach to
    bibliometrics, even though they would never
    approach their own research this way

  29. This presentation is made available under a
    CC-BY 4.0 license.
    http://creativecommons.org/licenses/by/4.0/
