Slide 1

Slide 1 text

The Trouble with Impact and Other Stories. Martin Fenner, PLOS Technical Lead Article-Level Metrics. http://orcid.org/0000-0003-1419-2405 Image: https://www.flickr.com/photos/bufivla/3619927485

Slide 2

Slide 2 text

2 #ESC2015 https://www.flickr.com/photos/andersabrahamsson/16487813070

Slide 3

Slide 3 text

What makes a winner? 3 Picking music that matches the tastes of the audience; competitiveness of candidate selection; marketing (e.g. airtime) before the event; loyalty of neighbouring countries

Slide 4

Slide 4 text

4 RESEARCH ARTICLE The Oligopoly of Academic Publishers in the Digital Era. Vincent Larivière1,2*, Stefanie Haustein1, Philippe Mongeon1. 1 École de bibliothéconomie et des sciences de l'information, Université de Montréal, C.P. 6128, Succ. Centre-Ville, Montréal, QC H3C 3J7, Canada; 2 Observatoire des Sciences et des Technologies (OST), Centre Interuniversitaire de Recherche sur la Science et la Technologie (CIRST), Université du Québec à Montréal, CP 8888, Succ. Centre-Ville, Montréal, QC H3C 3P8, Canada. * vincent.lariviere@umontreal.ca. Abstract: The consolidation of the scientific publishing industry has been the topic of much debate within and outside the scientific community, especially in relation to major publishers' high profit margins. However, the share of scientific output published in the journals of these major publishers, as well as its evolution over time and across various disciplines, has not yet been analyzed. This paper provides such analysis, based on 45 million documents indexed in the Web of Science over the period 1973-2013. It shows that in both natural and medical sciences (NMS) and social sciences and humanities (SSH), Reed-Elsevier, Wiley-Blackwell, Springer, and Taylor & Francis increased their share of the published output, especially since the advent of the digital era (mid-1990s). Combined, the top five most prolific publishers account for more than 50% of all papers published in 2013. Disciplines of the social sciences have the highest level of concentration (70% of papers from the top five publishers), while the humanities have remained relatively independent (20% from top five publishers). NMS disciplines are in between, mainly because of the strength of their scientific societies, such as the ACS in chemistry or APS in physics. Citation: Larivière V, Haustein S, Mongeon P (2015) The Oligopoly of Academic Publishers in the Digital Era. PLoS ONE 10(6): e0127502. doi:10.1371/journal.pone.0127502 http://doi.org/10.1371/journal.pone.0127502

Slide 5

Slide 5 text

What makes a winner? 5 Picking mainstream research topics; recognition of researcher, institution and journal; networking in the community, e.g. at conferences

Slide 6

Slide 6 text

Quality of research can’t be quantified… 6

Slide 7

Slide 7 text

… so we are using proxies instead 7

Slide 8

Slide 8 text

Journal articles instead of all research outputs, or the research itself 8

Slide 9

Slide 9 text

Scientific impact, i.e. popularity among peers, instead of quality 9

Slide 10

Slide 10 text

Citations as a proxy for popularity among peers 10

Slide 11

Slide 11 text

Citations in selected journals as a proxy for citations 11

Slide 12

Slide 12 text

Social media metrics as a proxy for social impact 12

Slide 13

Slide 13 text

Mean citations per journal instead of citations for the individual article 13
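The best-known journal-level mean is the two-year Journal Impact Factor; it is not spelled out on the slide and is added here only for illustration. With C_Y(y) the citations received in year Y by items a journal published in year y, and N_y the number of citable items it published in year y:

\[ \mathrm{JIF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}} \]

Because per-article citation counts are highly skewed, this mean is dominated by a few highly cited papers and says little about any individual article in the journal.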

Slide 14

Slide 14 text

In the end we describe something very different 14 Images: https://en.wikipedia.org/wiki/File:Snoopy_wwi_ace_lb.jpg https://www.flickr.com/photos/alainpicard/4175215815

Slide 15

Slide 15 text

Problems with the current state of impact assessment 15 Many serious gaps in both theory and practical application; unintended consequences

Slide 16

Slide 16 text

Problems of Individual-Level Evaluative Bibliometrics 16 Don't reduce individual research performance to a single number. Don't use impact factors as a measure of quality for individual researchers. Don't apply (hidden) bibliometric filters for selection, e.g. a minimum IF for inclusion in publication lists. Don't apply arbitrary weights to co-authorship; algorithms based on author position might be problematic (see the sketch below) … http://www.slideshare.net/paulwouters1/issi2013-wg-pw
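To make the last point concrete, here is a minimal Python sketch of harmonic counting, one published position-based weighting scheme. It is not part of the original slide; it is shown only to illustrate how such formulas bake in the assumption that author position reflects contribution.

```python
def harmonic_credit(n_authors: int) -> list[float]:
    """Harmonic counting: author i of n receives (1/i) / sum_j(1/j) of the credit.

    One published position-based weighting scheme, used here only to show
    how strongly the resulting scores depend on the built-in assumption
    that earlier positions always mean larger contributions.
    """
    denominator = sum(1.0 / j for j in range(1, n_authors + 1))
    return [(1.0 / i) / denominator for i in range(1, n_authors + 1)]


if __name__ == "__main__":
    # For a five-author paper the first author receives ~44% of the credit
    # and the last author ~9%, regardless of who actually did what.
    print([round(w, 2) for w in harmonic_credit(5)])
```

Swapping in a different scheme (fractional counting, geometric weights, first/last-author bonuses) changes individual scores substantially, which is exactly why such weights are hard to justify in evaluation.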

Slide 17

Slide 17 text

Unintended consequences 17 Researchers and other stakeholders adjust their behavior towards proxy indicators. Both direct and indirect costs (time and money) increase. Impact assessment becomes a self-sustained activity

Slide 18

Slide 18 text

Unintended consequences 18 Tenure and promotion decisions are increasingly made by journal editors and administrators, not peers. The research and researchers that get funded and promoted are heavily distorted by the very system that tries to assess their impact

Slide 19

Slide 19 text

19 https://www.flickr.com/photos/splorp/5402551844

Slide 20

Slide 20 text

What can be done then? 20

Slide 21

Slide 21 text

Be honest and responsible 21 For example: don't convey a false sense of accuracy; be careful with aggregation

Slide 22

Slide 22 text

Be open 22 As a provider of metrics data for research outputs: make your data openly available; make the code that generates the data openly available. The above doesn't preclude value-added services available only to customers
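As one small illustration of what openly available metrics data looks like in practice (my example, not from the slide), the sketch below queries the public Crossref REST API for the citation count of the PLOS ONE article shown on slide 4; the field name follows Crossref's public works schema.

```python
# Minimal sketch: fetch an openly available citation count for one DOI
# from the public Crossref REST API. Illustration only; other open
# sources (and value-added services) will report different numbers.
import requests

DOI = "10.1371/journal.pone.0127502"  # the article shown on slide 4

response = requests.get(f"https://api.crossref.org/works/{DOI}", timeout=10)
response.raise_for_status()
message = response.json()["message"]

# "is-referenced-by-count" is the number of citations Crossref has indexed.
print(message["title"][0])
print("Crossref citation count:", message["is-referenced-by-count"])
```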

Slide 23

Slide 23 text

Set Principles 23 The quality of research can't be quantified. Researchers can only be evaluated by their peers. Proxy indicators are an important aid for evaluation by peers, but never a replacement. Proxy indicators should not be used as an excuse for painful funding decisions

Slide 24

Slide 24 text

Develop Best Practices 24 That are agreed upon in the community, across stakeholders. NISO, for example, is working on best practices and standards for alternative assessment metrics

Slide 25

Slide 25 text

Set a speed limit 25 Keep a healthy ratio of spending on research to spending on research assessment

Slide 26

Slide 26 text

Be aware of the power of proxy indicators 26 Use them consciously for policy decisions, e.g. a funder wanting to promote research data publication, or an institution wanting to promote Open Access

Slide 27

Slide 27 text

More research 27 For example: better understanding of data quality issues in social media metrics; appropriate indicators for social impact; research workflows instead of research outputs as indicators for impact; reproducibility as a possible metric

Slide 28

Slide 28 text

More education 28 Many researchers take a naive approach to bibliometrics, one they would never take with their own research

Slide 29

Slide 29 text

This presentation is made available under a CC-BY 4.0 license. http://creativecommons.org/licenses/by/4.0/