Altmetrics for Research Assessment

An invited talk at the 2013 Society for Scholarly Publishing meeting on the use of altmetrics for research assessment.

William Gunn

May 24, 2013
Transcript

  1. Metrics for Assessing Research Impact. Society for Scholarly Publishing, June 2013. William Gunn, PhD, Head of Academic Outreach, @mrgunn, https://orcid.org/0000-0002-3555-2054
  2. Based in London, Mendeley is made up of researchers, graduates, and software developers from...
  3. Key questions • What’s the problem? • What’s the opportunity? • How do we serve researchers? • How do we get the context right?
  4. What is the problem? • Is a researcher doing impactful research? • Does an institution have an effective research program? • Is the research being effectively disseminated? • Is the work advancing the field?
  5. How do you measure advancement? What are the units?

  6. The HIPPO https://secure.flickr.com/photos/doug88888/5937412419/

  7. People like numbers https://secure.flickr.com/photos/tifotter/2312080178/

  8. “Not everything that can be counted counts, and not everything that counts can be counted.” – William Bruce Cameron (1967)
  9. Let research be research https://secure.flickr.com/photos/bert_m_b/2418461161

  10. http://xkcd.com/54/

  11. The Opportunity • Helping researchers make better decisions • Some answers – Find out what researchers really need & try to meet that need. • Some unanswered questions – What do the different metrics tell you? – Whose needs are best served by which info? • Some unasked questions
  12. What we do know • Amgen: 47 of 53 “landmark” oncology publications could not be reproduced. • Bayer: 43 of 67 oncology & cardiovascular projects were based on contradictory results. • Dr. John Ioannidis: Of 432 publications purporting sex differences in hypertension, multiple sclerosis, or lung cancer, only one data set was reproducible.
  13. “Articles that make their data available are cited twice as frequently as articles with no data… but otherwise equal qualifications.” Gleditsch, N.P. & Strand, H. Posting your data: will you be scooped or will you be famous? International Studies Perspectives 4, 89–97 (2003). http://dx.doi.org/10.1111/1528-3577.04105
  14. “The state of knowledge of the human race is sitting in the scientists’ computers, and is currently not shared […] We need to get it unlocked so we can tackle those huge problems.”
  15. Post-scarcity in data • “Preservation should be baked into the tools that we use… I would like for you to think of yourselves as people engaged in a task that is important to everyone and not just people in a scholarly niche.”
  16. Mendeley extracts research data… and aggregates it in the cloud. Mendeley makes science more collaborative and transparent: Install Mendeley Desktop
  17. http://www.nickjr.com/shows/dora/index.jhtml

  18. Future Directions • Let’s get better at understanding how to meet researcher needs. • Let’s get the context right for metrics. • Practical tips – Collect lots of data – Build expertise in analysis – Make that data available openly – it’s more valuable in aggregate than in silos.
  19. www.mendeley.com