Altmetrics for Research Assessment

An invited talk on altmetrics for research assessment, given at the 2013 Society for Scholarly Publishing meeting.

William Gunn

May 24, 2013

Transcript

  1. Metrics for Assessing Research Impact. Society for Scholarly Publishing, June 2013. William Gunn, PhD, Head of Academic Outreach, @mrgunn, https://orcid.org/0000-0002-3555-2054
  2. Key questions • What’s the problem? • What’s the opportunity? • How do we serve researchers? • How do we get the context right?
  3. What is the problem? • Is a researcher doing impactful research? • Does an institution have an effective research program? • Is the research being effectively disseminated? • Is the work advancing the field?
  4. “Not everything that can be counted counts and not everything that counts can be counted.” – William Bruce Cameron (1967)
  5. The Opportunity • Helping researchers make better decisions • Some answers – Find out what researchers really need & try to meet that need. • Some unanswered questions – What do the different metrics tell you? – Whose needs are best served by which info? • Some unasked questions
  6. What we do know • Amgen: 47 of 53 “landmark” oncology publications could not be reproduced. • Bayer: 43 of 67 oncology & cardiovascular projects were based on contradictory results. • Dr. John Ioannidis: of 432 publications purporting sex differences in hypertension, multiple sclerosis, or lung cancer, only one data set was reproducible.
  7. “authors who make data from their articles available are cited twice as frequently as articles with no data… but otherwise equal qualifications” – Gleditsch, N.P. & Strand, H. Posting your data: will you be scooped or will you be famous? International Studies Perspectives 4, 89-97 (2003). http://dx.doi.org/10.1111/1528-3577.04105
  8. “The state of knowledge of the human race is sitting in the scientists’ computers, and is currently not shared […] We need to get it unlocked so we can tackle those huge problems.”
  9. Post-scarcity in data: “Preservation should be baked into the tools that we use… I would like for you to think of yourselves as people engaged in a task that is important to everyone and not just people in a scholarly niche.”
  10. Mendeley extracts research data… and aggregates research data in the cloud. Mendeley makes science more collaborative and transparent: Install Mendeley Desktop
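To make the "aggregates research data in the cloud" point concrete, here is a minimal sketch of looking up a paper's aggregated readership by DOI. The endpoint path, query parameters, response shape, and field names are assumptions loosely based on the public Mendeley catalog API, not a verified contract; the access token and usage are placeholders.

```python
# Sketch: look up aggregated readership for one article by DOI.
# The endpoint, parameters, and field names below are assumptions,
# not a verified description of the current Mendeley API.
import requests

API_BASE = "https://api.mendeley.com"  # assumed base URL


def reader_count_for_doi(doi, access_token):
    """Return the reader count for a DOI, or None if no catalog record is found."""
    resp = requests.get(
        f"{API_BASE}/catalog",
        params={"doi": doi, "view": "stats"},         # assumed query parameters
        headers={"Authorization": f"Bearer {access_token}"},
        timeout=10,
    )
    resp.raise_for_status()
    records = resp.json()                             # assumed: JSON list of catalog records
    if not records:
        return None
    return records[0].get("reader_count")             # assumed field name


if __name__ == "__main__":
    # Hypothetical usage; requires a valid OAuth2 access token.
    print(reader_count_for_doi("10.1111/1528-3577.04105", "YOUR_ACCESS_TOKEN"))
```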
  11. Future Directions • Let’s get better at understanding how to meet researcher needs. • Let’s get the context right for metrics. • Practical tips – Collect lots of data – Build expertise in analysis – Make that data available openly; it’s more valuable in aggregate than siloed (see the sketch below).
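As a small illustration of the "aggregate, don't silo" tip, the sketch below merges per-article metrics gathered by separate collectors into one record per DOI, the form in which the data is easiest to analyze and to share openly. The source names, counts, and DOI are purely hypothetical.

```python
# Sketch: combine metrics collected from separate sources ("silos")
# into one record per DOI. Sources, counts, and the DOI are illustrative.
import csv
import sys
from collections import defaultdict


def aggregate(metric_rows):
    """metric_rows: iterable of (doi, source, count) tuples from separate collectors."""
    combined = defaultdict(dict)
    for doi, source, count in metric_rows:
        combined[doi][source] = combined[doi].get(source, 0) + count
    return combined


if __name__ == "__main__":
    rows = [
        ("10.1234/example.doi", "citations", 12),        # hypothetical numbers
        ("10.1234/example.doi", "mendeley_readers", 34),
        ("10.1234/example.doi", "tweets", 5),
    ]
    writer = csv.writer(sys.stdout)
    writer.writerow(["doi", "source", "count"])
    for doi, sources in aggregate(rows).items():
        for source, count in sources.items():
            writer.writerow([doi, source, count])
```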