
Altmetrics for Research Assessment

An invited talk at the 2013 Society for Scholarly Publishing meeting on altmetrics for the assessment of research.

William Gunn

May 24, 2013

Transcript

  1. Metrics for Assessing
    Research Impact
    Society for Scholarly
    Publishing
    June 2013
    William Gunn, PhD
    Head of Academic Outreach
    @mrgunn
    https://orcid.org/0000-0002-3555-2054

  2. Based in London, Mendeley is made up of
    researchers, graduates, and software
    developers from...

  3. Key questions
    • What’s the problem?
    • What’s the opportunity?
    • How do we serve researchers?
    • How do we get the context
    right?

  4. What is the problem?
    • Is a researcher doing impactful research?
    • Does an institution have an effective
    research program?
    • Is the research being effectively
    disseminated?
    • Is the work advancing the field?

  5. How do you measure advancement?
    What are the units?

  6. The HIPPO
    https://secure.flickr.com/photos/doug88888/5937412419/

  7. People like numbers
    https://secure.flickr.com/photos/tifotter/2312080178/

  8. “Not everything that can be counted
    counts and not everything that
    counts can be counted.” – William
    Bruce Cameron (1967)

  9. Let research be research
    https://secure.flickr.com/photos/bert_m_b/2418461161

  10. http://xkcd.com/54/

  11. The Opportunity
    • Helping researchers make better decisions
    • Some answers
    – Find out what researchers really need & try to
    meet that need.
    • Some unanswered questions
    – What do the different metrics tell you?
    – Whose needs are best served by which info?
    • Some unasked questions

  12. What we do know
    • Amgen: 47 of 53 “landmark” oncology
    publications could not be reproduced.
    • Bayer: 43 of 67 oncology & cardiovascular
    projects were based on contradictory results.
    • Dr. John Ioannidis: Of 432 publications
    purporting sex differences in hypertension,
    multiple sclerosis, or lung cancer, only one data
    set was reproducible.

  13. “authors who make
    data from their articles
    available are cited twice as
    frequently as articles with no
    data… but otherwise equal
    qualifications”
    Gleditsch, N.P. & Strand, H. Posting your data: will you be
    scooped or will you be famous? International Studies
    Perspectives 4, 89-97 (2003).
    http://dx.doi.org/10.1111/1528-3577.04105

  14. “The state of knowledge
    of the human race is
    sitting in the scientists’
    computers, and is
    currently not shared […]
    We need to get it
    unlocked so we can
    tackle those huge
    problems.”

  15. Post-scarcity in data
    “Preservation should
    be baked into the
    tools that we use… I
    would like for you to
    think of yourselves as
    people engaged in a
    task that is important
    to everyone and not
    just people in a
    scholarly niche.”

  16. Mendeley extracts
    research data…
    …and aggregates research
    data in the cloud.
    Mendeley makes science more collaborative and
    transparent:
    Install Mendeley Desktop

  17. http://www.nickjr.com/shows/dora/index.jhtml

  18. Future Directions
    • Let’s get better at understanding how to
    meet researcher needs.
    • Let’s get the context right for metrics.
    • Practical tips
    – Collect lots of data
    – Build expertise in analysis
    – Make that data openly available – it’s more
    valuable in aggregate than in silos.
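The last tip — that shared data is more valuable in aggregate than in silos — can be illustrated with a toy merge of per-DOI metrics from several sources. Everything here is a hypothetical sketch: the source names, DOIs, and metric fields (citations, readers, tweets) are made-up assumptions, not a real schema or API.

```python
from collections import defaultdict

# Hypothetical per-source metric counts keyed by DOI; the DOIs and
# metric names below are illustrative placeholders only.
sources = {
    "citations": {"10.1/aaa": 12, "10.1/bbb": 3},
    "readers":   {"10.1/aaa": 240, "10.1/ccc": 31},
    "tweets":    {"10.1/bbb": 8},
}

def aggregate(sources):
    """Merge per-source counts into one combined record per DOI."""
    merged = defaultdict(dict)
    for metric, counts in sources.items():
        for doi, n in counts.items():
            merged[doi][metric] = n
    return dict(merged)

# Each DOI now carries all the metrics any source reported for it,
# which is what makes the aggregated, openly shared view useful.
print(aggregate(sources))
```

No single source above can answer "how is 10.1/aaa doing?" on its own; the aggregate view can.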

  19. www.mendeley.com
