• Don't use impact factors as a measure of quality for individual researchers.
• Don't apply (hidden) bibliometric filters for selection, e.g. a minimum IF for inclusion in publication lists.
• Don't apply arbitrary weights to co-authorship; algorithms based on author position can be problematic.
• Don't rank scientists according to a single indicator; rankings should not be based merely on bibliometrics.

Wolfgang Glänzel and Paul Wouters, July 2013
http://www.slideshare.net/paulwouters1/issi2013-wg-pw
Do not use journal-based metrics, such as Journal Impact Factors, as a surrogate measure of the quality of individual research articles, to assess an individual scientist's contributions, or in hiring, promotion, or funding decisions.

San Francisco Declaration on Research Assessment, May 2013
http://am.ascb.org/dora/
…and where it is worthwhile to foster a scientific elite? In reality, no one actually knows, least of all the politicians who enthusiastically launch such excellence initiatives.

This is where the idea of artificially staged competition comes in. It is assumed that these competitions will automatically make the best rise to the top, without any need to care about the content or purpose of research.

Mathias Binswanger, January 2014

Binswanger, M. (2014). Excellence by Nonsense: The Competition for Publications in Modern Science. In Opening Science (pp. 49–72). Springer-Verlag. doi:10.1007/978-3-319-00026-8_3
PLOS Article-Level Metrics sources:

• Viewed: PLOS Journals (HTML, PDF, XML), PubMed Central (HTML, PDF), Figshare (HTML, downloads, likes)
• Cited: CrossRef, Scopus, Web of Science, PubMed Central, PMC Europe, PMC Europe Database Links
• Recommended: F1000Prime
• Discussed: Twitter, Facebook, Wikipedia, Reddit, PLOS Comments, ResearchBlogging, ScienceSeeker, Nature Blogs, Wordpress.com, OpenEdition
• Saved: Mendeley, CiteULike

http://articlemetrics.github.io

The data and software are openly available, and are used by several other publishers and by CrossRef Labs.
…years)
• demonstrate reuse beyond the scholarly community
• reflect how we communicate in 2014
• can often be easily collected through open APIs

but …

• poorly predict future citations; scholarly impact is unclear
• have often not been standardized
• may come and go over time
• are sometimes easy to game

➜ limited value for impact assessment of individual researchers
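As a minimal illustration of the "easily collected through open APIs" point: the Crossref REST API, for example, returns article metadata as JSON, including a citation count in the `is-referenced-by-count` field. The sketch below parses a response of that shape; the sample payload is abbreviated and illustrative, not a real API record.

```python
import json


def extract_citation_count(payload: str) -> int:
    """Pull the citation count out of a Crossref-style JSON response."""
    record = json.loads(payload)
    return record["message"]["is-referenced-by-count"]


# Abbreviated payload in the shape returned by
# GET https://api.crossref.org/works/<DOI>
sample = '{"message": {"DOI": "10.1371/journal.pone.0000000", "is-referenced-by-count": 42}}'
print(extract_citation_count(sample))  # 42
```

In a live setting the payload would come from an HTTP GET against the open endpoint; no API key is required, which is exactly what makes such sources attractive for altmetrics collection.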
• Posters and presentations at conferences, particularly in some disciplines, e.g. medicine or computer science
• Electronic theses and dissertations (ETDs)
• Performances in film, theater, and music
• Blogs
• Lectures, online classes, and other teaching
Funded by the Sloan Foundation; the first year was spent collecting input from the community.

After talking to more than 400 experts and stakeholders, the project published a white paper this week with potential action items for further work:
http://www.niso.org/topics/tl/altmetrics_initiative/

National Information Standards Organization