Day 9: Alternative Metrics
Welcome to Day 9 of the University Libraries Research Impact Challenge!
Yesterday we looked at the h-index, a calculation of author productivity and impact. Today we’ll look beyond citation-based metrics, considering the ways that alternative metrics (or “altmetrics”) can add to the picture of what we know about the impact of scholarship.
Let’s get started!
The term altmetrics was coined in 2010 in altmetrics: a manifesto, which defines it as "the creation and study of new metrics based on the social web for analyzing, and informing scholarship" (Priem, Taraborelli, Groth, and Neylon, 2010).
It’s easy to assume that altmetrics are all about social media (people tend to think of Twitter in particular), but that is only part of what they offer. By tracking links from all kinds of websites back to scholarly research, altmetrics can reveal references to and engagement with scholarship in the news, in policy documents, in syllabi, on scholarly blogs, and beyond.
Today's challenge introduces you to two tools that can give you insights into references to your work that go beyond citations: Altmetric and PlumX. Both are proprietary tools that search the web for "mentions" of research outputs, such as journal articles or book chapters, to show how readers are engaging with scholarly publications online. Mentions can appear in social media, scholarly blogs, news outlets, Wikipedia, citation managers like Mendeley, and more.
You may have seen Altmetric "donuts" or PlumX "artifact widgets" on journal websites, or next to publications in Experts@Minnesota.
The two services collect similar altmetrics, but they rarely match exactly. Checking both services for your own publications can alert you to different kinds of public or policy engagements and impacts. When you see one of the widgets, hover over it to see a summary of impact types.
Click on either widget to see the details behind the summary.
Using Experts@Minnesota, take some time today to review the altmetrics information associated with your own or a colleague's publications. What elements might you monitor to support your own impact story?
- Altmetric from Digital Science
- PlumX from Elsevier
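If you'd like to go beyond hovering over widgets, Altmetric also offers a free, rate-limited public API that returns a JSON attention summary for a DOI. The sketch below is a minimal Python example, not an official client; the example DOI and the JSON field names are illustrative assumptions, since the response format can change.

```python
import json
import urllib.error
import urllib.request

# Base URL of Altmetric's free public "details" API (rate-limited;
# no API key is needed for basic lookups). Treat the JSON keys used
# below as illustrative assumptions.
API_BASE = "https://api.altmetric.com/v1/doi/"

def altmetric_url(doi: str) -> str:
    """Build the lookup URL for a given DOI."""
    return API_BASE + doi

def fetch_mentions(doi: str):
    """Return the JSON attention summary for a DOI, or None if
    Altmetric has no record of it (the API answers with a 404)."""
    try:
        with urllib.request.urlopen(altmetric_url(doi), timeout=10) as resp:
            return json.load(resp)
    except urllib.error.HTTPError:
        return None

# Example usage (requires network access; DOI chosen for illustration):
# summary = fetch_mentions("10.1038/nature12373")
# if summary:
#     print(summary.get("score"), summary.get("cited_by_posts_count"))
```

PlumX metrics are exposed through Elsevier's developer APIs, which require registration, so a comparable sketch is omitted here.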
- Wondering how to use altmetrics data in a practical and responsible way?
- Tips and examples for using altmetrics to enhance a tenure and promotion case, a blog post from Digital Science's Altmetric
- Altmetrics: Use Cases from University of Pittsburgh Library
- Where it all began: J. Priem, D. Taraborelli, P. Groth, C. Neylon. (2010). Altmetrics: A manifesto, 26 October 2010.
- The “hot takes”: Every year, Altmetric announces a list of the 100 research outputs published that year that received the most attention. This list and the responses to it provide an interesting perspective on what research captures people’s attention, how we can creatively analyze this information to make new inferences about the scholarly conversation, and plenty of healthy skepticism about altmetrics altogether. Check out these examples:
- Each year’s list
- An analysis of how the annual Top 100 have evolved over the last five years (spoiler alert: things are looking grim)
- Some healthy skepticism (Note: This is a rare scenario where it’s worth reading the comments—the post mostly raises questions, which are discussed thoughtfully and from different perspectives in the comments section)
Preparing for the next challenge:
Congratulations! You’ve completed Day 9 of the University Libraries Research Impact Challenge, and we’re almost to the finish line—just one more day to go! Tomorrow, we’ll aim to prepare you to take your new knowledge and skills back out into the world by introducing frameworks for the responsible and ethical application of research impact metrics.
This material has been adapted from the University of Michigan Research Impact Challenge LibGuide, created by Rebecca Welzenbach (January 15, 2019), and is licensed under the Creative Commons Attribution 4.0 International License. http://creativecommons.org/licenses/by/4.0/