Alternative metrics demonstrate the value and influence of scholars’ work apart from traditional citation counts and can enhance the impact of a CV. Altmetrics supply supplementary information and can counterbalance misleading metrics tied to particular journals. More timely than traditional metrics, altmetrics quickly reveal the impact of recent work and lend authority to scholarly products other than articles. They can capture social media references that escape traditional metrics and reflect public engagement prompted by scholarly writing. The availability of altmetrics expands publishing opportunities to include new venues and stimulates innovative strategies for evaluating research. When included in a CV, altmetrics must be accurate, clear and meaningful.
Bulletin, April/May 2013
The Power of Altmetrics on a CV
by Heather Piwowar and Jason Priem
Altmetrics, tools measuring scholarly impact in an online environment, are displayed in a wide variety of places: journal article webpages, university press office dashboards, data repositories, grant applications and many others.
In this article we focus on one particular application: including altmetrics on a scholar’s CV. Ambitious scholars have been including altmetrics on their CVs for years, for example indicating that a paper was recommended by Faculty of 1000, received a “Highly Accessed” download badge on BMC or was widely discussed in the media. As tools improve, we can anticipate these early adopters will begin to incorporate a much wider range of altmetrics on a much wider range of products.
However, if we expect these early adopters to be joined by their more cautious peers, scholars will need a clearly articulated case for value. What are the benefits that will stand the test of time and that should motivate early and ongoing action? Librarians can help in this process.
We discuss 10 benefits to scholars and scholarship when altmetrics are embedded in a CV. Altmetrics as a class of measures:
- provide additional information;
- de-emphasize inappropriate metrics;
- uncover the impact of just-published work;
- legitimize all types of scholarly products;
- recognize diverse impact flavors;
- reward effective efforts to facilitate reuse;
- encourage a focus on public engagement;
- facilitate qualitative exploration;
- empower publication choice; and
- spur innovation in research evaluation.
There are also risks to including altmetrics in a CV when it is done without care. We give several suggestions on how one should – and shouldn’t – include altmetrics in a CV. Finally, we close with a few ways that academic libraries can empower scholars to include altmetrics in their CVs, starting today.
Provide additional information
The primary benefit of including altmetrics on a CV is simple: more information. Readers of a CV can still assess the CV items just as they have always done: based on title, journal and author list, perhaps augmented by accessing the full research product for a custom qualitative assessment. With altmetrics, readers who so choose can also dig into the post-publication impact of the work.
De-emphasize inappropriate metrics
Evaluating an article based on its journal title or journal impact factor is generally regarded as poor form: journal impact factors vary across fields, an article often receives more or less attention than its journal container suggests, the authors may have selected a “low-ranking” journal for the speed of its peer review or its open access status rather than its journal rank, and so on. For further details see www.zotero.org/groups/impact_factor_problems/items.
Yet what else are readers of a CV to do? Most of us don’t have sufficient domain expertise to dig into each item and assess its merits based on a careful reading, even if we did have time. We need help, but traditional CVs don’t provide enough information to assess the work on anything but journal title.
Providing article-level citations and altmetrics in a CV gives readers more information, thereby de-emphasizing evaluation based on journal rank.
Uncover the impact of just-published work
Why not suggest that we include citation counts in CVs, and leave it at that? Why go so far as altmetrics? The reason is that altmetrics have benefits that complement the weaknesses of a citation-based solution, as we’ll cover in the next few points.
One of the most obvious benefits of altmetrics is timeliness. Citations take years to accrue. This delay is a big problem for graduate students who are applying for jobs soon after publishing their first papers and for those promotion candidates whose most profound work is published only shortly before review.
Multiple research studies have found that counts of downloads, bookmarks and tweets correlate with citations, yet accrue much more quickly, often in weeks or months rather than years. Using timely metrics allows researchers to showcase the impact of their most recent work.
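To illustrate the kind of relationship these studies measure, here is a minimal sketch, with entirely hypothetical counts, of Spearman rank correlation, a standard way to compare early download activity with later citations:

```python
def rank(values):
    # Assign ranks 1..n by ascending value (ties not handled; fine for a sketch).
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0] * len(values)
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return ranks

def spearman(x, y):
    # Spearman rank correlation via the classic 1 - 6*sum(d^2)/(n(n^2-1)) formula.
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rank(x), rank(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

downloads = [120, 300, 45, 980, 210]   # hypothetical early download counts
citations = [3, 8, 2, 15, 1]           # hypothetical citation counts years later
print(spearman(downloads, citations))  # → 0.7
```

A positive coefficient like this is what lets early, fast-accruing signals stand in for citation impact that has not yet arrived.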
Legitimize all types of scholarly products
Funders and institutions are beginning to explicitly welcome inclusion of datasets, software and other scholarly products in biosketches and CVs. This greater flexibility is great news for recognizing all worthwhile forms of research output, but how can readers of a CV know if the included dataset or software project is any good? What is the size and type of its contribution? Should they be impressed?
We often assess the quality and impact of a traditional research article based on the reputation of the journal that published it. This approach isn’t possible with alternative products. Data and software can’t be evaluated with a journal impact factor or journal ranking since repositories seldom select entries based on anticipated impact; they don’t have an impact factor; and, even if such a metric were possible, we surely don’t want to propagate the poor practice of judging the impact of an item by the impact of its container.
How, then, can alternative scholarly products be more than just space-filler on a CV, something that an evaluator counts but can’t appreciate?
Product-level metrics (like article-level metrics, but for more than just articles) provide the needed evidence to convince evaluators that a product has made a difference. Furthermore, because alternative products often make impacts in ways that aren’t fully captured by established attribution mechanisms, altmetrics will be key in communicating the full picture of how research products have influenced conversation, thought and behavior.
Recognize diverse impact flavors
The impact of a research paper has a flavor. Let’s imagine it as an ice cream flavor. The impact flavor might be champagne: a titillating discussion piece of the week. Or maybe it is a dark chocolate mainstay of the field. Strawberry: a great methods contribution. Tiger-stripe black licorice: controversial. Bubblegum: a hit in the classrooms. Low-fat vanilla: not very creamy, but it fills a need.
There probably aren’t 31 clear flavors of research impact. How many are there? Maybe five or seven or 12? We don’t know. But it would be a safe bet that, just like ice cream flavors, our scholarship and society need them all; which one we want depends on whether we have a cone or a piece of apple pie. The goal isn’t to compare flavors: one flavor isn’t objectively better than another. Each has to be appreciated on its own merits for the needs it meets.
To appreciate the impact flavor of items on a CV we need to be able to tell the flavors apart. Imagine that for ice cream all you had to go by was a sweetness metric. Not happening, right? So too, citations alone can’t fully convey what kind of difference a research paper has made in the world. Important, but not enough.
We need more dimensions to distinguish the flavor clusters from each other. This is where altmetrics come in. By analyzing patterns in what people are reading, bookmarking, sharing, discussing and citing online we can start to figure out what kind – what flavor – of impact a research output is making.
It is worth noting that flavors are important for research products other than just papers. For example, some publicly available research datasets are used all the time in education but rarely in research; others are used once or twice by really impactful projects; others across a field for calibration; and so on. Understanding and recognizing these usage scenarios will be key in recognizing and rewarding the contributions of dataset creators.
More research is needed to understand the flavor palette, how to classify impact flavor and what it means
(http://researchremix.wordpress.com/2012/01/31/31-flavours/). In the meantime, exposing raw information about downloads, shares, bookmarks and the like starts to give a peek into impact flavor beyond just citations.
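As a toy illustration of how metric profiles might separate into flavors, the sketch below labels each product by its dominant source of activity. All names, counts and flavor labels here are hypothetical; a real analysis would cluster normalized profiles across many sources rather than pick a single maximum:

```python
# Hypothetical altmetric profiles: raw counts per source for each product.
profiles = {
    "paper A": {"citations": 40, "tweets": 2,   "syllabi": 0,  "news": 0},
    "paper B": {"citations": 3,  "tweets": 310, "syllabi": 1,  "news": 12},
    "dataset C": {"citations": 5, "tweets": 4,  "syllabi": 25, "news": 0},
}

# Hypothetical flavor labels keyed by the dominant activity source.
FLAVORS = {
    "citations": "scholarly mainstay",
    "tweets": "public buzz",
    "syllabi": "classroom hit",
    "news": "media story",
}

def flavor(profile):
    # Crude heuristic: label by the source with the most activity.
    top = max(profile, key=profile.get)
    return FLAVORS[top]

for name, p in profiles.items():
    print(name, "->", flavor(p))
# paper A -> scholarly mainstay
# paper B -> public buzz
# dataset C -> classroom hit
```

Even this crude labeling shows how a heavily tweeted paper and a heavily taught dataset would look identical under citations alone but distinct once more dimensions are exposed.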
Reward efforts to facilitate reuse
Reusing research – for replication, follow-up studies and entirely new purposes – reduces waste and spurs innovation. Unfortunately, our common method of disseminating research through subscription-based, static, flat PDF articles makes research difficult to reuse.
If they choose, authors can make their research easier to reuse. For example, authors can make article text available free to read, or free with broad reuse rights (open access). They can choose to publish in places with liberal text-mining policies and investment in disseminating machine-friendly versions of articles and figures.
Authors can write detailed descriptions of their methods, materials, datasets and software. They can make their associated datasets and software openly available for reuse. Authors can go further, experimenting with executable papers, versioned papers, open peer review, semantic markup and so on.
When these additional steps are useful – when they do indeed result in additional reuse – the increased use will likely be reflected in downloads, bookmarks, discussions and possibly citations. Including altmetrics in CVs will reward investigators who have made these investments and encourage others to do so in the future.
Encourage a focus on public engagement
The research community, as well as society as a whole, benefits when research results are discussed outside the Ivory Tower: engaging the public is essential for future funding, recruitment and accountability. Today, however, researchers have little incentive to spend time engaging in public outreach or making their research accessible to the public. By highlighting evidence of public engagement like tweets, blog posts and mainstream media coverage, including altmetrics in a traditional CV will reward researchers who invest in public engagement activities.
Facilitate qualitative exploration
Including altmetrics in a CV isn’t all about the numbers! Just as we hope many people who skim our CVs will stop to read our papers and explore our software packages, so too we can hope that interested parties will click through to explore the composition and details of altmetrics engagement for themselves.
Who is discussing an article? What are they saying? Who has bookmarked a dataset? What are they using it for? As we discuss at the end of this article, including provenance information is crucial for trustworthy altmetrics; it also provides great information to move beyond the numbers and jump into qualitative exploration of impact.
Empower publication choice
Publishing in a new or innovative journal is risky: many authors are hesitant to publish their best work somewhere unusual, somewhere without a sky-high impact factor. Altmetrics will help to change this situation by highlighting work based on its post-publication impact rather than its journal title. Authors will be empowered to choose publication venues they feel are most appropriate, leveling the playing field for what might otherwise be considered risky choices.
Successful publishing innovators will also benefit. New journals won’t have to wait two years to get an impact factor before they can compete. Publishing venues that increase access and reuse will be particularly attractive. This change will spur new innovation and support the many new publishing options that have recently debuted, such as eLife, PeerJ, F1000 Research, Digital Humanities Quarterly and others.
Spur innovation in research evaluation
Finally, including altmetrics on CVs will engage researchers directly in research evaluation. Researchers are evaluated all the time, but often behind closed doors, using data and tools they don’t have access to (and frankly wouldn’t want to take the time to learn). Encouraging researchers to tell their own impact stories on their CVs, using broad sources of data, will help spur a much-needed conversation about how research evaluation is done and should be done in the future.
There are also risks to including altmetrics data on a CV, particularly if the data is presented or interpreted without due care or common sense.
Altmetrics data should be presented in a way that is accurate, auditable and meaningful. Accurate data is up-to-date, well-described and has been filtered to remove attempts at deceitful gaming. Auditable data implies completely open and transparent calculation formulas for aggregation, navigable links to original sources and access by anyone without a subscription. Meaningful data needs context and reference. Categorizing online activity into an engagement framework (http://blog.impactstory.org/2012/09/14/31524247207/) helps readers understand the metrics without becoming overwhelmed. Reference is also crucial. How many tweets is a lot? What percentage of papers are cited in Wikipedia? Representing raw counts as statistically rigorous percentiles, ideally localized to domain or type of product, makes it easy to interpret the data responsibly.
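As a small sketch of the percentile idea, assuming a hypothetical reference set of tweet counts for papers from the same field and year:

```python
def percentile(count, reference):
    # Percentile rank: share of reference products with a strictly lower count.
    below = sum(1 for r in reference if r < count)
    return round(100 * below / len(reference))

# Hypothetical reference distribution: tweet counts for comparable papers.
field_tweets = [0, 0, 0, 1, 1, 2, 3, 5, 8, 21]

print(percentile(5, field_tweets))  # → 70: 5 tweets beats 7 of 10 papers
print(percentile(0, field_tweets))  # → 0: no tweets beats no one
```

A raw count of five tweets means little on its own; “70th percentile for its field and year” is something a CV reader can interpret responsibly.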
Assuming these presentation requirements are met, how should the data be interpreted? We strongly recommend that altmetrics be considered not as a replacement for careful expert evaluation but as a supplement. Particularly in these early days, we should view altmetrics as a way to ground subjective assessment in real data – to start conversations, not end them.
Given this approach, at least three varieties of interpretation are appropriate: signaling, highlighting and discovery. A CV with altmetrics clearly signals that a scholar is abreast of innovations in scholarly communication and serious about supporting and communicating the impact of scholarship in meaningful ways. Altmetrics can also be used to highlight research products that might otherwise go unnoticed: a recent paper with a lot of tweets or a highly downloaded dataset or a track record of F1000-reviewed papers suggests work worthy of a second look. Finally, as we described in the exploration section above, auditable altmetrics data can be used by evaluators as a jumping off point for discovery about who is interested in the research, what they are doing with it and how they are using it.
How to Get Started
How can you add altmetrics to your own CV or, if you are a librarian, empower scholars to add altmetrics to theirs? Start by experimenting with altmetrics yourself: play with the tools, explore and suggest improvements. Librarians can also spread the word on their campuses and beyond through talking, writing, teaching and outreach. Last but not least, explicitly welcome evidence of impact when you solicit CVs for new positions, awards and grants.
Heather Piwowar is a postdoc at Duke University, studying the adoption and use of open research data. She is also a co-founder of ImpactStory (http://impactstory.org/), an open-source web tool that helps scholars track and report the broader impacts of their research. @researchremix
Jason Priem is a PhD student and Royster Fellow, studying information science at the University of North Carolina at Chapel Hill. He is also a co-founder of ImpactStory (http://impactstory.org/). @jasonpriem