For institutional repositories, alternative metrics reflecting online activity offer valuable indicators of interest in their holdings that can supplement traditional usage statistics. A variable mix of built-in metrics is available through the popular repository platforms Digital Commons, DSpace and EPrints. These may include download counts at the collection and/or item level, search terms, total and unique visitors, page views, and social media and bookmarking metrics; additional data may be available through special plug-ins. These data provide different types of information valuable to repository managers, university administrators and authors. They can reflect both scholarly and popular impact, show readership, represent an institution’s output, support tenure and promotion cases and indicate directions for collection management. Practical considerations for implementing altmetrics include service costs, technical support, platform integration and user interest. Altmetrics should not be used to rank or compare authors, and altmetrics sources should be regularly reevaluated for relevance.

altmetrics
digital repositories
impact of scholarly output
statistics
collection management
social web

Bulletin, April/May 2013


New Opportunities for Repositories in the Age of Altmetrics

by Stacy Konkiel and Dave Scherer

University administrators are increasingly seeking quantitative ways to measure the impact of the scholarly output of their faculty, students and researchers. By reporting altmetrics (alternative metrics based on online activity) for their content, institutional repositories can add value to existing metrics – and prove their relevance and importance in an age of growing cutbacks to library services. This article discusses the metrics that repositories currently deliver and how altmetrics can supplement existing usage statistics to provide a broader interpretation of research-output impact for the benefit of authors, library-based publishers and repository managers, and university administrators alike.

Metrics Repositories Currently Deliver
Many repository platforms measure usage statistics such as download counts and page views. Less often, repositories report citation counts and altmetrics culled from the social web for their holdings. Here, we will look at usage statistics that are commonly reported on the three most popular repository platforms in use today: Digital Commons, DSpace and EPrints. 

Digital Commons. Digital Commons is a proprietary institutional repository and journal-publishing platform run by Bepress. Relying on proprietary, COUNTER-compliant download counts [1] and Google Analytics as sources for access metrics, the platform records download counts, search terms and referral links for all content held in each repository. These metrics are communicated to repository managers, series administrators and authors via email. The platform reports the number of publications available to date in each repository and downloads over the lifetime of the repository. Authors also receive statistics on their deposits through a private Author Dashboard interface.
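
For readers unfamiliar with what “COUNTER-compliant” implies in practice, the sketch below illustrates the general idea of double-click filtering applied to raw download events. It is a minimal illustration only: the 30-second window, log format and helper name are our assumptions, not Bepress’s actual implementation, and the COUNTER Code of Practice [1] remains the authoritative specification.

```python
# Minimal sketch of COUNTER-style double-click filtering (illustrative only;
# the 30-second window and event format are assumptions, not Bepress's code).
# Repeat downloads of the same item by the same user within the window are
# collapsed into a single count.
from datetime import datetime, timedelta

WINDOW = timedelta(seconds=30)

def filtered_download_count(events):
    """events: iterable of (timestamp, user_id, item_id) download events."""
    last_click = {}  # (user_id, item_id) -> timestamp of the most recent click
    count = 0
    for timestamp, user_id, item_id in sorted(events):
        key = (user_id, item_id)
        previous = last_click.get(key)
        if previous is None or timestamp - previous > WINDOW:
            count += 1  # a genuinely new download
        last_click[key] = timestamp
    return count

events = [
    (datetime(2013, 2, 27, 9, 0, 0), "198.51.100.7", "etd/1234"),
    (datetime(2013, 2, 27, 9, 0, 5), "198.51.100.7", "etd/1234"),  # double-click, ignored
    (datetime(2013, 2, 27, 9, 5, 0), "198.51.100.7", "etd/1234"),  # counted again
]
print(filtered_download_count(events))  # -> 2
```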

The platform also operates a federated search and discovery mechanism, the Digital Commons Discipline Browser, that provides repository managers, authors and users with usage metrics within the network of three-tiered, taxonomy-based disciplines contained in Digital Commons repositories. Google Analytics results provide metrics on access, including total visitors, unique visitors, page views and bounce rate. Currently, there is no option to display these metrics to repository visitors or journal readers.

Bepress recently partnered with Altmetric.com (www.altmetric.com) to display an Altmetric “badge” for items in selected series and collections within Digital Commons and Bepress journals. Altmetric.com can display article-level metrics related to social bookmarking (Mendeley, CiteULike, Connotea) and social media (Facebook, Twitter, Google+, Reddit and LinkedIn).
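
For a sense of the data behind such a badge, the sketch below queries Altmetric.com’s public REST API for a single DOI. The endpoint and JSON field names reflect our reading of the public documentation at the time of writing and may change; heavier use may require a (free) API key, and the third-party requests library is assumed to be installed.

```python
# Sketch: fetch article-level metrics for one DOI from Altmetric.com's public
# REST API. Endpoint and field names are assumptions based on the public
# documentation and may change; missing fields default to zero.
import requests

def altmetric_counts(doi):
    response = requests.get("https://api.altmetric.com/v1/doi/" + doi)
    if response.status_code == 404:
        return None  # Altmetric has no data for this DOI
    response.raise_for_status()
    data = response.json()
    return {
        "altmetric_score": data.get("score"),
        "tweets": data.get("cited_by_tweeters_count", 0),
        "facebook_posts": data.get("cited_by_fbwalls_count", 0),
        "mendeley_readers": (data.get("readers") or {}).get("mendeley", 0),
    }

print(altmetric_counts("10.1038/nature10836"))  # substitute any DOI of interest
```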

DSpace. DSpace is an open-source, configurable platform that delivers only download counts to researchers as part of its base install (Versions 1.6+). Metrics can either be displayed openly (where enabled by repository managers) or to administrators only. Repository-, community-, collection- and item-level download counts are displayed via an HTML table. Citation metrics are available if a DSpace plug-in is installed (Versions 1.6+), provided the institution has subscription access to the SciVerse Scopus API (www.developers.elsevier.com/devcms/cited-by-count-api).
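
The DSpace plug-in handles this lookup itself, but the underlying call is simple enough to sketch. The example below assumes a Scopus Search API query by DOI with an institutional API key; the exact endpoint, headers and response structure should be verified against Elsevier’s developer documentation, and subscription access is required.

```python
# Sketch: look up a Scopus cited-by count for a DOI. Endpoint, header and
# response fields are assumptions based on Elsevier's developer documentation;
# an institutional API key and subscription access are required.
import requests

SCOPUS_SEARCH = "https://api.elsevier.com/content/search/scopus"

def scopus_cited_by(doi, api_key):
    response = requests.get(
        SCOPUS_SEARCH,
        params={"query": 'DOI("%s")' % doi, "field": "citedby-count"},
        headers={"X-ELS-APIKey": api_key, "Accept": "application/json"},
    )
    response.raise_for_status()
    entries = response.json()["search-results"].get("entry", [])
    if not entries or "citedby-count" not in entries[0]:
        return None  # DOI not indexed in Scopus
    return int(entries[0]["citedby-count"])
```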

Repositories running DSpace with the help of the BMC-backed Open Repository service can display altmetrics at the item level. These repositories openly report metrics related to social bookmarking (Del.icio.us, CiteULike, Connotea) and social media (Facebook, StumbleUpon, Digg and LinkedIn) in addition to download counts.

EPrints. Similarly, the open-source repository platform EPrints tracks only downloads as an out-of-the-box feature. However, it offers fairly robust reporting tools: line, bar and pie graphs; HTML tables; and CSV exports. Download counts can be measured at the repository, collection and item levels. Statistics can be hidden from public view, accessible only to repository administrators. Citation metrics are available as a repository plug-in (Versions 3.2+) (http://files.eprints.org/641/); like the DSpace plug-in, this requires subscription access to the SciVerse Scopus API. The few EPrints repositories to offer altmetrics have implemented them through “homegrown” means.

How Metrics Can Be Used
Usage and citation statistics can reveal many things to both authors and repository administrators, including the demographics of those accessing their scholarly outputs and what types of content are most popular. Authors can use these numbers to gain basic insight into the reach of their scholarship and can supplement their tenure and promotion dossiers with numbers more granular (and some say more meaningful) than journal impact factors. Repository administrators can use usage statistics to help promote similar content within their institutional repositories (IRs), supplement their collection development policies and provide evidence to university administration of the impact of their university’s intellectual output [2]. Using altmetrics, some repositories have successfully showcased the social importance of repository content to the general public in non-academic settings [3].

What usage statistics do not always reveal is the nature of use or the context for how scholarship is consumed [2]. Altmetrics can help fill some of the knowledge gaps that usage statistics alone cannot address. Below, we offer possible use cases for altmetrics as a supplementary type of measure for three user groups: authors, repository managers and university administrators.

Value of Altmetrics to Authors
1. Altmetrics can help authors better understand the readership of their open access (OA) content. Many altmetrics tracking services, including Altmetric.com and ImpactStory, not only document basic usage statistics, but also capture information about readers and how they use content. For example, ImpactStory incorporates metrics from Topsy (a Twitter archiving platform) and links to the individual tweets that mention specific articles, showcasing not only who is reading and sharing scholarship, but also what they are saying about it. Altmetric.com’s content dashboard also offers sophisticated demographic reports on readers. Giving authors insight into their readership can help them better understand how their OA content stored in IRs is making an impact.

2. As supplements to the journal impact factor, altmetrics can help authors document the impact of their research when compiling tenure and promotion dossiers. The journal impact factor (JIF) is the de facto standard in many academic fields for determining the quality of articles. Many researchers include the JIFs for journals in which they have published on their vitae when going up for tenure or promotion, as a means of documenting the impact of their work. By also including supplemental measures of impact (usage counts and altmetrics) for traditional publications, as well as grey literature and other outputs deposited in IRs, faculty can more fully document the impact of their scholarship.

Value of Altmetrics to Repositories
3. Repository managers can use altmetrics to persuade potential depositors that there is value in making their content openly accessible. As Harnad contends, “The prospect of increasing their usage and citation metrics (and their attendant rewards) is an incentive to researchers to provide Open Access to their findings.” [4, p. 6] The possibility of increasing altmetrics counts would arguably have a similar effect on deposit rates. 

4. Gathering numbers beyond general usage statistics can better communicate to repository funders – most often, university administrators – the value of the repository as a platform for hosting OA content. While general usage statistics might not tell a very informative story about the impact of a particular repository deposit, seeing how content is used and shared (on which websites, by which demographics and for what purposes) can tell a richer one. Similarly, by tracking non-academic use of content, repository managers can build a case for community engagement.

5. Altmetrics can supplement existing usage statistics to help plan collection development, resource allocation and marketing/outreach. Altmetrics such as F1000 scores and bookmark counts from scholarly social bookmarking sites, in particular, can provide insight into what scholarship is making an impact within specific user groups. By tracking which collections and subjects are popular within a repository, IR administrators can better plan outreach activities. Such altmetrics can also be used to strengthen departmental engagement, which in turn could help build collections.

Value of Altmetrics to University Administrators
6. Administrators can use altmetrics as supplementary indicators of impact when showcasing university scholarship to both internal and external stakeholders. In particular, tracking altmetrics alongside traditional metrics can shed light on impact for university trustees and state legislatures when requesting budget increases, recruiting faculty, etc. [5].

7. Altmetrics can be used by faculty review committees (such as awards boards or promotion and tenure review systems) to better understand how a particular researcher’s work has been received by scholarly and lay communities [5]. Altmetrics as supplemental measures of impact not only help authors and IR managers better understand the reach of scholarship, but can also help faculty review committees do so. 

Repositories will likely decide whether to implement altmetrics based on a number of factors, including possible service costs, technical support needs, platform integration restrictions (open source or proprietary) and user interest. In addition to the most popular out-of-the-box altmetrics services (Altmetric, ImpactStory and Plum Analytics), there are many ad hoc possibilities for mining and displaying altmetrics for repository content by way of web service APIs and open-source tools like PLOS’s Article-Level Metrics package.
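
As one hypothetical example of such ad hoc mining, the sketch below harvests Dublin Core records from a repository’s OAI-PMH interface, extracts DOIs and looks each one up against an altmetrics web service (a minimal Altmetric.com lookup, as in the earlier sketch). The base URL is invented, resumption-token paging is omitted for brevity, and real repositories vary in whether and where they expose DOIs.

```python
# Sketch: ad hoc altmetrics mining for repository content. Harvests Dublin
# Core records over OAI-PMH, extracts DOIs and queries an altmetrics service
# for each one. The base URL is hypothetical and resumptionToken paging is
# omitted for brevity.
import re
import xml.etree.ElementTree as ET

import requests

OAI_BASE = "https://ir.example.edu/oai/request"  # hypothetical endpoint
DC_IDENTIFIER = "{http://purl.org/dc/elements/1.1/}identifier"
DOI_PATTERN = re.compile(r"10\.\d{4,9}/\S+")

def harvest_dois():
    """Collect DOIs found in the repository's Dublin Core identifiers."""
    response = requests.get(
        OAI_BASE, params={"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)
    dois = []
    for identifier in root.iter(DC_IDENTIFIER):
        match = DOI_PATTERN.search(identifier.text or "")
        if match:
            dois.append(match.group())
    return dois

def altmetric_score(doi):
    """Minimal Altmetric.com lookup (see the earlier sketch for a fuller one)."""
    response = requests.get("https://api.altmetric.com/v1/doi/" + doi)
    return response.json().get("score") if response.ok else None

report = {doi: altmetric_score(doi) for doi in harvest_dois()}
```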

Library adoption will likely be customized to meet the demands not only of authors and repository managers, but also of university administrators. There are two “flavors” of impact (to borrow a term) – scholarly and popular – that repository managers should keep in mind when considering implementing altmetrics. Metrics that fall within those two categories (which are by no means mutually exclusive) should be judged in tandem with the authority and relevance of the web services that provide them and the possible value those metrics would provide to stakeholders.

There are a number of traditional and alternative metrics that track scholarly impact, including citations (sourced from Scopus and PubMed Central), bookmarks (Mendeley, CiteULike), Faculty of 1000 reviews and blog mentions on research blog networks. These metrics are sourced from websites and services that track usage of scholarship at various points in the research life cycle, from reading to writing to post-publication peer review.

Studies have shown that for OA content, traditional measures of scholarly impact (citations) are often closely related to altmetrics measures (social reference management bookmarks) and that, for a variety of reasons, some scholarly altmetrics can be a better indicator of impact than traditional usage statistics [6], [7], [8]. As supplementary metrics, scholarly altmetrics can help demonstrate the value of OA content, including content held by repositories.

Popular impact metrics generally rely upon measuring the social web, including Wikipedia citations, Bit.ly clicks and shares, Facebook likes and shares, Del.icio.us bookmarks, Reddit mentions, Twitter mentions and influential tweets, general-interest blog mentions and news media mentions. They can be useful in determining the reach of scholarship within a lay audience, though it is worth noting that many researchers use social media for scholarly pursuits, so at least a portion of these “popular” metrics actually reflects scholarly use. Occasionally, popular impact metrics can predict later citations [9].

Page views and download counts fall within a gray area of possible impact, as usage statistics generally reveal little about end users and what they do with the content they download. Studies have shown that page views and download counts for OA content are correlated with both scholarly citations and Facebook shares [10], [8]. These metrics can provide general insights and should be considered carefully alongside other metrics when reporting the possible impact of research.
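
Repository managers who want to examine such relationships for their own holdings can do so with very little code. The sketch below computes a Spearman rank correlation between download counts and citation counts for a handful of items; the item identifiers and figures are invented for illustration, and SciPy is assumed to be installed.

```python
# Sketch: rank correlation between download counts and citation counts for a
# few repository items. Figures are invented; Spearman correlation is used
# because both measures tend to be heavily skewed.
from scipy.stats import spearmanr

items = {
    "etd/1001": {"downloads": 1520, "citations": 12},
    "etd/1002": {"downloads": 430, "citations": 3},
    "etd/1003": {"downloads": 2210, "citations": 18},
    "etd/1004": {"downloads": 95, "citations": 0},
    "etd/1005": {"downloads": 780, "citations": 6},
}

downloads = [record["downloads"] for record in items.values()]
citations = [record["citations"] for record in items.values()]

rho, p_value = spearmanr(downloads, citations)
print("Spearman rho = %.2f (p = %.3f)" % (rho, p_value))
```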

Altmetrics excel over current impact measures such as citations and usage metrics in the area of sentiment analysis. Though the approach is in its infancy, researchers have shown that combining text mining with altmetrics can begin to reveal how users regard the content they are sharing, liking and bookmarking [6], [11].
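
The sketch below gives the flavor of that approach: a naive, lexicon-based polarity score applied to tweets that mention an article. The word lists and sample tweets are purely illustrative, and published studies use considerably more sophisticated text-mining methods.

```python
# Sketch: naive lexicon-based sentiment scoring of tweets mentioning an
# article, illustrating how text mining can add context to altmetrics counts.
# Word lists and sample tweets are purely illustrative.
POSITIVE = {"great", "important", "useful", "love", "recommended", "interesting"}
NEGATIVE = {"flawed", "wrong", "misleading", "overhyped", "weak"}

def polarity(text):
    """Count positive minus negative words in one tweet."""
    words = {word.strip(".,!?#@").lower() for word in text.split()}
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [
    "Really interesting methods section in this new OA article",
    "Conclusions seem overhyped given the sample size",
    "Useful dataset - recommended for anyone working on repositories",
]

scores = [polarity(tweet) for tweet in tweets]
print("mean polarity: %.2f" % (sum(scores) / float(len(scores))))
```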

Existing barriers to participation include cost, IR technical support resources, the inability to incorporate tools into proprietary platforms, limited DOI (digital object identifier) implementation in most repositories, author disambiguation issues and the political implications of displaying nonexistent metrics for relatively unpopular IR materials. Caution is warranted against using altmetrics (or any other metrics) to rank or compare researchers and against conflating types of impact with one another – scholarly metrics usually cannot replace popular metrics and vice versa. As certain web services lose their relevance (Digg, for instance), their inclusion in altmetrics reports should be reconsidered.

Resources Mentioned in the Article
[1] Counting Online Usage of Networked Electronic Resources (COUNTER). (April 2012). The COUNTER Code of Practice for e-Resources: Release 4. Retrieved February 27, 2013 from www.projectcounter.org/r4/COPR4.pdf 

[2] Organ, M. K. (2006). Download statistics - What do they tell us? The example of Research Online, the open access institutional repository at the University of Wollongong, Australia. D-Lib Magazine, 12(11). Retrieved February 27, 2013, from www.dlib.org/dlib/november06/organ/11organ.html 

[3] Digital Commons. (December 11, 2012). Purdue Libraries creates buzz at the Indiana State Fair [blog post]. DC Telegraph. Retrieved February 27, 2013, from http://blog.digitalcommons.bepress.com/2012/12/11/purdue-libraries-create-buzz-at-the-indiana-state-fair/

[4] Harnad, S. (May 30, 2008). Validating research performance metrics against peer rankings. Ethics in Science and Environmental Politics, 8, 103-107. Retrieved February 27, 2013, from http://eprints.ecs.soton.ac.uk/15619/ 

[5] Carr, L., Weal, M., & White, W. (April 16, 2010). Research assessment and a diverse role for repositories [slides]. Talk presented at the 5th International Conference on Open Repositories, Madrid, Spain, July 6-9, 2010 (OR2010). Retrieved February 27, 2013, from http://eprints.soton.ac.uk/270842/7/or2010Diverse.pdf 

[6] Haustein, S., & Siebenlist, T. (2011). Applying social bookmarking data to evaluate journal usage. Journal of Informetrics, 5(3), 446–457. 

[7] Kraker, P., Körner, C., Jack, K., & Granitzer, M. (2012). Harnessing user library statistics for research evaluation and knowledge domain visualization. Proceedings of the 21st International Conference Companion on World Wide Web - WWW ’12 Companion (pp. 1017–1024). New York: ACM Press.

[8] Priem, J., Piwowar, H. A., & Hemminger, B. M. (March 2012). Altmetrics in the wild: Using social media to explore scholarly impact. arXiv.org. Retrieved February 27, 2013, from http://arxiv.org/html/1203.4745v1

[9] Eysenbach, G. (2011). Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. Journal of Medical Internet Research, 13(4). Retrieved February 27, 2013, from www.jmir.org/2011/4/e123 

[10] Brody, T., Harnad, S., & Carr, L. (2006). Earlier web usage statistics as predictors of later citation impact. Journal of the American Society for Information Science and Technology, 57(8), 1060–1072.

[11] Parra, C., Birukou, A., Casati, F., Saint-Paul, R., Wakeling, J. R., & Chlamtac, I. (June 15, 2011). UCount: A community-driven approach for measuring scientific reputation. [Version 0]. Abstract accepted for presentation at altmetrics11, Koblenz, Germany, June 14-15, 2011. Retrieved from http://altmetrics.org/workshop2011/parra-v0/ 


Stacy Konkiel is an eScience librarian at Indiana University. She can be reached at skonkiel <at> indiana.edu.

Dave Scherer is a scholarly repository specialist at the Purdue ePubs Repository. He can be reached at dscherer <at> purdue.edu.