A series of interviews with four ASIS&T members, all outstanding scholars in metric-related research, provides insights into their interests, motivations and views on research in the area. Christine L. Borgman recalls being intrigued as a student when hearing about the history of bibliometrics, and she sees expanding metrics research methods as one of the most significant recent advances in the field. Blaise Cronin speaks of reading, meeting and corresponding with leaders in the field from early in his own career. He is impressed by the rise of journals other than JASIST covering informetrics, reflecting expanding interest in the field. Katherine W. McCain was drawn to the field when she was recognized for a citation analysis and collection assessment carried out while she served as a biology librarian. Among the challenges McCain identifies are more meaningful analysis of huge datasets, research funding and gaining understanding through the different perspectives of boundary spanners. Howard White recalls how a friendly conversation led to special access to a proprietary data gathering tool, resulting in the first author co-citation map and his co-authoring a highly acclaimed article. He notes significant advances in the mathematical foundations of bibliometrics and a rise in importance of bibliometrics for assessing national research programs. All four researchers are pleased to see the rise of altmetrics and data visualization capabilities, though they share concerns about the validity and reliability of metrics-based research and trustworthiness of platforms and data sets. 

interviews
scholars
career development
informetrics
research methods
data analysis
co-citation analysis
cross-disciplinary fertilization

Bulletin, August/September 2012


Special Section

Taking the Measure of Metrics: Interviews with Four ASIS&T Members

by Cassidy R. Sugimoto

Metric-related research (for instance, bibliometric, scientometric and informetric) has played a prominent part in ASIS&T meetings and products. Research in this area is presented at the Annual Meeting, has appeared in numerous Annual Review of Information Science and Technology (ARIST) review chapters and is regularly seen in the pages of the Journal of the American Society for Information Science and Technology (JASIST) and other ASIS&T publications. Many active ASIS&T members are also highly lauded scholars in this area. In this article, we ask four such individuals to provide narratives of their introduction to the area and their assessment of the landscape for current and future research and to reflect upon the role of ASIS&T in the past, present and future of metrics-related research.

Christine L. Borgman
Presidential Chair and professor of information studies
Department of Information Studies
Graduate School of Education and Information Studies
University of California, Los Angeles

What is your earliest memory of metrics-related research at ASIS&T or in JASIS(T)? Were there any particular articles, presentations or lectures that made a distinct impression upon you? What was memorable about these? 
My memories of metrics-related research at ASIS&T/JASIST go back to my earliest student days. One memorable moment was the Research Award speech by Henry Small, who built the metrics capacity of ISI and who published extensively from those data. Henry is a bibliometrician and an historian of science extraordinaire. His acceptance speech was a rich and lively tour of how we got to where we were. 

What do you think are the most significant advances that have been made in metrics-related research in the last 25 years? Five years? Last year?
The most significant advances are twofold: (1) altmetrics, generally anything not based on references and citations per se, and (2) research methods. 

The MESUR project of Johan Bollen, Herbert van de Sompel and Marco Rodriguez moved the field along by mining a vast corpus of metrics and comparing them. The webometrics work, begun independently by Peter Ingwersen (Denmark) and Alastair Smith (New Zealand), was greatly extended by Mike Thelwall (United Kingdom) and his colleagues. Together, they applied known metrics to even messier data from web sources and then extended those metrics.

What do you think are the most pressing current challenges and opportunities in metrics-related research? How might ASIS&T be positioned to address these challenges and opportunities?
The most pressing challenges are methodological. We are flooded with data that can be used for information metrics, but identifying the origins of those data, their provenance and their context can be impossible. Research results are only as good as the evidence on which they are based and on the quality of the methods used to make inferences. Data, we've got. We're in a crisis of confidence in data.

What do you anticipate for the future of metrics-related research? What role do you envision the ASIS&T community playing in this future?
I anticipate more data, more players and not necessarily better evidence. The ability to standardize scholarly units such as authors through ORCID, funders through FundRef and similar developments may lead to cleaner data and more trustworthy results. 

ASIS&T can play a role in promoting expertise in research methods and as a platform for publication and presentation of quality results – and critiques of the questionable results that plague us.


Blaise Cronin
Rudy Professor of Information Science
Editor-in-Chief, Journal of the American Society for Information Science and Technology
School of Library and Information Science
Indiana University Bloomington

What is your earliest memory of metrics-related research at ASIS&T or in JASIS(T)? Were there any particular articles, presentations or lectures that made a distinct impression upon you? What was memorable about these?

My interest in metrics dates from the late 1970s, quite some years before I came to the United States or became involved with ASIS&T. Back then I was an avid reader of JASIS (and its precursor American Documentation) as well as ARIST, so I was not unaware of the early contributions to research in citation analysis being made by people such as Henry Small, then at ISI in Philadelphia, and the brothers Stephen and Jonathan Cole at Columbia University. Henry continues to be one of the most innovative scholars working in the field today and, along with Howard White and Kate McCain, one of the most elegant authors on the subject of metrics. 

In the early 80s, while working at the Aslib Research & Development Department in London, I had the pleasure of meeting both the late Belver Griffith, a delightful polymath, and Eugene Garfield (a future president of ASIS&T). Belver stopped by my office, and from there we repaired to a local hostelry for a most congenial lunch discussing his work on scholarly communication. Around about the same time I happened to spot Gene in the Reading Room of the British Museum, went up to him and introduced myself. I had read his book Citation Indexing as well as his Essays of an Information Scientist and was thrilled to meet the man who launched the Science Citation Index and so much more. He said he’d just been reading my little book, The Citation Process, while taking a cruise up the Thames. I was well chuffed, to use a Britishism. 

Two others who fueled my interest in metrics were the historian of science Derek de Solla Price and the sociologist Robert Merton. Sadly, I never got to meet either, though I did have a lengthy correspondence with Merton, who, just before his death, contributed a chapter to the Festschrift that Helen Atkins and I edited for Garfield’s 70th birthday. More than 30 years later I am still (re-)reading the works of all of these individuals, without whom I could not have had the career I did and without whom this field would not be what it is.

What do you think are the most significant advances that have been made in metrics-related research in the last 25 years? Five years? Last year?
Unquestionably, the most significant advances of late have to do with the widespread availability of large-scale, digital datasets (Web of Science, Scopus, Google Scholar), which can be used to model the dynamics of scholarly communication and/or to evaluate research performance and productivity. I am one of those who grew up with the printed volumes of the SCI, SSCI and A&HCI. First-generation data gathering and analysis was a slow, painful business if one couldn’t afford online access; my eyesight has never recovered. Despite all the long-standing caveats to do with the validity and reliability of biblio/scientometric indicators, the easy availability of usage metrics of all kinds makes them almost irresistible to journal publishers, science policy makers, educational administrators, research councils, foundations and, indeed, scientists and scholars themselves, not least those of a narcissistic disposition. Of course, to quote Einstein, “Not everything that counts can be counted and not everything that can be counted counts.” Reading the ballooning literature on metrics (bibliometrics, scientometrics, informetrics, influmetrics, webometrics, cybermetrics, digimetrics), it is hard not to feel that the Great Man’s adage is sometimes given regrettably short shrift. 

One of the most interesting recent developments has been the emergence of alt(ernative) metrics, and here I take my hat off to Jason Priem, a doctoral student in information science at UNC, Chapel Hill, whose youthful energy and enthusiasm have done much to propel early experimentation and debate in this area. Today, there are many new and emerging forms of scholarly communication, ranging from tweets through blogs to web-based open peer-reviewed journals. Increasingly, scholars share ideas, data and resources using a variety of social media (for example, Mendeley, Facebook, ResearchGate, CiteULike) and the tools and platforms exist to show how researchers’ opinions, data and insights diffuse throughout the wider scientific community and the kinds of downstream impact they have. Citations are now but one among many competing (or complementary) indicators of scholarly interaction and impact. It remains to be seen how many of the new entrants will achieve the longevity of the citation and how they might be factored into performance evaluation and research assessment exercises.

What do you think are the most pressing current challenges and opportunities in metrics-related research? How might ASIS&T be positioned to address these challenges and opportunities?
There are several major challenges facing metrics-based research. First, there exist persistent and warranted concerns regarding validity and reliability. What exactly do traditional citations and alternative indicators measure: quality, impact, influence, visibility, value, reputation…? Second, how transparent and trustworthy are the platforms and datasets upon which distributions, visualizations and comparative evaluations are based? Third, are all citations/acknowledgments/invocations/recommendations equal, or is differential weighting required? Further, how do we establish equivalence between, say, a citation from a Nobel laureate, six glowing mentions on Twitter and a high rating on Faculty of 1000? The idea of a symbolic capital currency convertor may not be all that far-fetched. Fourth, there are justified concerns that in a metrics-conscious age some scholars and researchers will find ways to game the system. To be sure, performativity is not a novel issue – it has been raised repeatedly by anti-citationists – but it is likely to take center stage as the number of indicators increases and evaluators’ reliance on these indicators, singly or conjointly, grows. So, for me, the big question is this: Will the availability of digital metrics and the concomitant ability to track and evaluate individuals and groups lead to the development of a cybernetically controlled system antipathetic to the traditional value system of the academy? 

Information science may be the ancestral home of metrics-related research, but we don’t own the subject nor can we regulate the ways in and reasons for which bibliometric techniques are used. I seriously doubt that any academic or professional body would attempt to require that researchers be certified in the theory and application of bibliometrics. Pandora’s box has been opened. What ASIS&T can do, however, is provide an agora for informed debate on the theoretical underpinnings and sociological significance of metrics-based evaluation of academic performance and through its conferences and publications ensure that the strengths and limitations of different types of metrics are properly understood. I would also encourage the Society to cooperate with other groups and organizations worldwide with an interest in developing, testing, validating and refining metrics of one kind or another.

What do you anticipate for the future of metrics-related research? What role do you envision the ASIS&T community playing in this future?
JASIST continues to be a major publisher of research, both theoretical and applied, relating to metrics, but the Journal certainly does not have a monopoly on the subject. Scientometrics continues to enlarge its already Yeti-like footprint, and it has been joined by the fast-growing Journal of Informetrics. There exists a vast, scattered literature on bibliometrics and allied areas, and it continues to expand at a quite remarkable rate; the field has moved from being a cottage industry to a robust academic specialty in its own right with, I’d wager, thousands of adherents, dedicated and transient, worldwide. 

I have some familiarity with the multi-faceted literature on metrics and I derive considerable satisfaction from the fact that individuals such as Howard White and Henry Small, two eminences of information science, continue to undertake research and publish on the subject with an authority and depth of understanding that few newcomers can match. That said, it is often the contributions of outsiders – think of Jorge Hirsch, a physicist, and his h-index – that revitalize the field and help push it in new directions. The future will be neither dull nor predictable. Let us hope that it does not degenerate into numerology.


Katherine W. McCain
Professor
College of Information Science and Technology
Drexel University

What is your earliest memory of metrics-related research at ASIS&T or in JASIS(T)? Were there any particular articles, presentations or lectures that made a distinct impression upon you? What was memorable about these? 

Actually, my earliest memory of metrics-related research goes back to 1978, when Henry Small came to the weekly seminar of the biology department at Temple University in Philadelphia (where I ran the biology library) and talked about his collagen research. I thought it was interesting but my heart was still in marine biology. My next memory of metrics-related research was actually my own. Jim Bobick, the science librarian at Temple, heard a paper about using citation analysis to do collection assessment. So he set all of the departmental library managers to do citation analyses for their departments. My study got published in JASIS (McCain & Bobick, 1980) and also got me into the Ph.D. program, thanks to Belver Griffith and Carl Drott. I remember sitting in Belver’s office and talking about the citation distribution, going home and asking my husband (the economist) how one really demonstrates the long tail of a highly skewed distribution. I went back to Belver’s office a couple of days later with a Lorenz curve plot and a calculated Gini coefficient. Then I saw the maps that Howard White and Belver Griffith had started to do – author co-citation analysis – and decided that’s what I wanted to do – and I have, for the subsequent 30+ years.
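
For readers unfamiliar with the measures McCain mentions, the sketch below shows, in Python and on made-up citation counts, how a Lorenz curve and a Gini coefficient can summarize the long tail of a highly skewed citation distribution. The function names and data are illustrative assumptions only, not a reconstruction of the McCain & Bobick analysis.

# A minimal sketch (not the original study) of summarizing a skewed citation
# distribution with a Lorenz curve and a Gini coefficient.

def lorenz_points(counts):
    """Return (cumulative share of titles, cumulative share of citations) pairs."""
    xs = sorted(counts)                      # ascending: least-cited titles first
    total = sum(xs)
    n = len(xs)
    cum = 0
    points = [(0.0, 0.0)]
    for i, x in enumerate(xs, start=1):
        cum += x
        points.append((i / n, cum / total))  # points bow away from the diagonal when skewed
    return points

def gini(counts):
    """Discrete Gini coefficient: 0 = perfectly even, values near 1 = highly skewed."""
    xs = sorted(counts)
    n = len(xs)
    total = sum(xs)
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

citations = [120, 45, 30, 12, 9, 5, 3, 2, 1, 1, 0, 0]  # hypothetical citation counts per title
print(round(gini(citations), 3))   # well above 0.5 here, signaling a long-tailed distribution
for point in lorenz_points(citations):
    print(point)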

What do you think are the most significant advances that have been made in metrics-related research in the last 25 years? Five years? Last year?
Twenty-five years? The visualization and mapping of citation and other networked data. Five years? The increased deployment of social network analysis tools and related freeware, such as Pajek and HistCite, in metrics-related research. (OK – I’m shading it a bit – that’s 10 years.) Last year? The emergence of alt-metrics as an active and vibrant research area that focuses on the new data sources coming from social media.

What do you think are the most pressing current challenges and opportunities in metrics-related research? How might ASIS&T be positioned to address these challenges and opportunities? 
Challenges – finding ways to mine really large metrics datasets and do more with them than just report network stats and related quantitative results. The real challenge, IMHO, has always been to connect the metrics data with the folks who produce and use the information resources, make the references/citations, etc. It’s partly an issue of scale (the really big vs. the in-depth case study) but also an issue of access to the scholars who are writing and citing. But then I’m a natural historian who likes to look under rocks and haul up old rubber tires at marinas to see what’s there. Another challenge is, at least for academics, to find ways to get funded to support work. How might ASIS&T be positioned? We’ve got boundary spanners – continue to find ways to bring in some of the communities that share interests but have different perspectives and toolkits. Howard White is a good example of this, if you look at his writings on the psychological approach to relevance, the intersection of discourse analysis and citation analysis, the intersection of social network analysis and ASIS&T-style metrics.

What do you anticipate for the future of metrics-related research? What role do you envision the ASIS&T community playing in this future?
I’m not much of a prognosticator – I’ll leave this to younger folks. 


Howard White
Professor emeritus
College of Information Science and Technology
Drexel University


What is your earliest memory of metrics-related research at ASIS&T or in JASIS(&T)? Were there any particular articles, presentations or lectures that made a distinct impression upon you? What was memorable about these?

An early influence was the session on the mapping of science convened by Derek Price at the ASIS&T conference in Minneapolis in 1979. He presented, as did B. C. Brookes and Daniel Sullivan. (Henry Small was also scheduled to present, but yielded his time to allow Brookes to speak longer.) Gerard Salton attended, and as I recall, said something derogatory about the whole science-mapping project – perhaps just to be provocative. The most significant event in Minneapolis for me, however, was that, in a private conversation, I explained my budding research on author co-citation to Charles Bourne, whom I knew from UC Berkeley and who was then a researcher at Dialog; he told me about an in-house Dialog command called Intersect that would let me obtain co-citation counts for seven or eight author pairs at a time, rather than one pair at a time. That greatly sped up the data-gathering for the first author co-citation map – the one of information scientists that initially appeared in the 1980 book Key Papers in Information Science, edited by Belver Griffith, and then in the 1981 White-Griffith paper in JASIS that launched author co-citation as a research method. Belver, my colleague on the Drexel faculty, was a close friend of Derek Price and sent him a pre-publication draft of the White-Griffith paper. To my delight, Derek said that it was the best paper he had seen in a long time. At Drexel we got several more published articles out of the Intersect command before Dialog shut it down – on the grounds that outsiders’ use of it was degrading search performance for the Dialog system!
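
For readers new to the method, the following Python sketch illustrates the counting that underlies author co-citation analysis: two authors are co-cited whenever a single citing paper references both. The records here are hypothetical stand-ins; the actual White-Griffith data came from proprietary citation indexes accessed through Dialog.

# A minimal sketch of author co-citation counting on invented records.
from itertools import combinations
from collections import Counter

def cocitation_counts(citing_records):
    """citing_records: iterable of sets of cited-author names, one set per citing paper."""
    pairs = Counter()
    for cited_authors in citing_records:
        for a, b in combinations(sorted(cited_authors), 2):  # sorted() gives a canonical pair key
            pairs[(a, b)] += 1
    return pairs

records = [
    {"Small", "Griffith", "White"},
    {"Small", "White"},
    {"Garfield", "Small", "Price"},
]
for (a, b), n in cocitation_counts(records).most_common():
    print(f"{a} -- {b}: co-cited in {n} papers")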

What do you think are the most significant advances that have been made in metrics-related research in the last 25 years? Five years? Last year?
Last 25 years: the steady improvement in our ability to visualize bibliometric data. Also the strides made by bibliometricians such as Leo Egghe, Ronald Rousseau and Wolfgang Glänzel in providing mathematical foundations for the field. Last five years: the explosion of interest in the h-index and related measures for assessing personal and group achievements through publication and citation counts. Also, the gradual political gains bibliometrics has made as a complement to peer review in national assessments of research programs. Last year: the heightened visibility of bibliometric research through the appearance of Katy Börner’s Atlas of Science: Visualizing What We Know. We have our own coffee-table book!
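
For context, the h-index White mentions is the largest h such that an author (or group) has h publications each cited at least h times. The short Python sketch below computes it on hypothetical citation counts.

# A minimal sketch of the h-index calculation on made-up data.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)  # most-cited papers first
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank       # the paper at this rank still has at least `rank` citations
        else:
            break
    return h

print(h_index([25, 17, 9, 6, 4, 2, 1]))  # -> 4: four papers with at least 4 citations each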

What do you think are the most pressing current challenges and opportunities in metrics-related research? How might ASIS&T be positioned to address these challenges and opportunities?
The ready availability of large-scale bibliometric visualization packages such as Pajek and VOS affords ample opportunity for new metrics-related research. At the same time, I believe we need to look more critically, in the style of Edward Tufte, at what bibliometric maps are good at conveying. There has been a great proliferation of such maps, and some of the most striking and beautiful also seem to me quite bad at revealing anything useful about the scientific fields or activities they purport to clarify. Terrible though it may be to contemplate, a few passages of old-fashioned text might actually do a better job. So I would like to see ASIS&T sponsor some sort of forum in which the aesthetics and utilities of bibliometric mapping could be candidly discussed.

What do you anticipate for the future of metrics-related research? What role do you envision the ASIS&T community playing in this future?
A major gain would be really rapid production of genuinely useful bibliometric intelligence to individuals on demand. This would involve bigger databases (e.g., integrations of citation data from books and proceedings as well as journals), cleaner data (e.g., proper disambiguation of author homonyms) and software well-tested for value by users. ASIS&T will surely offer opportunities to discuss and publish research on this broad topic as one of the most central to information science. 


Cassidy R. Sugimoto is an assistant professor in the School of Information and Library Science, Indiana University Bloomington. She can be reached at sugimoto<at>indiana.edu.