Journal of the American Society for Information Science



JASIS Will Be Fourteen in 1998
Donald H. Kraft




Bert R. Boyce




Experiments with Automatic Indexing and a Relational Thesaurus in a Chinese Information Retrieval System
Tian-Long Wan, Martha Evens, Yeun-Wen Wan, and Yuen-Yuan Pao

In our first article, Wan and colleagues test CIRS (Chinese Information Retrieval System), which compares binary term vectors by intersection; the number of ones in the resulting vector is taken as a measure of similarity for ranking. A thesaurus, generated by inspecting the terms in the documents, provided assistance in query formulation. A term consisted of one or more Chinese characters, selected either by a human indexer or automatically from the texts. An evaluator familiar with the 30 queries read 555 document abstracts and assigned those judged relevant to the appropriate query. Use of the thesaurus improved both recall and precision, and at high recall levels automatic indexing outperformed manual indexing.
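The intersection ranking described above can be sketched as follows. This is an illustrative reconstruction, not code from the paper, and the function names are my own:

```python
def intersection_similarity(query: list[int], document: list[int]) -> int:
    """Count positions where both binary term vectors contain a 1
    (the number of ones in their logical AND)."""
    return sum(q & d for q, d in zip(query, document))

def rank_documents(query: list[int],
                   documents: list[list[int]]) -> list[tuple[int, int]]:
    """Return (similarity, document index) pairs, best match first."""
    scored = [(intersection_similarity(query, doc), i)
              for i, doc in enumerate(documents)]
    return sorted(scored, reverse=True)
```

For example, a query vector [1, 0, 1, 1] matched against documents [0, 1, 0, 0] and [1, 0, 1, 1] yields similarities 0 and 3, so the second document ranks first.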




Discovering Information Behavior in Sense Making
I. Time and Timing 1097
II. The Social 1109
III. The Person 1127
Paul Solomon

In a series of three papers, Solomon describes an ethnographic study of human information behavior (characterized as "sense making") in the context of a work-planning process carried out by a public agency on an annual cycle. Data were collected through direct observation (both recorded and informal), participant logs, interviews, and documentary traces. Over time the process consisted of broad patterns of repetitive action, with time-sensitive data collection occurring prior to use.

The agency adapts through a focus on cooperation with members of the legislature and demonstration of past success. The participants do not consider information gathering or processing activities as separate from their work, but rather part and parcel of their regular activities. Very important cues come from personal interactions within and outside the agency.

Individuals have varied sense-making styles, but the process terminates when either a deadline for action or a point of satisfaction is reached, unless some other priority diverts attention. Information is sometimes withheld or left unpursued out of perceived self-interest or to avoid violating a confidence. Diversity of styles delays resolution of the process. A common practice was to use information seeking to justify decisions rather than to support a decision-making process.




Europe and Information Science
Peter Ingwersen

Ingwersen, in a European research letter, indicates that only 34 of 278 articles in Scientometrics from 1994 to 1996 were produced in the United States. India contributed 17 articles and the rest were from Europe. A healthy system of networks and workshops seems to be leading to an increased interest throughout Europe in this area of research.




Preliminary Findings on Searcher Performance and Perceptions of Performance in a Hypertext Bibliographic Retrieval System
Dietmar Wolfram and Alexandra Dimitroff

In the first of two brief communications, Wolfram and Dimitroff divide eighty-three subjects into expert and beginning searchers, randomly assign them to a basic or an enhanced version of a prototype hypertext bibliographic retrieval system, and give them two questions. A success measure was used, defined as recall (relevant pages visited / relevant pages) times "browsing precision" (relevant pages visited / total pages visited), divided by search time in hours.
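As a hedged sketch of that success measure (the variable names are mine, not the authors'):

```python
def success_measure(relevant_visited: int, relevant_total: int,
                    total_visited: int, hours: float) -> float:
    """Recall times "browsing precision," divided by search time in
    hours, as the measure is described in the summary above."""
    recall = relevant_visited / relevant_total             # relevant pages visited / relevant pages
    browsing_precision = relevant_visited / total_visited  # relevant pages visited / total pages visited
    return recall * browsing_precision / hours
```

A searcher who visits 4 of 8 relevant pages among 10 pages visited, in half an hour, scores 0.5 × 0.4 / 0.5 = 0.4.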

Searchers were questioned on their confidence that they had achieved complete recall. Confidence values are reported on a scale of 1 to 6, where 1 to 3 is considered low and 4 to 6 high. A measure termed "reality check" falls below 1 with searcher overconfidence and rises above 1 with underconfidence. For the question with the larger answer set, high-confidence searchers searched for a significantly shorter time and achieved significantly higher recall and success measures. The enhanced system provided significantly higher recall and success measures for the question with the smaller answer set. By the "reality check" measure, searchers were overconfident of success.



ISI's Impact Factor as Misnomer: A Proposed New Measure to Assess Journal Impact
Stephen P. Harter and Thomas E. Nisonger

In our second brief communication, Harter and Nisonger argue that the traditional "impact factor" is a misnomer. It measures the current impact of recently published articles and offers a probabilistic estimate of the future impact of a paper published in the journal in question. This is a measure of journal quality, but it does not imply that two journals with the same factor have equal impact, since the one that publishes more articles will have more effect on the scholarly community.

A better term for the traditional measure would be the article "impact factor," with the non-normalized total number of citations received called the "journal impact factor." A high "article impact factor" rank indicates that a journal publishes articles that are cited more often; a "journal impact factor" rank indicates the relative citation impact of the journal as a whole.
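Harter and Nisonger's distinction, and the point that equal per-article impact does not imply equal whole-journal impact, can be illustrated with a minimal sketch (the function names are mine, not terminology from the communication):

```python
def article_impact_factor(citations: int, articles: int) -> float:
    """Normalized measure: citations per published article."""
    return citations / articles

def journal_impact_factor(citations: int) -> int:
    """Non-normalized measure: total citations to the journal as a whole."""
    return citations
```

A journal receiving 200 citations to 100 articles and one receiving 20 citations to 10 articles have the same article impact factor (2.0), yet the first has ten times the journal impact factor and correspondingly more effect on the scholarly community.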




Bringing Design to Software
edited by Terry Winograd
reviewed by: Robert J. Sandusky




Technology and Copyright Law: A Guidebook for the Library, Research, and Teaching Professions
by Arlene Bielefield and Lawrence Cheeseman
reviewed by: David Mattison




The Economics of Communication and Information
edited by Donald M. Lamberton
reviewed by: Hal R. Varian















© 1998, American Society for Information Science
Last update: November 06, 1998