JASIST Table of Contents

Journal of the Association for Information Science and Technology

 EDITORIAL

 

In This Issue
   
Bert R. Boyce
 

699

 

 RESEARCH

 

Analysis of Publications and Citations from a Geophysics Research Institute
    
Cliff Frohlich and Lynn Resler
     Published online 7 May 2001

In this issue we begin with the Frohlich and Resler report on the evaluation of research productivity at the University of Texas Institute for Geophysics by bibliometric indicators, using a four-part categorization of journals into mainstream, archival, proceedings, and other, and five different journal half-life computations. The half-life computations are drawn from work on earthquakes, which, like citations, do not decay strictly exponentially, requiring formulations that avoid such assumptions. Weighting to emphasize points with more numerous and reliable data seems called for in both instances, and in both the discrete period of reporting can bias the results. This implies ignoring very high and very low magnitude data, as found in the first year or two of citation, and data older than some cutoff; weighting data with more observations more strongly when fitting; and perhaps correcting for the interval problem. A sequence of rates by year will yield a median age of citation, or, by dividing 10 times the log of 2 by the difference in the logs of two rates, commonly the second- and twelfth-year rates (10 being the number of years separating them), a two-point half-life. If one determines the slope of the same 11 points, weighting each by the number of articles used to determine the rate, the negative of log 2 divided by this slope provides a weighted least-squares half-life. If the sum of the products of the rates and their times in years is divided by the sum of the rates, we get a rate-averaged time, and the maximum likelihood half-life is the natural log of 2 times the difference between the rate-averaged time and the minimum rate time. This relatively easily computed method is in common use in earthquake analysis. A ratio method is also possible, where the sum of the rates for a given period of years is divided by the sum of the rates for an equal following period.
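The three closed-form half-life computations described above can be sketched as follows; the yearly citation rates and article counts are hypothetical illustrations, not the authors' data.

```python
import math

# Hypothetical yearly citation rates r[t] (citations per year) for years
# t = 1..12 after publication, with n[t] articles contributing to each rate.
years = list(range(1, 13))
rates = [0.0, 8.0, 6.2, 4.9, 3.8, 3.0, 2.3, 1.8, 1.4, 1.1, 0.9, 0.7]
counts = [50] * 12  # number of articles behind each rate

def two_point_half_life(rates):
    """Two-point half-life: 10 * ln 2 / (ln r_2 - ln r_12)."""
    return 10 * math.log(2) / (math.log(rates[1]) - math.log(rates[11]))

def wls_half_life(years, rates, counts):
    """Weighted least-squares half-life: fit ln r vs t over years 2..12,
    weighting each point by its article count; T = -ln 2 / slope."""
    pts = [(t, math.log(r), n) for t, r, n in zip(years, rates, counts) if t >= 2]
    w_total = sum(n for _, _, n in pts)
    t_bar = sum(n * t for t, _, n in pts) / w_total
    y_bar = sum(n * y for _, y, n in pts) / w_total
    slope = (sum(n * (t - t_bar) * (y - y_bar) for t, y, n in pts)
             / sum(n * (t - t_bar) ** 2 for t, _, n in pts))
    return -math.log(2) / slope

def ml_half_life(years, rates):
    """Maximum-likelihood half-life: ln 2 * (rate-averaged time - minimum time)."""
    pts = [(t, r) for t, r in zip(years, rates) if r > 0]
    t_avg = sum(r * t for t, r in pts) / sum(r for _, r in pts)
    t_min = min(t for t, _ in pts)
    return math.log(2) * (t_avg - t_min)
```

As the summary notes, the three formulas generally disagree on the same data, which is why comparisons are meaningful only between half-lives computed the same way.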

In all categories the maximum likelihood method yielded the lowest half-life and the two-point method the highest. All methods found the shortest half-life for the other category. The range for mainstream journals was from 5.25 to 8.15 years. Forty-three percent of the publications appeared in mainstream journals, but they drew 71% of all citations. Differences in half-lives are meaningful only if calculated in the same manner and fairly large, say a factor of two. The ratio method is sound where the data fit a decay model, but the median-life method is useful where they do not, provided sufficient data over time are available.

701


Urquhart's and Garfield's Laws: The British Controversy Over Their Validity
    
Stephen J. Bensman
     Published online 9 May 2001

Bensman, prior to his own reanalysis of the data, takes a historical look at the controversy over the apparent conflict between Urquhart's contention that interlibrary loan demand for a journal is a measure of its total use and Garfield's contention that a small core of highly cited journals will provide the necessary communication channel for science. In 1976 Scales compared Garfield's citation data to the National Lending Library for Science and Technology's use survey data for 1967 to 1969, excluding any titles not among the SCI source journals, using the Spearman rank correlation coefficient. It was necessary to consider 250 titles before 50 common titles occurred, and most correlations were not significant. These results were criticized on the basis that the lending library's loans were for material typical libraries did not collect, and were thus unlikely to be for core materials, but Urquhart and Line argued that borrowing came not from university libraries but from special libraries that did not buy the core. Line later rejected any generalized use study, whether based on citation or lending, as not reflecting unique local needs. Brooks' statistical analysis of the Scales study pointed out the inappropriate design and left the question unanswered.

Bensman classes the journals on both lists into the SCI subject categories of 1965-69. NLLST use was technology oriented, while SCI use leaned toward basic science. Using Bradford-like zones he shows that journals could easily change ranks with new data, but that changing zones would be far less likely. Chi-square tests show a clear dependence between the two data sets and evidence that the methods are not in opposition.
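The two statistics at issue here, the Spearman rank correlation used by Scales and the chi-square test of independence used by Bensman, can be sketched as follows; the contingency table of Bradford-like zones is hypothetical, not Bensman's data.

```python
def spearman_rho(ranks_a, ranks_b):
    """Spearman rank correlation from two equal-length lists of ranks."""
    n = len(ranks_a)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks_a, ranks_b))
    return 1 - 6 * d2 / (n * (n * n - 1))

def chi_square(table):
    """Pearson chi-square statistic for a two-way contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical table: journals cross-classified by zone under NLLST loan
# ranks (rows) and SCI citation ranks (columns). A statistic above the
# critical value for (rows-1)*(cols-1) = 4 degrees of freedom (9.49 at
# p = 0.05) indicates dependence between the two rankings.
table = [
    [30, 12, 5],
    [10, 25, 14],
    [4, 11, 29],
]
```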

714


Totality and Representation: A History of Knowledge Management Through European Documentation, Critical Modernity, and Post-Fordism
    
Ronald E. Day
     Published online 15 May 2001

Day attempts to set the current knowledge management specialization into its twentieth-century historical context, indicating that its current popularity is a symptom of ongoing attempts to manage knowledge in capitalist society. Traditionally this has been seen as an attempt to capture and organize those thoughts not directly linked to daily operations but produced as a kind of surplus. Modern knowledge management aims to appropriate the workers' knowledge and the consumers' wants and needs into the structure of the organization. The context begins with Otlet's concept of a world brain, collected knowledge for the creation of world peace, followed by Briet's idea that knowledge can exist outside of documents and serves as a resource not for peace but for scientific and industrial production. These thoughts were criticized by Heidegger, for whom language and context produce knowledge, so that the act of trying to transmit knowledge will affect it. Day sees knowledge management as a product of post-Fordism, a management system in effect after the assembly-line system, with its well-paid semi-skilled worker in machine production, was shifted to the new or knowledge economy, where knowledge of the consumer is crucial for success in shaping markets. Since production has moved onto the social and linguistic resources of workers, what was once a private sphere of knowledge is now the subject of capital for production. How the organization achieves control of such resources is the domain of knowledge management.

    725

Google's Web Page Ranking Applied to Different Topological Web Graph
Structures
    
George Meghabghab
     Published online 22 May 2001

After showing that Web page relationships can be represented by a graph structure based upon their links, Meghabghab plots Web pages both by their number of outgoing links to unique pages and by their number of unique incoming links. This permits the computation of a distance between different Web page graphs and the application of the Google Web page ranking algorithm. He shows that equal Google rank will be assigned to nodes at the same level in a complete bipartite graph, in out-degree trees, and in in-degree trees, but that this algorithm will not work well on bipartite or general graphs.
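The ranking algorithm in question is PageRank; a minimal power-iteration sketch follows, with a hypothetical toy graph rather than any graph from the article.

```python
def pagerank(graph, damping=0.85, iterations=100):
    """Power-iteration PageRank. graph maps each node to the list of
    nodes it links to."""
    nodes = list(graph)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        new = {node: (1.0 - damping) / n for node in nodes}
        for node, targets in graph.items():
            if targets:
                share = damping * rank[node] / len(targets)
                for t in targets:
                    new[t] += share
            else:  # dangling node: spread its rank uniformly
                for t in nodes:
                    new[t] += damping * rank[node] / n
        rank = new
    return rank

# In a complete bipartite graph K_{2,2}, nodes on the same level receive
# equal rank, consistent with the behavior the article describes.
graph = {"a1": ["b1", "b2"], "a2": ["b1", "b2"],
         "b1": ["a1", "a2"], "b2": ["a1", "a2"]}
ranks = pagerank(graph)
```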

    736


Toward Automatic Chinese Temporal Information Extraction
Wenjie Li, Kam-Fai Wong, and Chunfa Yuan
     Published online 25 May 2001

Li, Wong, and Yuan are concerned with the extraction of information about the timing of events from Chinese text, revealing not only explicit but implicit temporal information. They study Hong Kong financial news reports in an attempt to formulate a formal representation of temporal knowledge. The single Chinese verb form gives no clue as to tense, but tense may be determined from associated temporal adverbs and aspect auxiliary words. One can identify verbs that are either static (reflecting a continuing existing state) or dynamic (reflecting a change of state); either durative (exhibiting a starting and an ending point separated by observable time) or instantaneous (persisting for a very short duration); and either telic (stopping at a point where change is complete) or non-telic (exhibiting no definite indication of a stopping point), although this may be a factor of their objects rather than their pure semantics. If this information is extracted from a text along with any implicit or explicit temporal expression, it can be used to fill frame slots providing a temporal description. Text is parsed; noun phrases and other components are identified; the verb is used to look up its properties; and any aspect words are identified. The classed verb and any temporal noun phrase provide the temporal relations to fill the frame slots for the text. The framed information can then be used to answer temporal questions.
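The frame-filling step can be sketched as follows: a verb lexicon supplies aspectual properties, and any temporal expression found in the sentence fills the remaining slot. All names and lexicon entries here are illustrative assumptions, not the authors' actual lexicon or representation.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical verb lexicon of aspectual properties (static/dynamic,
# durative/instantaneous, telic/non-telic), keyed by English glosses.
VERB_LEXICON = {
    "rise": {"static": False, "durative": True, "telic": False},
    "close": {"static": False, "durative": False, "telic": True},
}

@dataclass
class TemporalFrame:
    """Frame slots for one clause: the classed verb, its aspectual
    properties, and any temporal noun phrase found with it."""
    verb: str
    properties: dict = field(default_factory=dict)
    time_expression: Optional[str] = None

def fill_frame(verb, temporal_np=None):
    """Look up the verb's properties and fill the temporal frame."""
    props = VERB_LEXICON.get(verb, {})
    return TemporalFrame(verb=verb, properties=props,
                         time_expression=temporal_np)

frame = fill_frame("close", temporal_np="yesterday afternoon")
```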
 

748


How to Analyze Publication Time Trends by Correspondence Factor Analysis: Analysis of Publications by 48 Countries in 18 Disciplines over 12 Years
Jean-Christophe Doré and Tiiu Ojasoo
     Published online 24 May 2001

Using an ISI database, Doré and Ojasoo analyze annual publication activity in 48 countries and 18 disciplines over a twelve-year period from 1981 to 1992 using the multidimensional technique of Correspondence Factor Analysis. CFA produces a ranked series of orthogonal factorial axes from the processed variables. Chi-square distances from the center of gravity of the output per country per year are plotted, giving a picture of how each differs from the combined or world output. The analysis discloses the priority given to particular disciplines at particular time periods, and how disciplinary effort shifts geographically over time.
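The chi-square distance at the core of correspondence analysis can be sketched as follows: each row profile (a country-year's share of output across disciplines) is compared with the average (world) profile, weighting each discipline by the inverse of its overall share. The counts are hypothetical, not the authors' ISI data.

```python
def chi2_distance_from_centroid(table):
    """Chi-square distance of each row profile from the centroid
    (average) profile of a two-way count table."""
    grand = sum(sum(row) for row in table)
    col_mass = [sum(col) / grand for col in zip(*table)]
    centroid = col_mass  # the average profile equals the column masses
    distances = []
    for row in table:
        total = sum(row)
        profile = [x / total for x in row]
        d2 = sum((p - c) ** 2 / m
                 for p, c, m in zip(profile, centroid, col_mass))
        distances.append(d2 ** 0.5)
    return distances

# Rows: hypothetical country-years; columns: disciplines.
counts = [
    [120, 30, 50],
    [40, 90, 70],
    [60, 60, 60],
]
dists = chi2_distance_from_centroid(counts)
```

CFA then decomposes these weighted deviations into the ranked orthogonal factorial axes the summary mentions.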
 

763


BRIEF COMMUNICATIONS

 

 Who Dunnit? Metatags and Hyperauthorship
    
Elisabeth Davenport and Blaise Cronin
     Published online 9 May 2001

A three-tier hierarchy of levels of author contribution for papers with large numbers of authors is presented by Davenport and Cronin as a possible way of rationalizing the description of an author's participation in the creation of such a work. An XML-like markup scheme required by granting agencies is suggested as the means of implementation.
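A contributor record of this general kind might look as follows; the element names and tier values are invented for illustration and are not the authors' proposed scheme.

```python
import xml.etree.ElementTree as ET

# Build a hypothetical XML record tagging each contributor with a tier
# from a three-level contribution hierarchy.
paper = ET.Element("paper", title="Example multi-author study")
for name, tier in [("A. Author", "1"), ("B. Author", "2"), ("C. Author", "3")]:
    ET.SubElement(paper, "contributor", name=name, tier=tier)

record = ET.tostring(paper, encoding="unicode")
```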
 

770


Towards a Theory of Aboutness, Subject, Topicality, Theme, Domain, Field, Content, and Relevance
    
Birger Hjørland
     Published online 24 May 2001

Finally, Hjørland's discussion is concerned with the proper definition of the terms that describe what is being analyzed when what is known as subject analysis takes place. He points out different usages in the literature and suggests those that he believes to be correct.
 

774


BOOK REVIEWS

 
 

Change Management in Information Services, by Lyndon Pugh
     Sara R. Tompson
     Published online 25 May 2001
 

779

Blown to Bits: How the New Economics of Information Transforms Strategy, by Philip Evans and Thomas S. Wurster
     Lonnie L. Johnson
     Published online 25 May 2001

780


Association for Information Science and Technology
8555 16th Street, Suite 850, Silver Spring, Maryland 20910, USA
Tel. 301-495-0900, Fax: 301-495-0810 | E-mail:
asis@asis.org

Copyright 2001, Association for Information Science and Technology