Assessing the Government Information Locator Service (GILS): A Multi-Method Approach for Evaluating Networked Services


William E. Moen

School of Library and Information Sciences, University of North Texas, Denton, Texas 76203


Charles R. McClure, Distinguished Professor

School of Information Studies, Syracuse University, Syracuse, NY 13244


June Koelker, Erin Stewart

School of Library and Information Sciences, University of North Texas, Denton, Texas 76203





This paper describes a multi-method approach used to evaluate the Government Information Locator Service (GILS) and discusses the complexities in assessing a multi-faceted networked information service. Federal agencies are implementing GILS as an agency-based, network-accessible locator of Federal information resources. The architecture of GILS includes actual digital collections of government information resources, metadata describing those resources, human intermediaries, technical standards, government-wide and agency policy, users, and various information technologies. The investigators conducted a nine-month study from September 1996 to May 1997. The study provides findings and recommendations to improve the overall access to and use of Federal information, and offers language to improve information policy related to the management of and access to government information. The current study highlights the limitations and opportunities of available approaches to evaluating the complex characteristics of networked information services and digital collections.





This paper describes the research methodology used in an evaluation of U.S. Federal agency implementations of the Government Information Locator Service (GILS) (Moen & McClure, 1997). The paper discusses the multiple data collection and analysis techniques used in the study. The study focused on users of and participants in GILS implementations which occurred in an environment that is administratively decentralized and which assumed a distributed computing, networked environment for information access and dissemination. The study began in September 1996 and concluded in May 1997. Additional information about the project can be found on the GILS Evaluation Project Web Site <>.


The paper offers background information about GILS, discusses selected literature concerned with research methodologies proposed or being used in environments similar to GILS, and details the study’s research framework including individual data collection techniques and the multi-method approach used by the investigators. The paper argues that complex networked services require appropriate multi-faceted research approaches to adequately evaluate those services. The intention is to contribute to the ongoing discussion of evaluation of networked services and resources (e.g., digital libraries).





The Government Information Locator Service (GILS) is intended as a tool to assist Federal agencies in carrying out information dissemination as well as broader information resources management responsibilities. Under the Paperwork Reduction Act of 1995, Federal agencies were to inventory, identify, describe, and locate information resources using GILS (U.S. Congress, 1995). GILS embodies a metadata approach to information management and uses specific data elements in a standardized record structure to describe agency information resources (Christian, 1996). GILS calls for a standardized description of information products and serves as a means to aid the general public in accessing government information within a distributed network environment (Moen & McClure, 1996).


GILS is intended to address the basic need of citizens, government employees, and others to identify, locate, and access or acquire government information. The architecture of GILS includes actual digital collections of government information resources, metadata describing those resources, human intermediaries, technical standards, government-wide and agency policy, users, and various information technologies.


The investigators completed previous studies related to the current GILS initiative that emphasized the importance of Federal locators for supporting public access to government information. The following studies provided background to and informed the current evaluation effort:






The Office of Management and Budget’s Bulletin 95-01 established the Government Information Locator Service and required Federal executive agencies to compile inventories of automated information systems, inventories of Privacy Act systems of records, and inventories of locators of information dissemination products by December 31, 1995 (Office of Management and Budget, 1994). Agencies were to use these inventories to create initial GILS locator records, which would be available online to the public by the same date.


For the current evaluation study, the investigators anticipated that the GILS implementation process differed from agency to agency. Each agency had its own type and quantity of resources to be described through GILS. Additionally, each agency had its own technological infrastructure, individual administrative expertise, and financial resources to implement such a service. These factors, along with an understanding of agency culture, affected each agency’s readiness to implement GILS. In part, the study examined how GILS implementations assisted agencies in fulfilling their information dissemination and information management responsibilities.


To evaluate effectively a process which varies substantially from agency to agency, the investigators designed a research framework and methodology robust enough to address multiple aspects of a networked information service such as GILS. A review of research methodology literature in the area of networked information services aided the investigators in the design of the evaluation methodology.





Recent literature about evaluating networked information services offered insight into evaluation methodologies, but the review indicated that the area is less than fully developed. Research methodology literature in the areas of networked information services, government use of networked information resources, assessments of free-nets, and the six NSF/ARPA/NASA Digital Library research projects provided information of interest to the investigators.


McClure (1991) emphasized the need for user-based techniques rather than system-driven techniques for evaluating networked information services. These techniques take into account "the particular communication behavior, information use patterns, and work environments of potential users." McClure (1994) recommended four factors on which to evaluate networked information services: extensiveness, efficiency, effectiveness, and impact. Specific techniques recommended were the use of focus groups, user logs, network-based data collection techniques, interviews, surveys, and site visits. Further research (McClure & Lopata, 1996), specifically in the academic networked environment, resulted in guidelines and suggestions that highlighted the value of using natural settings to more accurately assess the networked information service.


Networked information services described by Bertot and McClure (1996) match the GILS environment in that there are multiple providers of the services, a range of information services available, growing use and access of the services, and a rapidly changing environment. Criteria for evaluating networked information services include service quality, usefulness, and the four factors previously cited by McClure (Bertot & McClure, 1996).


GILS is an example of a networked information service that occurs within a governmental setting. Bishop & Bishop (1995) highlighted the importance of user studies of networked information services for government accountability and effectiveness. They recognized that user studies need to reflect the complexity of human behavior and recommended new models for successful collaborations among users, social science researchers, and network decision makers.


User studies of free-nets were of interest because these types of distributed networked information systems offer similarities to GILS. Newby & Bishop (1996) documented the methodology used to assess Prairienet in Champaign, Illinois. This report used descriptive statistics with Web server transaction logs to identify characteristics of the users who access Prairienet. Patrick (1996) described the methodology used in a user survey of the National Capital FreeNet in Ottawa, Canada which included a self-selected user survey and a "random encouragement" survey.


Analysis of transaction log files offered another avenue for evaluation research. Noonan (1996) described the use of web usage statistics and listed four reasons for government agencies to be interested in these sources of data. By analyzing web usage statistics, agency staff can demonstrate accountability, collect data to improve service, reach new audiences, and offer informative and useful means to disseminate information about the agency. The study offered the investigators practical guidelines for analyzing four common web transaction log files: access, error, referrer, and agent.
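To make the mechanics of such analysis concrete, the following is a minimal sketch of computing descriptive statistics from an access log in the Common Log Format. The regular expression and the sample entries are illustrative inventions for this sketch, not material from the study's actual server logs.

```python
import re
from collections import Counter

# Common Log Format: host ident authuser [date] "request" status bytes.
# The pattern and sample entries below are illustrative inventions,
# not taken from the study's actual server logs.
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<date>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def summarize(lines):
    """Compute simple descriptive statistics from access-log lines."""
    paths, statuses, hosts = Counter(), Counter(), set()
    for line in lines:
        m = LOG_PATTERN.match(line)
        if m is None:
            continue  # skip malformed entries rather than fail
        paths[m.group("path")] += 1
        statuses[m.group("status")] += 1
        hosts.add(m.group("host"))
    return {
        "requests": sum(statuses.values()),
        "unique_hosts": len(hosts),
        "top_paths": paths.most_common(3),
        "status_counts": dict(statuses),
    }

sample = [
    '192.0.2.1 - - [01/May/1997:10:00:00 -0500] "GET /gils/search HTTP/1.0" 200 1024',
    '192.0.2.2 - - [01/May/1997:10:01:00 -0500] "GET /gils/record/42 HTTP/1.0" 200 2048',
    '192.0.2.1 - - [01/May/1997:10:02:00 -0500] "GET /missing HTTP/1.0" 404 512',
]
print(summarize(sample))
```

Request counts, unique-host counts, and status distributions of this kind are the raw material for the accountability and service-improvement arguments Noonan describes.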


The six Digital Library research projects funded by the National Science Foundation (NSF), Advanced Research Projects Agency (ARPA), and the National Aeronautics and Space Administration (NASA) initiated a research stream helpful for evaluating distributed networked information services comparable to GILS. At the 1995 Allerton Institute, "How We Do User-Centered Design and Evaluation of Digital Libraries: A Methodological Forum", Bishop (1995) summarized the breadth of methodological issues addressed by the Digital Library Project researchers. She identified the data gathering techniques used by digital library (DL) researchers including log analysis, protocol analyses of user sessions, focus groups, in-depth interviews, user surveys, controlled observations with videotaping, collection of user comments and feedback, questionnaires, and written evaluations of testbed systems.


The Allerton Institute (1995) offered examples of research studies with methodological relevance to GILS evaluation efforts. At the Institute, Van House (1995) discussed user needs assessment and evaluation for the University of California - Berkeley’s NSF/ARPA/NASA Digital Libraries Project. She identified three methodology areas which are "predecessor" in nature to digital library research: library evaluation with its focus on users’ needs as the basis for evaluation, user-centered system design with its incorporation of user needs into system design, and usability analysis with its feedback methods.


Buttenfield’s (1995) study, "User Evaluation for the Alexandria Digital Library Project", emphasized factors which researchers encounter when planning distributed network information services focusing on spatial data, a subset of material accessible through GILS. Methodological issues for this project include targeting specific user classes, the lack of appropriate spatial metadata models, and a lack of understanding of user requirements. Both Van House’s and Buttenfield’s work support methodological assumptions of the GILS evaluation project since the GILS evaluation also focused on user needs, on incorporating user needs into system design, on usability analysis, and on the need to target specific classes of users to determine user requirements.


After a review of methodologies used by the digital library (DL) research community, three conclusions emerge. The DL community uses both qualitative and quantitative data gathering techniques; it is this multi-method strategy which was of interest to the GILS investigators. The DL community is using user-based methodologies both to design DLs and to evaluate the early testbeds. For the GILS study, the technology to support locator services has already been established and the primary focus of the methodologies is on evaluation rather than design; one likely outcome of the GILS evaluation, however, may be the identification of additional user-based requirements for enhancing the design and implementation of GILS. Finally, DL research is contributing to the development of user-based methodologies for networked information services; the GILS study is evaluating an existing complex networked service and will add to these methodologies from practical research of an operational networked service.


The review of selected, recent methodology literature on evaluation of networked services clearly identifies such evaluations as an area under development. The investigators determined that the use of multiple methods to gather data is an emerging area of research methodology for evaluating networked information services. In addition, a focus on user needs is central to many of these studies. The research community is showing keen interest in developing new assessment strategies for evaluating networked information services.





To accommodate multiple facets of GILS, the investigators designed an evaluation framework that would guide a holistic research approach to the evaluation. The framework identified five dimensions: Technology, Content, Users, Policy, and Standards and Rules. The research framework also includes three perspectives, representing the "views" of various stakeholders in GILS: Users, Agency, and Government-wide. The three perspectives helped to focus the evaluation on the need to represent different views held by different stakeholders during implementation and use of a networked-based information service. Together the three perspectives and the five dimensions capture the complexity of GILS as a networked-based information service and guided research design and data collection activities. Figure 1 presents the evaluation framework.



Figure 1. Framework for GILS Evaluation: Perspectives & Dimensions




The research framework highlighted the need for multiple and sometimes simultaneous data gathering methods and activities which could inform the five dimensions and three perspectives as shown in Figure 1. The investigators used a variety of individual data gathering techniques as described below.




To capture the richness and multi-dimensionality of GILS, the investigators identified a variety of data gathering and analysis techniques. Since an evaluation of a networked-based information service such as GILS needed to examine diverse factors (e.g., nature and type of resources to be described by locator records, agency resources available, etc.), the investigators needed diverse but complementary data gathering techniques to capture as fully as possible the breadth and depth of issues involved.


The investigators matched research information needs (i.e., information needed about each dimension of GILS) with appropriate quantitative and qualitative research techniques (Creswell, 1994). Investigators selected and utilized one or more methods on the basis of satisfying the information needs of each component of the study. As an example, site visits to agencies allowed the investigators to interview many agency staff to fully realize all aspects of an agency’s usage and implementation experiences with GILS from various participant perspectives. In a parallel manner, focus group sessions with various types of GILS stakeholders represented opportunities for the investigators to bring together homogeneous groups of stakeholders to represent common-interest perspectives.


These methodologies used theoretical sampling rather than statistical sampling. Unlike the latter, which is designed to provide data subject to statistical verification, theoretical sampling allowed capture of incidents of difference and, in a progressive fashion, built a broad foundation for subsequent analysis (Glaser & Strauss, 1967).


Table 1 summarizes the data collection techniques used in the study. Each technique is associated with one or more primary methods (i.e., qualitative or quantitative), the kind of information obtained, and the form of the resulting data. The resulting data were then analyzed using appropriate analytical techniques.


Table 1. Data Collection Techniques

Technique                 | Primary Method             | Information Obtained                                                                     | Forms of Data                | Analysis
Site Visit                | Qualitative                | Agency-specific experiences as described by various agency staff as participants in GILS | Narrative Text               | Content Analysis
Focus Group               | Qualitative                | Stakeholder-specific perspective on GILS                                                 | Narrative Text               | Content Analysis
Survey                    | Quantitative               | Quantifiable assessments of key GILS issues                                              | Numeric Data                 | Descriptive Statistics
Record Content Analysis   | Quantitative               | Measurement and assessment of GILS record quality                                        | Numeric Data                 | Descriptive Statistics
Transaction Log Analysis  | Quantitative               | Machine-generated data of users’ interaction with GILS                                   | Numeric Data                 | Descriptive Statistics
Scripted User Assessment  | Qualitative, Quantitative  | Users’ assessments of GILS as a networked service                                        | Narrative Text, Numeric Data | Content Analysis, Descriptive Statistics
Policy Review             | Qualitative                | Analysis of the policy environment and specific policies providing the context for GILS  | Narrative Text               | Content Analysis

The following briefly describes each technique, how it was used, and its utility in the evaluation study:


Site Visits Investigators conducted one-day visits to agencies to observe specific environments of GILS implementations. Investigators carried out guided interviews with personnel from many administrative and functional areas. Site visits also included one focus group of agency staff, examination of relevant agency documentation, and tours/demonstrations. Site visits provided for detailed understanding, from participants’ perspectives, of GILS implementation issues.


Focus Groups Investigators conducted a series of "carefully planned discussion[s] designed to obtain perceptions on a defined area of interest in a permissive, non-threatening environment" (Krueger, 1988). Focus groups brought together groups of stakeholders, allowing individuals with common interests an opportunity to explore shared beliefs and goals with respect to GILS.


Survey Investigators developed a survey instrument administered to participants of a national conference on GILS. Respondents assessed key GILS policy and other issues. The survey also provided assessments of conference participants’ knowledge of GILS policies, attitudes, and experiences as well as qualitative information concerning expectations and lessons learned.


Record Content Analysis Investigators developed a procedure for analyzing the content of GILS records. Investigators examined a sample of GILS records from 42 known GILS agency implementations. Specific tests were used to operationalize a set of criteria that included (1) the relative "completeness" or level of description of GILS records, (2) resource types, aggregation levels, and dissemination media, and (3) record serviceability (i.e., factors affecting Networked Information Discovery and Retrieval (NIDR), user convenience, aesthetics and readability, and relevance judgment).
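A "completeness" test of this kind might be operationalized as a simple presence check against a list of expected elements. In the sketch below, the element names are a hypothetical subset of the GILS Core chosen for illustration, and the record is fabricated; the study's actual criteria were derived from the NARA guidelines.

```python
# The element names below are a hypothetical subset of the GILS Core,
# listed only for illustration; the study derived its actual criteria
# from the NARA guidelines.
CORE_ELEMENTS = [
    "Title", "Originator", "Abstract", "Controlled-Subject-Index",
    "Point-of-Contact", "Availability", "Date-Last-Modified",
]

def completeness(record):
    """Return the fraction of core elements present (and non-empty)
    in a record, plus the list of elements it is missing."""
    present = {e for e in CORE_ELEMENTS if record.get(e, "").strip()}
    missing = [e for e in CORE_ELEMENTS if e not in present]
    return len(present) / len(CORE_ELEMENTS), missing

# A fabricated example record, for illustration only.
record = {
    "Title": "Annual Energy Review",
    "Originator": "Department of Energy",
    "Abstract": "Statistical summaries of energy production and use.",
    "Availability": "Online via agency server",
}
score, missing = completeness(record)
print(f"completeness: {score:.2f}; missing: {missing}")
```

Scores of this kind, computed across a sample of records, yield the numeric data and descriptive statistics listed for this technique in Table 1.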


Transaction Log Analysis Investigators developed a set of procedures and analyses to examine data from one agency’s web server transaction log files. The procedures generated data for statistical analysis of user transaction activity on an agency’s GILS server.


Scripted Online User Assessment Investigators developed an exploratory method of scripted online user assessment to capture users’ (1) subjective appraisal of GILS efficacy as a tool for NIDR, (2) emotional and intellectual reactions to GILS products and services as compared with others, and (3) assumptions about GILS implementations (vis-à-vis coverage, information space, and authority) based on a limited first exposure.


Policy and Literature Review Investigators completed a review of GILS and related policy instruments, regulations, laws and related literature to provide an understanding of the current environment that is the context for GILS implementations. This review enabled the investigators to develop recommendations for changes and enhancements to policies, both government-wide and for individual agencies.


These techniques provided the foundation for data collection and analysis activities carried out in the study.





One or more of the techniques described above collected data related to each of the dimensions of GILS identified in the evaluation framework. The following sections describe briefly the scope of each dimension and identify data collection activities associated with its exploration.



Technology
The dimension of Technology included technical implementation details such as access mechanisms and implications of certain technology choices by Federal agencies and policymakers. Data collection to explore the Technology dimension featured:


At the 1996 national GILS conference, investigators invited vendors and technologists to a focus group session to discuss both existing and future technology options for GILS. The session brought together a group of stakeholders whose views on GILS technology included market potential, feasibility and desirability of future technological developments, and an evaluation of GILS functionality from a group of technology-informed users.


Site visits with IRM staff at selected Federal agencies enriched data gathering through use of personal interviews. Within different agencies, IRM and systems staff fulfilled a variety of roles as part of the process of implementing GILS as a networked information service. Investigators interviewed those agency staff who guided the GILS technical efforts. These interviews aided in an understanding of key issues, challenges, and critical success factors for the agency.


An additional data gathering technique included an exploratory log analysis activity designed to assist in the evaluation of GILS usage. Transaction analysis of log files from an agency’s GILS server provided the investigators with an important tool for understanding usage of a networked-based information service. Finally, the content analysis of GILS records elucidated a record creation and usage issue unanticipated in the GILS specification: the possible confounding effects of hypertext capabilities on metadata serviceability and maintenance.



Content
The dimension of Content, at the macro-level, identified the information resources included or covered in GILS and, at the micro-level, concerned the quality, degree of variance, accuracy, and usability of the information resource descriptions covered by GILS. Data gathering techniques for this dimension included:



Investigators developed preliminary criteria and assessment methods to evaluate a sample of GILS records. The National Archives and Records Administration (NARA) publication The Government Information Locator Service: Guidelines for the Preparation of GILS Core Entries (National Archives and Records Administration, 1995) provided a basis for the development of the criteria. Agency GILS implementors used these guidelines in creating agency GILS records. To understand implementors’ decision-making with regard to record content, site visits to agencies included interview sessions with record creators. These interviews with staff who had personal involvement in the record creation process contributed important information on the strategies which shaped decisions about an agency’s GILS records. Focus group sessions, survey questions, and user assessments also provided the investigators with perceptions and perspectives on the usefulness and value of GILS records from different user groups.


Users
The Users dimension concerned identification of GILS users: their needs, their usage of GILS, and their satisfaction with GILS. Data gathering techniques for this dimension included:



GILS users are not a homogeneous group, but rather consist of a variety of separate user groups including librarians, public citizens, record managers and other staff members at the implementing agencies, and state and local GILS implementors.


Agency site visit interviews included discussions with staff to learn about that agency’s efforts to involve users in the agency’s planning activity and the agency’s experiences with public use of GILS as an effective means to obtain government information. Site visit interviews with agency staff who directly supported public access to government information also provided information on users’ perceptions of GILS. A number of the focus groups gathered information about specific groups of users such as records managers, librarians, and public interest groups.


Users’ assessments of GILS were supported by means of the scripted online user assessment, which comprised a set of browsing and search/retrieval instructions designed to lead users into typical system output encounters. The content analysis of sampled GILS records, discussed above, informed the script writing in that the variance and variations of quality it revealed were built into the subjects’ controlled GILS experience. Users, comprising individuals with expressed interest in locating government information and/or information management, answered a series of multiple-choice and short-answer questions as they executed the script. The script contained questions designed to gain an understanding of the user’s searching reflexes, the user’s expectations of database content and retrieval service, and the user’s comfort with or adaptation to the intellectual construct of metadata. This line of inquiry was considered exploratory by the investigators.


Policy
The Policy dimension of the evaluation framework identified the policy environment for the U.S. Federal GILS implementations. Policy occurs at both government-wide and agency levels. Data gathering events and activities for this dimension included:


Investigators conducted a policy review of enabling legislation, executive orders, and other guidelines which represented formal information policy with respect to GILS. The review highlighted key policy issues as well as identified changes in policy since GILS’ inception in 1994.


Focus group sessions with Federal information policy stakeholders and site visit interviews with agency policymakers provided opportunities for important stakeholder groups to not only inform the investigators as to current and future policy goals in this area but also to share among themselves mutual insights and concerns. Site visit interviews enabled the investigators to gain an understanding of an agency’s internal policy with respect to networked information resources. Finally, the survey included questions about respondents’ familiarity with and understanding of information policy sources for GILS as well as assessments of existing policy guidance.

Standards and Rules

The Standards and Rules dimension addressed the utility of standards to ensure consistency in GILS information, and their use to support broader connection, access, and retrieval of information. Data gathering techniques for this dimension included:



Z39.50 is an American National Standard that specifies a protocol for information retrieval which enables interoperability between disparate information systems (National Information Standards Organization, 1995). Information systems that support this standard allow agency staff and the public to retrieve data from a variety of databases using one common interface and search process. The investigators interviewed administrators and IRM staff at Federal agencies to learn of their general awareness of standards and specific use of Z39.50 within that agency’s implementation. The survey included questions designed to elicit respondents’ awareness and usage of standards.
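The "one common interface and search process" idea can be sketched schematically. The following is not a Z39.50 protocol implementation; the two agency "servers" are hypothetical in-memory stand-ins, used only to show one query being dispatched unchanged to multiple databases and the labeled results merged.

```python
# Schematic sketch of the "one query, many databases" idea behind
# Z39.50. This is NOT a protocol implementation; the two agency
# "servers" below are hypothetical in-memory stand-ins.
def make_backend(titles):
    """Build a toy search function over a list of record titles."""
    def search(term):
        return [t for t in titles if term.lower() in t.lower()]
    return search

backends = {
    "agency-a": make_backend(["Wetlands Inventory", "Census of Agriculture"]),
    "agency-b": make_backend(["Wetlands Policy Review", "Budget Summary"]),
}

def federated_search(term):
    """Send one query, unchanged, to every server; merge labeled hits.
    A real Z39.50 client would encode the query once and let each
    target map it onto its own database."""
    hits = []
    for name, search in backends.items():
        hits.extend((name, title) for title in search(term))
    return hits

print(federated_search("wetlands"))
```

The point of the standard is that the user formulates the query once; each conforming server is responsible for mapping it onto its own database structure.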


It is important to note that the five dimensions of the evaluation framework and the multiple data collection techniques did not exist in isolation from each other. Multiple data collection techniques not only enabled the investigators to explore aspects of any one dimension from a variety of perspectives but also provided for exploration of the relationships and interaction of these dimensions. For example, the question:


GILS policy presumes that user feedback will serve to inform and guide agencies’ continued development and refinement of the service. Is there evidence that this is taking place?


could be addressed from various dimensions including:



A question such as this shows how a multi-method research approach can be effective in supporting evaluation of a complex and multi-faceted networked information service.





The GILS evaluation served two primary purposes for the investigators. First, we devised and used an innovative research approach to explore the multi-faceted nature that we assert is not only characteristic of GILS but of digital libraries and other complex networked information services. Second, we developed and enhanced specific data collection techniques for use in the evaluation, and we combined these techniques in effective ways to understand and evaluate the current state of GILS implementations.


Researchers of networked information services and digital libraries must undertake vigorous and sustained evaluation and assessment until we have a better understanding of how users interact with these complex socio-technical phenomena. Preliminary findings from this assessment of GILS indicate that the original design and intent of GILS may need to be refocused based on actual user experiences and expectations. User-based assessments can be a countervailing force to the glamour and hype of the sophisticated technology that provides such vital ways of organizing and accessing information in the digital age.


Both the number and array of data-gathering techniques employed by the investigators reflect our desire to produce not only an integrated set of wide-angle and zoom "snapshots" of GILS but also a set of instruments useful to agencies when assessing their own GILS implementations. It is anticipated that information policymakers as well as metadata creators will build on and refine the procedures specified for record content analysis, transaction log analysis, and scripted online user assessments to serve tactical and strategic objectives for information resource management.


The final report of this study details findings and recommendations (Moen & McClure, 1997); however, preliminary analyses of data gathered to date indicate a surfacing of several key factors that influence the rate, ease, and payback of agency implementations. Leadership and vision, development of specific user-oriented objectives, understanding of information’s life cycle, and adaptation to changing circumstances (both in terms of policy and technology) appear to be among the pivotal characteristics. Attributes such as "vision" and "policy" are difficult to measure across organizational cultures, much less across time and across digital space.


The evaluation literature addressing digital libraries reflects the need for multi-method and multi-level assessment of complex networked information services (Bishop, 1995). GILS and its associated government information resources can be considered a digital library. GILS is also a complex information service existing within the larger networked information infrastructure. As such, the investigators anticipate the findings from this evaluation to underscore the complexity of the implementation, coordination, and utility of networked information services. Further, the researchers hope to contribute to evolving research methods appropriate to digital libraries in general and anticipate their use by other evaluators of networked information services and digital libraries.



References

Allerton Institute. (1995). How we do user-centered design and evaluation of digital libraries: a methodological forum. 37th Allerton Institute 1995. Graduate School of Library and Information Science: University of Illinois at Urbana-Champaign. <>


Bertot, J. C. & McClure, C. R. (1996). "Developing assessment techniques for statewide electronic networks." In Hardin, S. (Ed.) ASIS ‘96: proceedings of the 59th ASIS annual meeting (pp. 110-117). Medford, New Jersey: Information Today.


Bishop, A. P. & Bishop, C. (1995, Spring). The policy role of user studies. Serials Review 21(1), 17-25.


Bishop, A. P. (1995, October). Working towards an understanding of digital library use: a report on the user research efforts of the NSF/ARPA/NASA DLI Projects. D-Lib Magazine. <http:// 10bishop.html>


Buttenfield, B. P. (1995). User evaluation for the Alexandria Digital Library Project. <http://edfu.lis.uiuc.edu/allerton/95/s2/buttenfield.html>

Christian, E. J. (1996, December). GILS: What is it? Where’s it going? D-Lib Magazine. < dlib/december96/12christian.html>

Creswell, J. W. (1994). Research design: qualitative & quantitative approaches. Thousand Oaks, California: Sage Publications.

Glaser, B. G. & Strauss, A. L. (1967). The discovery of grounded theory: strategies for qualitative research. New York: Aldine de Gruyter.

Krueger, R. A. (1988). Focus groups: a practical guide for applied research. Newbury Park, California: Sage Publications.


McClure, C. R. (1991). Planning and evaluation for the networked environment. EDUCOM Review 26 (3-4), 34-37.


McClure, C. R. (1994). User-based data collection techniques and strategies for evaluating networked information services. Library Trends 42(4), 591-607.


McClure, C. R., Bishop, A., Doty, P., & Bergeron, P. (1990). Federal Information Inventory/Locator Systems: from burden to benefit, final report to the General Services Administration Regulatory Information Service Center and the Office of Management and Budget, Office of Information and Regulatory Affairs. Syracuse, New York: Syracuse University, School of Information Studies. (ERIC Document Reproduction Services ED 326247).


McClure, C. R. & Lopata, C. (1996). Assessing the academic networked environment: strategies and options. Washington, D. C.: Coalition for Networked Information.


McClure, C. R., Ryan, J. & Moen, W. E. (1992). Identifying and describing federal information inventory/locator systems: design for networked-based locators. 2v. Bethesda, Md: National Audio Visual Center (ERIC Document Reproduction Service ED 349031).


Moen, W. E. & McClure, C. R. (1997). Final Report: An Evaluation of the Federal Government's Implementation of the Government Information Locator Service (GILS). Denton, TX: University of North Texas, School of Library and Information Sciences, forthcoming.


Moen, W. E. & McClure, C. R. (1996, September 26). An Evaluation of the Federal Government's Implementation of the Government Information Locator Service (GILS): Work Plan. Denton, Texas: University of North Texas, School of Library and Information Sciences. < gilseval.htm>


Moen, W. E. & McClure, C. R. (1994). The Government Information Locator Service (GILS): expanding research and development on the ANSI/Z39.50 information retrieval standard, final report. Prepared for the United States Geological Survey and the Interagency Working Group on Data Management for Global Change, Washington, DC [USGS, Cooperative Agreement No., 1434-93-A-1182]. Bethesda, MD: NISO.


National Archives and Records Administration. (1995). The Government Information Locator Service: guidelines for the preparation of GILS core entries. Washington, D. C.: National Archives and Records Administration. < documents/naradoc/>


National Information Standards Organization. (1995). ANSI/NISO Z39.50-1995. Information retrieval (Z39.50): Application service definition and protocol specification. Bethesda, MD: National Information Standards Organization Press. <http://lc>


Newby, G. B. & Bishop, A. P. (1996). Community system users and uses. In S. Hardin (Ed.) ASIS ‘96: proceedings of the 59th ASIS annual meeting (pp. 118-126). Medford, New Jersey: Information Today.


Noonan, D. (1996). Making sense of web usage statistics. The Piper Letter: Uses and Users. <http://www.>


Office of Management and Budget. (1994). Establishment of a Government Information Locator Service (OMB Bulletin No. 95-01). Washington, D. C.: Office of Management and Budget.


Patrick, A. S. (1996). Services on the Information Highway: subjective measures of use and importance from the National Capital FreeNet. Ottawa, Ontario: Government of Canada. <


U.S. Congress. (1995). Paperwork Reduction Act of 1995: Public Law 104-13, 104th Congress, 1st Session.


Van House, N. (1995). UC Berkeley’s NSF/ARPA/NASA Digital Libraries Project. SIGOIS Bulletin 16(2), 18-19.