Assessing the Academic Networked Environment

ARL: A Bimonthly Newsletter of Research Library Issues and Actions [April 1998]


Copyright (c) 1998 Association of Research Libraries


How are libraries thinking about assessment in the networked environment? Is progress being made in measuring the impact of the availability of networked information resources and services? Is any attempt being made to demonstrate how investments of many hundreds of thousands of dollars have improved access to information for members of the university community?

Seven institutions are participating in a Coalition for Networked Information (CNI) project on assessment, which was developed as an outgrowth of the publication Assessing the Academic Networked Environment: Strategies and Options, by Charles McClure and Cynthia Lopata (CNI, 1996). The manual describes the challenges of assessing networks and networked services and offers guidance on approaches to developing measures. The authors describe sample measures in a variety of areas. The institutions participating in the CNI project chose areas of assessment for their particular campus and developed measures using the McClure/Lopata manual as a starting point.

Several of the participating institutions tested measures related to library and information resources and services as part of the CNI project. Their initiatives are tailored to the needs of their own institutions, have distinct flavors, and employ a range of assessment techniques. While the first round of implementation of measures is not complete on all campuses, the following summaries of several of the efforts provide information on the kinds of topics the libraries are measuring and report some of the initial findings.

Reports from each of the institutions' initiatives and supporting materials, including in many cases the surveys and other instruments used, are available on CNI's website at: www.cni.org/projects/assessing.

University of Washington

The University of Washington has an ambitious program of assessment initiatives, including redesigning its triennial library use survey to include a focus on networked information, continuing to develop evaluation methods for the UWired teaching and learning program, and examining faculty and graduate student information use.

The UWired assessment plan is a collaborative effort of the Undergraduate Education, University Libraries, and Computing and Communications departments. Evaluation efforts draw on a variety of techniques, including printed surveys, web-based surveys, e-mail questionnaires, and focus groups.

To determine how faculty and graduate students in the biological sciences are using information for their research and teaching activities, the University of Washington team is conducting focus group sessions structured around three areas:

  • How users identify, obtain, and use information for research and teaching activities.
  • The ways users would ideally like to get information they need and why.
  • The use of and perceived tradeoffs associated with electronic journals.

The team identified enablers and obstacles to their work. Enablers included strong support from the administration for some of the assessment work, previous experience with surveys and data analysis, and the high priority placed by staff on this work. Obstacles included the difficulty of developing effective performance measures in a very dynamic and complex environment, the time-consuming nature of most assessment activities, and the difficulty in getting usage statistics from vendors of information products.

The University of Washington is also taking advantage of the services offered by the Flashlight Project.

Virginia Tech

The “moving target” of networked information measurement is also an issue on the Virginia Tech campus. In her report on the project, Dean of University Libraries Eileen Hitchingham writes, “We look at the changing realities of a few months ago to make best guesses about why things are happening today, or to better understand what might happen in the near future. The perspective is speculative, not conclusive. Still, making guesses from some information seems better than working with no information.”

The Virginia Tech assessment measures included a student survey that was developed by a number of campus units, including the library. Staff asked students to describe their use of links on the library's main web page. Students reported use of the electronic reserve system, institutional library catalog, regional and special library catalogs, and list of database resources. For this group of students, use of the library's more than 100 electronic journals was disappointingly low.

Two of Virginia Tech's measures addressed the physical location of students and others using library and information resources. The survey found that many students both visited the library in person and used its resources through remote network connections from dorm rooms, off-campus housing, and other locations.

In a study that included a review of web log data, the team examined where users were located when connecting to the library web pages and determined that less than half were inside the library itself. An intensive analysis of library web page use was difficult because the pages changed considerably during the short period the log data covered, and because different library network services reside on different servers, which complicated data collection. However, understanding where users are physically located when using networked information resources has implications for providing services and instruction for library users and is useful for future planning.
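The mechanics of such a location breakdown are simple to sketch. The following Python fragment is offered only as an illustration: it tallies requests in an Apache-style access log by whether the client address falls inside a hypothetical library or campus network range. The log format, file name, page prefix, and address ranges are assumptions, not details taken from the Virginia Tech study.

# A minimal sketch of the kind of web-log review described above: classify
# requests to the library's pages by where the client machine was located.
# The IP ranges and paths below are invented for illustration.
import re
from collections import Counter
from ipaddress import ip_address, ip_network

# Hypothetical address ranges; a real analysis would use the campus network map.
LIBRARY_NET = ip_network("198.51.100.0/24")   # workstations inside the library building
CAMPUS_NET = ip_network("203.0.113.0/24")     # dorms, offices, and labs elsewhere on campus

# Apache common-log-format line: client - user [timestamp] "METHOD path ..."
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

def classify(client):
    """Bucket a client address as in-library, on-campus, or off-campus."""
    try:
        address = ip_address(client)
    except ValueError:
        return "unresolved"
    if address in LIBRARY_NET:
        return "in-library"
    if address in CAMPUS_NET:
        return "on-campus"
    return "off-campus"

def summarize(log_path, page_prefix="/library/"):
    """Count library page requests by the requester's location."""
    counts = Counter()
    with open(log_path) as log:
        for line in log:
            match = LOG_LINE.match(line)
            if match and match.group(2).startswith(page_prefix):
                counts[classify(match.group(1))] += 1
    return counts

if __name__ == "__main__":
    totals = summarize("access.log")              # assumed log file name
    grand_total = sum(totals.values()) or 1
    for location, hits in totals.most_common():
        print(f"{location:12s} {hits:8d}  ({hits / grand_total:.1%})")

Breaking the same tallies out per server or per week would address the complications noted above about services spread across machines and rapidly changing pages.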

Gettysburg College

At Gettysburg College, the assessment project focused on the use and cost-effectiveness of the Electronic Reserves System. The Electronic Reserves System is part of a broader Curriculum Navigation Project (CNAV) that provides a central source of information for the campus. Via CNAV, students can access information about their courses, including class rosters, course homepages, course syllabi, and electronic reserves. Access to electronic reserves materials is restricted to those enrolled in the course.

Through both telephone and electronic surveys, Gettysburg assessed why students used or did not use the electronic reserves system, determined usage patterns, and examined both faculty and student satisfaction with the electronic reserves system.

As at Virginia Tech, many students will access library resources from their dorm rooms when that access is available; well over half of the Gettysburg students using the electronic reserves system did so. Most students found the system convenient and easy to use.

Faculty were enthusiastic about the electronic reserves system because of the added value it brought to their courses. In particular, they valued being able to make current materials readily available to students enrolled in their classes, and they liked that many users could access course reserves simultaneously, since that is a frequent pattern of use for such materials. In addition, they liked the reports they received documenting what portion of the class actually accessed each reserve item and how many repeat uses of items were recorded.

As one faculty member stated, “I truly believe that my students had access to more timely and accurate literature through using electronic reserves.... Electronic reserves helps me do a better job of providing good readings to my students as well as monitoring their use of them.” Another wrote, “[Electronic reserves] provided all students with instant and continuing access to course materials. Electronic access far outstrips the traditional reserve system for providing access, especially in a high enrollment class. Also, some reserves were... needed for long-term access.”

King's College, London

King's College is focusing on two issues of relevance to libraries: electronic journals and the use of electronic vs. printed information. The team is collecting data by electronic means when possible.

In exploring the topic of electronic journals, the King's team is gathering data on use, cost per transaction/user, usage profiles by journal and by department, and system availability. They are also seeking qualitative data on reasons for use, user satisfaction, and ease of administration. To collect this information, they are examining system logs and administering web-based surveys.
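As a rough illustration of how such quantitative measures can be derived, the sketch below aggregates a hypothetical CSV export of electronic journal accesses into use counts per journal and per department, and divides assumed subscription costs by use to obtain a cost per transaction. The file layout, journal identifiers, and cost figures are invented for illustration and are not drawn from the King's College data.

# A hedged sketch of turning raw access records into the usage profiles
# described above: uses per journal and department, and cost per transaction.
# Record layout, journal names, and costs are assumptions for illustration.
import csv
from collections import Counter

# Hypothetical annual subscription costs, used to derive cost per transaction.
SUBSCRIPTION_COST = {
    "journal-a": 1200.00,
    "journal-b": 850.00,
}

def usage_profile(log_path):
    """Count accesses per journal and per department from a CSV access log
    with columns (timestamp, user_department, journal_id)."""
    by_journal = Counter()
    by_department = Counter()
    with open(log_path, newline="") as handle:
        for row in csv.DictReader(handle):
            by_journal[row["journal_id"]] += 1
            by_department[row["user_department"]] += 1
    return by_journal, by_department

def cost_per_transaction(by_journal):
    """Divide each journal's subscription cost by its number of uses."""
    return {
        journal: SUBSCRIPTION_COST[journal] / uses
        for journal, uses in by_journal.items()
        if journal in SUBSCRIPTION_COST and uses
    }

if __name__ == "__main__":
    journals, departments = usage_profile("ejournal_access.csv")  # assumed file name
    for journal, unit_cost in sorted(cost_per_transaction(journals).items()):
        print(f"{journal}: {journals[journal]} uses, ${unit_cost:.2f} per use")

The qualitative questions listed above cannot be answered from logs alone, which is why the team pairs this kind of aggregation with web-based surveys.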

Their analysis of the use of electronic vs. printed information also includes quantitative data on usage, cost per item/user, use profiles by department, and document availability. Their qualitative assessment addresses reasons for choosing print or electronic information, user preference, user satisfaction, and ease of administration.

The driving forces behind King's assessment project are the need for accountability to users and funding bodies and the desire to improve services where needed. As most of the higher education institutions in the U.K. are making heavy use of and investing in electronic resources as part of the eLib Programme and other efforts, the institutions want to know if those investments are paying off to users and in what ways users are satisfied or dissatisfied with the networked information resources and services.

Challenges of Implementing Assessment Programs

In their manual, McClure and Lopata state some criteria by which they will judge the impact and success of their publication. These include whether:

  • campuses will experiment with assessment techniques;
  • campuses will share information and insights on how assessments can be done more effectively;
  • evaluation research concepts and procedures will move forward in this area;
  • campus decision makers will be able to design and plan more effective networked environments; and
  • data generated will promote incorporation of users' viewpoints into the way the network evolves.

Through the CNI project, a small number of institutions have taken up this challenge. For most of the institutional teams, the road has not been easy. The time and resource investments have been significant and the ever-changing networked environment makes some techniques problematic. However, in some cases there is a strong institutional mandate for the development of assessment measures throughout the university, and in others, there is strong commitment by unit heads to work towards improving services using assessment as a diagnostic tool. In addition, the project provided a mechanism for individuals from many units on campus to coordinate assessment efforts in relation to networks and networked services.

Project team members report that strong support for assessment from top campus and unit administrators has had a positive impact on the amount of resources available for assessment efforts. The surveys and data collection efforts by the institutions involved have enabled them to get a first look at the impact of electronic information services on users and to begin answering the question, “What difference do these electronic resources and services make to users?” The institutional teams have found that a variety of data collection techniques can be useful, from log analysis to user surveys (both print and on the Web) to focus groups and individual interviews. The project has given the institutions experience with a set of tools and a start in establishing a baseline of data on electronic resources and services use for their campuses.

CNI has received support for this project from Indiana University, and Christopher Peebles and his staff have been instrumental in the project's implementation. Charles McClure has been a guiding force in this phase of the project and provided its initial inspiration. In addition, CNI has received support from the Council on Library and Information Resources (CLIR).

Data on Use, Quality, and Costs of Network Services

Christopher Peebles, CNI Visiting Fellow and Associate Vice-President and Dean of Information Technology at Indiana University, has developed an impressive set of survey data that describes use and user satisfaction with an array of services, including IT user support, hardware and software, and e-mail. The materials he uses in his presentations are available at: http://www.indiana.edu/~ucsdcas/jm/ .

To view nine years of Indiana University IT quality surveys, visit: http://www.indiana.edu/~uitssur/

To view the Activity Based Cost data for the central IT organization at IU visit: http://www.indiana.edu/~ucs/business/scindex.htm

Assessing the Academic Networked Environment: Strategies and Options (CNI, 1996) is available at: http://istweb.syr.edu/~mcclure . To order a print copy, send a check for $15 to CNI Publications, Department #0692, Washington, DC 20073-0692.

The Flashlight Project

The Flashlight Project, headed by Steve Ehrmann of the TLT Group, provides a suite of evaluative tools, training, consulting, and other services. The work is based on the Flashlight Current Student Inventory™ (CSI), which can be used to collect facts and opinions from currently enrolled students. The CSI is a toolkit of almost 500 indexed questions that can be used to draft surveys, questionnaires, and protocols for interviews and focus groups.

Sample Flashlight questions and a more detailed description of the CSI are posted at: http://www.tltgroup.org/programs/flashlight.html.

Publication year: 1998
Type of material: Article
Language: English
Published in: ARL: A Bimonthly Newsletter of Research Library Issues and Actions, no. 197 (April 1998)
Publisher: Association of Research Libraries, Washington, DC
Notes: Joan Lippincott is Associate Executive Director, Coalition for Networked Information. Part of a special issue on measures.
Subject: Electronic Resources -- measuring use
Online access: http://www.arl.org/newsltr/197/assess.html