I think it's incredibly important for libraries and other organizations to constantly strive to create a web presence that's in tune with the needs and expectations of their users. As we implement a new website design or add new technology components, it's essential to accumulate evidence that gauges whether each major change has a positive or negative impact.
We can use many different techniques to help a library measure how the technologies it implements live up to user expectations. One approach might be to conduct surveys and focus groups. These gather information from representative samples of the library's users about their interest in a new product or in proposed changes. While that's a great way to collect information, and one that I hope all libraries employ, I want to focus on techniques that assess the technologies' performance based on the analysis of use data.
One key point about our websites is that we have few opportunities to personally interact with the users. Sure, we might be able to intercept some of those folks who use our sites from within the library. We can also solicit students from around campus to participate in focus groups or usability studies. Most users, however, visit from remote locations, get what they need (or not), and move on. Few respond to online surveys. How, then, can we obtain information from patrons about their use of our web-based resources?
We can make at least some judgments about the technologies that resonate with patrons based on what they choose to use within our web environments and what they ignore. I believe that organizations can follow a kind of digital archaeological approach to determine what users think of their websites by examining actual use patterns. This method relies on careful analysis of any tracks left behind as each user navigates in and out of the site. Sources of this data might include web server logs or the logs and statistical reports from applications such as online catalogs, search engines, federated search tools, or any other applications with interfaces that the public uses.
I've found that web servers and the applications that deliver content and services through the web create enormous amounts of data that are relevant to describing user experiences. The hard part is finding time to process and analyze that data and then turn it into meaningful information that can inform decisions about our future directions.
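To give a sense of what that processing can look like in practice, here's a minimal sketch in Python that tallies requests per day from an Apache-style combined log. The file name and log format are assumptions; adjust both to match whatever your own server actually writes.

    import re
    from collections import Counter
    from datetime import datetime

    # Matches the timestamp of an Apache "combined" log entry, e.g.
    # 127.0.0.1 - - [10/Oct/2007:13:55:36 -0500] "GET /index.html HTTP/1.1" 200 2326 "-" "Mozilla/5.0"
    TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')

    per_day = Counter()
    with open('access.log') as log:              # file name is an assumption
        for line in log:
            m = TIMESTAMP.search(line)
            if not m:
                continue
            day = datetime.strptime(m.group(1), '%d/%b/%Y').date()
            per_day[day] += 1

    for day in sorted(per_day):
        print(day, per_day[day])

Even a crude tally like this, refreshed weekly, begins to show the rhythm of a site's traffic.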
When it comes to the marketplace, it's said that people vote with their pocketbooks. On the web, each keystroke and mouse click, or its absence, registers as a vote. Every path that a website user selects contributes to a body of evidence that describes the site's overall usability and usefulness. The challenge for the creators is to discover the patterns that reveal the site's strengths and weaknesses and the appeal of the resources contained within.
Establishing the Baseline
In order to determine whether a site's new service or design change has a positive or negative impact, it's important to have a good understanding of what constitutes normal patterns of use. I try to look at the general levels of activity for each of the sites that I maintain at least once a week. This way, I have a good understanding of the typical levels of traffic per day and the peaks and valleys of activity according to the time of day and the days of the week. I've learned to expect, for example, that traffic will build through the week, fall off some by Friday, and dramatically decline over the weekend. The sites that target users from the academic arena follow distinct seasonal patterns. Activity peaks toward the last third of the semesters, drops precipitously during exam periods, and tracks at a moderate level during the summer.
Some of the other patterns that I look for include the average time for a user session, which individual pages get the most use, the number of items viewed per session, which pages are most used to enter the site, and which pages are common exit points.
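For those who want to compute figures like these directly from the raw logs, here's a rough sketch that groups requests into sessions by client address with a 30-minute inactivity timeout, then reports average session length and the top entry and exit pages. Treating each IP address as one user is an assumption that proxies and shared machines will blur, so take the results as approximate.

    import re
    from collections import Counter
    from datetime import datetime, timedelta

    # Pulls the client address, timestamp, and requested page from a combined log entry.
    ENTRY = re.compile(r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST) (\S+)')
    TIMEOUT = timedelta(minutes=30)              # gap that ends a session; an assumption

    sessions = {}        # ip -> (start, last_seen, first_page, last_page)
    entries, exits, durations = Counter(), Counter(), []

    def close(sess):
        start, last, first_page, last_page = sess
        entries[first_page] += 1
        exits[last_page] += 1
        durations.append((last - start).total_seconds())

    with open('access.log') as log:              # assumes the log is in time order
        for line in log:
            m = ENTRY.match(line)
            if not m:
                continue
            ip, stamp, page = m.groups()
            when = datetime.strptime(stamp.split()[0], '%d/%b/%Y:%H:%M:%S')
            sess = sessions.get(ip)
            if sess and when - sess[1] > TIMEOUT:
                close(sess)
                sess = None
            sessions[ip] = (when, when, page, page) if sess is None else (sess[0], when, sess[2], page)

    for sess in sessions.values():               # flush sessions still open at the end of the log
        close(sess)

    print('average session length (seconds): %.0f' % (sum(durations) / max(len(durations), 1)))
    print('top entry pages:', entries.most_common(5))
    print('top exit pages:', exits.most_common(5))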
Some of the most valuable information reveals how folks got to the site. Most analytics packages can capture the referral data that indicates which site led each user to yours. If that site is a search engine, it's possible to reconstruct the query that the user entered to find your site. I find it fascinating and well worth the time to study the search engine queries that drive users to a given website.
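If you'd like to pull those queries out of your own logs, the sketch below reads the referrer field of a combined log and extracts the query term. The parameter names it checks (q, p, query) cover the major engines, but that list, like the file name, is an assumption to adjust for your own traffic.

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # The referrer is the second-to-last quoted field in a combined log entry.
    REFERRER = re.compile(r'"([^"]*)" "[^"]*"$')
    PARAMS = ('q', 'p', 'query')     # common search-engine query parameters; an assumption

    queries = Counter()
    with open('access.log') as log:
        for line in log:
            m = REFERRER.search(line.rstrip())
            if not m or not m.group(1).startswith('http'):
                continue
            # Any referring URL carrying one of these parameters is treated as a search.
            params = parse_qs(urlparse(m.group(1)).query)
            for key in PARAMS:
                if key in params:
                    queries[params[key][0].lower()] += 1
                    break

    for term, count in queries.most_common(20):
        print(count, term)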
Tools of the Trade
There are lots of tools available to measure website use. The category of web analytics applications includes a wide range of products that generate reports and perform various types of analysis, turning the raw data of web server logs into meaningful information about how users approach your site. Most of these applications rely on the log files that web servers generate. If you have access to your server's log files, you can generate reports for any historical period, provided that you haven't deleted the logs. Some free, open source web analytics packages are easy to find; others, aimed at the higher-end ecommerce arena, can cost tens of thousands of dollars.
I use Google Analytics to help me track the websites that I maintain, which include Library Technology Guides (www.librarytechnology.org), the Vanderbilt Television News Archive (http://tvnews.vbe.proxy.library.vanderbilt.edu), Electronic Tools and Ancient Near Eastern Archives (www.etana.org), and various other resources for the Vanderbilt University Libraries.
Google Analytics (www.google.com/analytics) is a free service with a very sophisticated set of features. It works by inserting a JavaScript snippet on each page that registers every web page impression into the Google Analytics database. I was initially a bit reluctant to use this approach because I was concerned there might be some intrusion into the user's privacy. After some investigation, I eventually realized that no personally identifiable information was being recorded or transmitted and that utilizing this tool did not violate the privacy of the site's users.
To get started with Google Analytics, you must have a Google account, work through some basic configuration details, and register each site that you want to monitor. You'll need to place on each server a small text file that contains your Google Analytics security key; this verifies that you are authorized to monitor the activity. Next, you'll need to add the JavaScript snippet to each page of your site. If you use a content management system, or at least use server-side includes for headers or footers, you can add the snippet globally. Once those steps are complete, the data collection process commences, and you're ready to begin using the Google Analytics interface to view activity. After a week or so, you'll have enough information to begin understanding the normal thresholds of activity for your environment.
Google Analytics, which was designed for the ecommerce arena, specializes in revealing whether a website achieves its goals, based on specific patterns of activity. On an ecommerce site, it would be important to track how often users select items for purchase and complete the checkout process. For a library website, you would instead select activities that indicate a successful visit.
One of the disadvantages of Google Analytics is its inability to report on activity prior to the installation of the JavaScript snippets. Some organizations use a log-based analytics application to complement this tool so that they have a wider variety of user-activity reports available and can look at historic use patterns.
Evaluating Specialized Apps
Google Analytics and other web server usage-reporting applications work well for tracking general website use. Libraries tend to have many other applications within their environment that perform specialized services, such as their online catalog, federated search applications, link resolvers, digital collections, and repositories, to mention a few. These specialized applications may or may not work well with the general web analytics tools. Most offer their own internal mechanisms for reporting use and may produce their own log files that you can mine for usage trends.
Just as with the general website, it's important to understand normal patterns of use with these specialized applications. When possible, look for patterns that shed light on their effectiveness. For search-oriented applications such as online catalogs and digital repositories, you can study the percentage of searches that produce no results and the percentage that return too many. You can also review the search terms that users commonly enter and compare them to the collection's actual content. Such comparisons can reveal common misperceptions among users regarding the type of content available.
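As an illustration, here's a small sketch that computes the zero-result rate and lists the most common failed queries. It assumes a hypothetical tab-delimited export with the query in the first column and the hit count in the second; every catalog and repository logs searches differently, so the parsing will need to be adapted to whatever your system actually records.

    from collections import Counter

    total = 0
    failed = Counter()

    # Hypothetical export: one search per line, query <TAB> number of results.
    with open('searches.tsv') as log:
        for line in log:
            query, hits = line.rstrip('\n').split('\t')[:2]
            total += 1
            if int(hits) == 0:
                failed[query.lower()] += 1

    zero = sum(failed.values())
    print('searches: %d, zero results: %d (%.1f%%)' % (total, zero, 100.0 * zero / max(total, 1)))
    print('most common failed searches:')
    for query, count in failed.most_common(10):
        print(' ', count, query)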
When possible, try to identify patterns of use that demonstrate a successful visit, such as viewing or printing an image, placing a hold on a book, or any other sequence of events that indicates the user found a particular item as a result of a search. For the Vanderbilt Television News Archive, I regard any video viewed online or a request for a loan as a successful session. For these search-oriented environments, the average number of sessions per day and some measure of the successful versus failed sessions serve as good benchmarks.
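One rough way to produce that benchmark from a web server log is sketched below. It treats each client address seen on a given day as one session, and the /play and /request paths that mark a successful session are hypothetical placeholders; substitute whatever URLs signal success on your own site, whether that's a hold placed, an image downloaded, or a loan requested.

    import re
    from collections import defaultdict
    from datetime import datetime

    ENTRY = re.compile(r'^(\S+) \S+ \S+ \[(\d{2}/\w{3}/\d{4}):[^\]]*\] "(?:GET|POST) (\S+)')
    SUCCESS = ('/play', '/request')          # hypothetical paths that signal a successful visit

    days = defaultdict(lambda: defaultdict(bool))    # day -> client address -> succeeded?

    with open('access.log') as log:
        for line in log:
            m = ENTRY.match(line)
            if not m:
                continue
            ip, day, page = m.groups()
            days[day][ip] = days[day][ip] or page.startswith(SUCCESS)

    for day in sorted(days, key=lambda d: datetime.strptime(d, '%d/%b/%Y')):
        total = len(days[day])
        good = sum(1 for ok in days[day].values() if ok)
        print('%s  %d sessions, %d successful (%.0f%%)' % (day, total, good, 100.0 * good / total))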
Action-Oriented Analytics
Once you've established the benchmark that represents typical use, you can start using the tools to measure the impact of changes within the environment. Google Analytics excels at this. In the commercial sector, it's important to measure the effectiveness of an advertising campaign. Naturally, Google is interested in demonstrating the value of its AdWords program. Does the advertising investment result in increased activity and the sale of more widgets?
The same methodology works with sites outside the commercial arena, such as ours. Does the addition of a new service result in a higher level of activity for the overall website? Does the volume of activity for the new service indicate that users are able to find it, and is it being used successfully? When making website design changes, can you see differences in the overall traffic on the site and the relative popularity of the various pages and services that are available?
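A simple before-and-after comparison can answer that kind of question. The sketch below splits daily request counts at a launch date and reports the change in average daily traffic; the date and the log file name are placeholders to replace with your own.

    import re
    from collections import Counter
    from datetime import date, datetime

    LAUNCH = date(2008, 3, 1)                # placeholder for the day the change went live
    TIMESTAMP = re.compile(r'\[(\d{2}/\w{3}/\d{4}):')

    per_day = Counter()
    with open('access.log') as log:
        for line in log:
            m = TIMESTAMP.search(line)
            if m:
                per_day[datetime.strptime(m.group(1), '%d/%b/%Y').date()] += 1

    before = [hits for day, hits in per_day.items() if day < LAUNCH]
    after = [hits for day, hits in per_day.items() if day >= LAUNCH]

    if before and after:
        avg_before = sum(before) / len(before)
        avg_after = sum(after) / len(after)
        print('average daily requests before: %.0f, after: %.0f (%+.1f%%)'
              % (avg_before, avg_after, 100.0 * (avg_after - avg_before) / avg_before))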
If one of an organization's goals is to continually increase the effectiveness of its web-based resources, then web analytics tools provide important information. They allow you to make changes, measure their impact, and make a decision about whether to keep or roll back the change based on user response. Just engaging in the process keeps the library in a mode of continual improvement that's focused on the user and built on empirical evidence. This approach seems better to me than the informal processes that are based on subjective opinion and anecdotal observations.
Libraries have a lot at stake regarding the effectiveness of their web-based resources. For many, the balance between physical facilities and virtual presence increasingly tips toward access through the web. I've briefly outlined a few practical approaches that you can follow to gain a better knowledge of how users respond to your website and other web-based resources. Other techniques might work just as well, if not better. My point is to be proactive and follow some type of analytical approach in measuring the use patterns of your resources to help you understand what patrons really want.