My investigations of library technologies often involve visiting large numbers of library websites. These sites form a body of interesting contrasts. Some approach a state-of-the-art user experience and bring together a rich collection of content and services in a coherent and seamless suite. Others seem a bit more cobbled together and based on outdated design principles. This unevenness in the quality of user experience offered across library resources seems largely a function of financial resources and access to expertise.
Developing web resources involves a variety of skills, such as graphic design, information architecture, and workflow analysis. A high-quality website delivers the content resources and services that are central to a library's mission. It is vital for these sites to have great visual appeal and ease of use. It takes a team of individuals spanning multiple areas of expertise to collaborate in the initial development and ongoing maintenance of a library website. This collaboration may take place entirely within the ranks of the library itself or may be accomplished with the assistance of an external firm or consultant.
Libraries invest in high-quality resources and search tools and create unique content. Especially in larger organizations, many components must be assembled and organized coherently to develop an information-rich environment that is easy to navigate and that offers an appealing interface. The creation of an effective user experience relies on skills from many disciplines (art, graphic design, human-computer interaction) as well as expertise with the tools and technical components that power the site.
Even the most visually polished websites may be unwell internally. Technical misconfigurations, programming errors, or deviations from standards can impede a site's success. Errors may not affect the visual appearance of the site, but in some circumstances, they may lead to inconsistent results, security vulnerabilities, or privacy violations.
Libraries should perform a thorough technical review of a website before placing it into use. A rigorous assessment should be conducted whether the site was developed in-house or by an external consultant or firm. When working with a developer, contract terms should include the validation of each technical protocol and disclosure of all add-in components. Acceptance of the project should include a technical presentation or discussion in which the developers demonstrate the technical validity of their work and explain any use of cookies, JavaScript libraries, add-ins, or other components. Although not intended as a comprehensive checklist, the tools and concepts that follow illustrate some components of a technical review.
Encrypt Everywhere
All pages within a library's website must be delivered using the HTTPS protocol, which encrypts content as it traverses the internet, preventing interception and exposure of personal information. Reviewers should verify that the web browser presents each page as secure, with a valid digital certificate issued to the organization by a trusted certificate authority. All embedded components, including JavaScript libraries, style sheets, code snippets, and images, must be delivered from secure sources. Libraries should treat comprehensive delivery via HTTPS as a fundamental requirement, with no exceptions.
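As a quick first pass on this requirement, a short script can fetch a page and flag any embedded resources still referenced over plain HTTP. The following is a minimal sketch, assuming the requests and beautifulsoup4 packages; the URL is hypothetical, and a thorough review should examine every page, not just the homepage.

```python
# Minimal sketch: flag insecure (http://) resources embedded in a page.
# The URL below is a hypothetical placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://library.example.edu/"
response = requests.get(url, timeout=10)  # raises SSLError on a bad certificate
soup = BeautifulSoup(response.text, "html.parser")

# Check src/href attributes on commonly embedded elements for plain HTTP.
for tag in soup.find_all(["script", "link", "img", "iframe"]):
    target = tag.get("src") or tag.get("href") or ""
    if target.startswith("http://"):
        print(f"Insecure resource: <{tag.name}> {target}")
```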
Validate HTML and Style Sheets
Each page within a library website should be delivered using valid HTML coding. Web browsers often ignore errors and render the page as its designers expected. However, these errors may be treated differently across browsers and may be especially problematic for screen readers. Even the most cursory technical review of a website should test the validity of the HTML it delivers. The World Wide Web Consortium (W3C) provides a free, fast, and reliable validation service to help you do so (validator.w3.org). Likewise, style sheets should be validated. Since any given style sheet may be included in all pages delivered by a server, any errors can propagate widely (see jigsaw.w3.org/css-validator).
Ideally, each page and style sheet should pass validation with no errors. In some cases, the reported errors may be intentional or unavoidable. Fix errors when possible; at a minimum, understand the cause of each error and any circumstances that prevent its elimination.
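Validation can also be scripted for routine rechecks. The sketch below submits a page's HTML to the W3C's Nu validator (the engine behind validator.w3.org) and prints the messages it returns; it assumes the requests package, and the page URL is hypothetical.

```python
# Minimal sketch: submit a page's HTML to the W3C Nu validator and list
# the warnings and errors it reports.
import requests

page = requests.get("https://library.example.edu/", timeout=10)
result = requests.post(
    "https://validator.w3.org/nu/?out=json",
    data=page.text.encode("utf-8"),
    headers={
        "Content-Type": "text/html; charset=utf-8",
        "User-Agent": "site-review-script",  # identify the client politely
    },
    timeout=30,
)
for message in result.json().get("messages", []):
    print(message.get("type"), "-", message.get("message"))
```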
Inspect Cookies
Browser cookies are a basic component of web infrastructure. They store data in a way that enables a site to present multiple interactions with a visitor as a coherent session and to retain settings between sessions. Session cookies used only by a specific website support pragmatic personalization without compromising privacy. Third-party cookies can be shared among multiple sites, with the possibility of carrying personal data to unknown destinations. A technical review of a library website should include a careful inspection and inventory of all of the cookies generated or used by the site and its related components. Developers must be able to detail the origin and purpose of each cookie and the implications for user privacy. Google Chrome offers a convenient tool for inspecting cookies: click the security icon to the left of the current URL to list all of the cookies associated with the site and their contents.
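A first-visit cookie inventory can also be gathered from a script as a starting point for that discussion. Here is a minimal sketch, assuming the requests package and a hypothetical URL; cookies set later by JavaScript will appear only in a browser-based inspection.

```python
# Minimal sketch: list the cookies a site sets on an initial page request.
import requests

session = requests.Session()
session.get("https://library.example.edu/", timeout=10)
for cookie in session.cookies:
    print(f"name={cookie.name} domain={cookie.domain} "
          f"secure={cookie.secure} expires={cookie.expires}")
```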
Most sites, especially those in jurisdictions subject to the European Union's General Data Protection Regulation (GDPR), must notify users and obtain their consent before activating nonessential cookies.
Validate JavaScript
JavaScript enables developers to create sophisticated interfaces and manage data elements beyond what can be accomplished easily with basic HTML coding. As with any other programming language, scripts must be written carefully and thoroughly tested. Support for some JavaScript effects often varies according to rendering engines or browser versions, requiring testing on each of the major web browsers.
JavaScript frameworks and libraries (such as Angular, Bootstrap, jQuery, or React) have been created, tested, and released by professional and reliable development communities. But it is also common to implement JavaScript code snippets from less reliable sources that may not work exactly as expected or may produce unintended side effects, such as introducing tracking agents for advertising networks or other interactions that are inappropriate for a library environment. Any JavaScript coding, whether locally created or borrowed, must be carefully inspected to ensure that it performs only the task intended, that it will operate across all devices, and that the page will still function for visitors who have JavaScript disabled.
Obvious JavaScript errors can be detected with the Developer Tools option for Google Chrome or through similar options in other browsers. The Console section of Chrome's Developer Tools presents any warnings or errors in JavaScript or other coding.
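This console check can be automated across many pages. Below is a minimal sketch using Playwright's Python bindings, one possible approach among several; it assumes the playwright package with its bundled browsers installed, and the URL is hypothetical.

```python
# Minimal sketch: load a page in headless Chromium and report the same
# JavaScript problems that Chrome's Developer Tools Console would show.
from playwright.sync_api import sync_playwright

def log_console(msg):
    if msg.type == "error":
        print("console error:", msg.text)

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.on("console", log_console)                        # console messages
    page.on("pageerror", lambda e: print("uncaught:", e))  # thrown exceptions
    page.goto("https://library.example.edu/")
    browser.close()
```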
Test Mobile-Friendliness
All pages presented through a library website should be usable by any type of device. Libraries should require a responsive design that accommodates mobile, tablet, and full-sized screens. Responsive design must be a fundamental consideration for any web development project. In addition to manual testing of each page on mobile and other devices, Google offers a tool for validating whether a page conforms to basic requirements for mobile use (search.google.com/test/mobile-friendly). Libraries should also keep in mind that Google penalizes pages in its search results that do not work on mobile devices.
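One quick automated signal, though no substitute for testing on actual devices, is whether each page carries the viewport meta tag that responsive layouts depend on. A minimal sketch, assuming the requests and beautifulsoup4 packages and a hypothetical URL:

```python
# Minimal sketch: check for the viewport meta tag that responsive pages need.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://library.example.edu/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")
viewport = soup.find("meta", attrs={"name": "viewport"})
print("viewport meta:", viewport.get("content") if viewport else "MISSING")
```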
Check Support for Disabilities
Library websites must also provide the best possible experience for users with disabilities. This aspect of design may require a specialist to provide advice on the best layout options, color choices, or other considerations. A cluster of standards and practices informs the design and coding techniques for good web accessibility. The W3C offers general guidance (w3.org/WAI/fundamentals/accessibility-intro), and WebAccessibility.com provides a tool for validating the internal coding of pages for accessibility (webaccessibility.com).
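Automated tools catch only a subset of accessibility problems, but some simple checks are easy to script. The sketch below flags images that lack an alt attribute, one of the most common issues; it assumes the requests and beautifulsoup4 packages, and the URL is hypothetical.

```python
# Minimal sketch: flag images with no alt attribute. An empty alt="" is
# legitimate for purely decorative images, so only a missing attribute
# is reported here.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://library.example.edu/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")
for img in soup.find_all("img"):
    if img.get("alt") is None:
        print("image missing alt attribute:", img.get("src"))
```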
Detect Unwanted Cookies and Trackers
It is essential to understand every element within your site that interacts with sites or resources operated by external organizations. Trackers can expose user behavior to advertising networks, ecommerce companies, or other organizations. The placement of these trackers is not necessarily intentional; it may be a side effect of code snippets used for other purposes. Code may also be included to help developers monitor site performance or assess design issues.
Several tools examine pages and report on any embedded elements that may track user actions. Blacklight from The Markup (themarkup.org/blacklight) will inspect a website and report all of the tracking elements it finds, such as ad trackers, third-party cookies, session recording services, the Facebook pixel, and Google Analytics remarketing audiences.
Blacklight reports the organizations associated with any trackers. For example, if the page uses Google Analytics, it will report its interaction with Alphabet, Google's parent company. Elements that raise concern include the Facebook pixel or the Google Analytics remarketing audiences feature, which may transmit session data to their respective organizations. Each element on a webpage that interacts with external services should be explained during a technical review. Any trackers not explicitly authorized by the library should be removed.
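A rough in-house complement to a service such as Blacklight is to scan a page's markup for scripts loaded from well-known tracking hosts. The following is a minimal sketch; the domain list is illustrative rather than exhaustive, the URL is hypothetical, and the requests and beautifulsoup4 packages are assumed.

```python
# Minimal sketch: report scripts loaded from a few well-known tracking hosts.
import requests
from bs4 import BeautifulSoup

TRACKER_HOSTS = ("google-analytics.com", "googletagmanager.com",
                 "connect.facebook.net", "doubleclick.net")

html = requests.get("https://library.example.edu/", timeout=10).text
soup = BeautifulSoup(html, "html.parser")
for script in soup.find_all("script", src=True):
    if any(host in script["src"] for host in TRACKER_HOSTS):
        print("possible tracker:", script["src"])
```

Scanning only the static markup misses trackers injected at runtime, which is why a browser-based service remains the more thorough option.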
Purge Development Widgets
Many tools used for development and testing may include the ability to capture keystrokes and other user interactions. While useful for testing, this type of software should be employed carefully in a library context since it can expose specific data (such as what a user types in for a query, content selections, and other transaction details). These performance and development widgets can be identified by Blacklight or similar inspection services.
It can be quite helpful to embed code in a website that enables developers or designers to assess the speed of the site, to track user behavior, or to iteratively test alternate layouts or designs. Leaving these components in the production site, however, may enable site administrators (or even the organizations that provide the tools) to monitor the site, which could compromise the privacy of visitors. Ensure that performance monitoring and other add-ins needed only for design and development have been removed before the site goes live.
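As with trackers, leftover development widgets can often be spotted with a simple scan of the delivered page. A minimal sketch, assuming the requests package, a hypothetical URL, and an illustrative (not complete) list of common session-recording and analytics hosts:

```python
# Minimal sketch: look for references to common session-recording and
# performance-monitoring services in a production page.
import requests

DEV_WIDGET_HOSTS = ("hotjar.com", "fullstory.com",
                    "mouseflow.com", "crazyegg.com")

html = requests.get("https://library.example.edu/", timeout=10).text
for host in DEV_WIDGET_HOSTS:
    if host in html:
        print("development/monitoring widget still referenced:", host)
```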
Keep After It
The technical review of the internals of a library website should be an ongoing conversation between the key library personnel responsible for the website and the technologists involved in its construction and maintenance. Adherence to standards and conformance with all applicable privacy policies are administrative concerns, not subject to a developer's discretion. Especially when external consultants are used, the technical developers may focus more on the features that third-party services provide than on the privacy issues they raise. Libraries should also perform periodic technical reviews of their sites to ensure that ongoing changes have not introduced errors or unintended tracking agents.
An ongoing review process benefits the organization by making those responsible for the services and content provided by the website more aware of its technical underpinnings and associated vulnerabilities. These reviews also ensure that technical developers gain a better appreciation of the policy and administrative implications of each site component.
Marshall Breeding is an independent consultant, writer, and frequent library conference speaker and is the founder of Library Technology Guides (librarytechnology.org). His email address is marshall.breeding@librarytechnology.org.