Benchmarking technology: A theory of penultimacy

Computers in Libraries [March 2003] The Systems Librarian

by Marshall Breeding

In making decisions about procuring technology components for a library, I could reduce my opinions on the topic to a single word, penultimate, but that would make for a very short column. The word fits, though. The dictionary defines penultimate as "the next to last member of a series." When it comes to technology decisions, look not for the latest and greatest available today, but consider buying a notch down from the top. If you follow this rule of thumb, your library will benefit from computer equipment that performs at a more-than-adequate level while providing great value for the dollars spent.

When managing the technical infrastructure, a systems librarian (or equivalent) must regularly make decisions about what hardware and software will meet the library's needs. Many factors come into play, including the functionality delivered, cost constraints, relative value, and level of risk. These factors apply to the broad types of technologies under consideration as well as to the specifications of items in individual categories of hardware and software. Whether you are considering which type of network media to install, which operating system to deploy, or what model of desktop computer to purchase, you can use the penultimate principle.

While this approach works well for libraries, it may not be well-suited for other environments. Some organizations place much higher demands on their computing equipment and need to buy much closer to the leading edge of what's available; companies that deal with mathematical modeling, engineering, or high-transaction financial applications come to mind. Other organizations have minimal computing needs and small budgets and need to buy at the low end. My experience in acquiring technologies for libraries leads me to this approach, but it doesn't necessarily generalize to other settings.

One of the main benefits of the principle of penultimate buying lies in its protection from new technologies that companies release with hype and promise but that then fail to catch on. A number of technologies introduced over the years initially seemed to be the wave of the future; notable examples include the OS/2 operating system and OSI networks. These technologies failed in the marketplace at the first generation of deployment. Those who invest heavily in a technology that never makes it into the industry mainstream can experience expensive technological setbacks. Waiting for the second generation of a product cycle greatly reduces this type of risk.

Buying Computer Hardware

It rarely makes sense for a library to purchase the highest-performance technologies available. In the cycles of technology development, you pay quite a premium for buying the latest and greatest available. You can easily pay double the price for a computer based on the processor that has just been released. While its performance may be stunning, the relative value may be low given the high price and a lack of applications that can genuinely take advantage of its top-end performance.

Libraries rarely have applications that demand state-of-the-art performance. While we generally want to keep an arsenal of computers that run today's software with reasonable speed, few libraries have applications where super-high performance will yield increased productivity. Even the servers that run our automation systems and other Web-based applications work quite well with penultimate technology.

It also doesn't make sense to buy at the low end of available technology. Each generation of computer operating systems and application software makes greater demands on hardware; processor speed, memory, and disk storage need to keep pace. There comes a time when a computer will no longer run current software with satisfactory performance. Most libraries expect about 5 years of service from each computer they purchase. If you purchase hardware that already sits at the low end of performance, you will find that it feels obsolete 1 or 2 years sooner. In most cases, penultimate hardware over a 5-year service period will work out to be less expensive, per year of use, than a low-end computer retired after 3 years.
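To make that cost-per-year comparison concrete, here is a minimal sketch; the dollar figures and service lives are hypothetical, chosen only to illustrate the arithmetic rather than taken from actual quotes.

```python
# Minimal sketch of the cost-per-year-of-service comparison.
# The prices and service lives below are hypothetical examples,
# not figures from this column.

def cost_per_year(purchase_price, years_of_service):
    """Annualized cost of a machine over its useful life."""
    return purchase_price / years_of_service

# A low-end machine that feels obsolete after 3 years versus a
# penultimate-class machine that remains serviceable for the full 5.
low_end = cost_per_year(900, 3)        # $300 per year
penultimate = cost_per_year(1300, 5)   # $260 per year

print(f"Low-end:     ${low_end:.0f} per year of service")
print(f"Penultimate: ${penultimate:.0f} per year of service")
```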

The cycle of technology development runs from basic research to technical development to early production to mass production. Products do not stay in mass production long; they are quickly displaced by the products of the next cycle. Once they drop out of mass production, they hit the clearance shelf. Again, these products can be cheap, but they may not be a good value given their progress on the road to obsolescence. Buying the hardware currently in mass production will generally yield the best value. If you look at the prices of top-end technology in early production today, in only a few months you'll generally find the same level of hardware available at significantly lower cost.

So last year's top-end technology translates into this year's commodity. After about a year, the technology has settled into stability and has entered the realm of mass production, significantly reducing both risk and price. Today, for example, you can buy a computer based on a 3.06-GHz processor. Computers with this kind of power have just been released, targeting primarily the high-performance workstation market. While I might really covet a machine with that kind of performance, it would be hard to justify the 30 to 40 percent increase in cost compared to the same system configured with a 2.4-GHz processor.

Something that I've caught myself saying lately is that you would have to try pretty hard today to get a bad deal when buying a computer. Prices for hardware seem lower than ever. It used to be that the amount you paid for a business-class computer stayed constant at about $1,500, but what you got for that money was significantly more powerful every year. Lately it seems that the price itself is dropping, with even larger increments in computing power. With the consolidation that has taken place among hardware manufacturers, those that remain all seem reputable, competing strongly on quality of equipment and service.

Making Software Decisions

I follow the same approach when making decisions about software, especially operating systems. The same broad concerns apply: you don't want to run an operating system with waning support, but you also need to avoid the problems associated with newly minted releases. In the OS arena, it generally takes about a year before a new version proves itself stable enough for production use. That applies to both desktops and servers.

I get lots of inquiries about what operating system a library should use for its desktop computers. First of all, I steer libraries toward the class of operating systems designed for business use, not those built for the home consumer market. Windows NT, Windows 2000, and Windows XP Professional Edition will deliver a better computing environment than the Windows 95, 98, ME, and XP Home Edition lines.

Among the business-class versions of Microsoft operating systems available today, Windows 2000, as the penultimate choice, would prevail in my judgment. Windows NT, which we used for a number of years with good results, no longer has good support; while some security-related fixes continue to be released, it is clearly obsolete as a production operating system. And while I use Windows XP on my home computers, it is only now reaching the level of stability needed for wide deployment in a production environment where we need to support hundreds of machines. Again, the penultimate choice fits the bill for a stable, low-risk operating environment.

Example: Wireless Options

This principle of penultimacy can be applied to specific choices of hardware and software, and it applies to standards and protocols as well. If we look at this issue's theme of wireless technologies, we can see how the approach might guide us. Today, multiple types of wireless networking equipment are available, each based on a different underlying transmission standard. The original Wi-Fi equipment, based on the 802.11b standard, has been around for a while. Its throughput of 11 Mbps on the 2.4-GHz band seems fast compared to dial-up connectivity but falls short compared to wired networks. In the last year or so, equipment based on the 802.11a standard has come out. It uses the 5-GHz band and can deliver up to 54 Mbps, quite an improvement over the original version. The main problem lies in its lack of interoperability with existing 802.11b equipment. The flavor of wireless that is just becoming available, called 802.11g, provides the higher throughput of 54 Mbps while offering compatibility with existing 802.11b equipment, since it also works on the 2.4-GHz frequency. (Note that all the throughput values given here represent the theoretical maximums possible with each protocol; real-world values prove to be less than half of those figures.)
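As a quick reference, the sketch below tabulates the three options just described and applies the rule of thumb that real-world throughput falls below half of the theoretical maximum; the 0.5 factor is simply that rule of thumb, not a measured value.

```python
# Summary of the wireless options discussed above. The 0.5 factor reflects
# the observation that real-world throughput runs below half of the
# theoretical maximum; it is a rule of thumb, not a measurement.

WIRELESS_OPTIONS = [
    # (standard, band in GHz, theoretical max in Mbps, 802.11b-compatible)
    ("802.11b", 2.4, 11, True),
    ("802.11a", 5.0, 54, False),
    ("802.11g", 2.4, 54, True),
]

REAL_WORLD_FACTOR = 0.5

for standard, band_ghz, max_mbps, b_compatible in WIRELESS_OPTIONS:
    realistic = max_mbps * REAL_WORLD_FACTOR
    compat = "yes" if b_compatible else "no"
    print(f"{standard}: {band_ghz} GHz band, up to {max_mbps} Mbps theoretical "
          f"(under {realistic:.0f} Mbps in practice), 802.11b-compatible: {compat}")
```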

Among the choices in wireless networks, the broad category that best fits the next-to-last principle is 802.11a. The risk factor of 802.11a networks is low, since there is little doubt that this technology will catch on; its deployment is already widespread enough to give confidence that it will hit the mainstream. While the costs of the early 802.11a products were quite high, you can now buy access points and wireless cards at prices only a notch above 802.11b equipment. A common genre of equipment available now hedges the interoperability problem by marrying the 802.11a and 802.11b standards. These hybrid access points transmit and receive on both standards, allowing computers with 802.11a cards to operate at the higher level of performance while still supporting the large base of 802.11b cards that continue to prevail. The 802.11g flavor of wireless networking, despite its technical advantages, remains a little too risky and high-priced for the typical library's needs. Manufacturers continue to wrangle over the details of the standard, its availability is limited, and the cost of the equipment is still fairly high.

Room Left for Judgment

While the approach of buying penultimate technology works as a broad principle, it doesn't eliminate the need to carefully study and review all the available options. It's easy to say in general terms that the best values lie a step down from the current state of the art, but there is a lot of variation in how big that step might be. The principle also doesn't settle configuration details such as the amount of memory and disk space, the video chipset, or the amount of processor cache, much less the choice among companies and brands. While the devil is in the details, having a good benchmark can be of great help as you work through the complex process of selecting and procuring the technology that makes your library run well. I've found that selecting penultimate technology works well toward the ultimate goal: finding the best value, the best possible technology at the lowest cost.

Publication Year: 2003
Type of Material: Article
Language: English
Published in: Computers in Libraries
Publication Info: Volume 23, Number 03
Issue: March 2003
Page(s): 44-46, 52
Publisher: Information Today
Series: Systems Librarian
Place of Publication: Medford, NJ
Notes: Systems Librarian column
ISSN: 1041-7915