Elevating Tech Skills for the Cloud

Computers in Libraries [October 2017] The Systems Librarian

by Marshall Breeding

Those working in technology constantly need to fine-tune their skills. Each phase of technology brings its own set of architectural concepts, programming languages, and computer infrastructure. It's essential to hone expertise and practical skills that are suitable for the current environment and for emerging trends. Those entering the profession will naturally focus on current and leading-edge technologies as they select courses, internships, or other educational activities.

Established technologists will also need to continually refresh their repertoire. Concepts and programming languages previously learned will continue to be useful and inform a broader perspective, but do not obviate the need to master the technologies prevalent in each new phase. Technologists who do not evolve and expand their horizons will likely see diminished opportunities professionally.

Over the course of my career, technologies have changed drastically, and I have benefited from many opportunities to learn. My earliest library work came at the time of mainframe computers, allowing me to use technologies such as the VSE OS, CICS transaction processing, the Bisync telecommunications protocol, the BAL programming language, and SAS for reporting and statistics. As a networking specialist, I worked with VTAM networking subsystems, front-end processors, IBM 7171 communications controllers, and an assortment of other components needed to connect chains of 3270-compatible display terminals. While little of this knowledge has any relevance today, many of the same concepts still apply. Working closely with the hardware in the mainframe era gave me a helpful background as each successive generation has introduced more abstract models of technology.

The age of microcomputers came next, ushering in a dramatically different slate of technologies. At that time, useful skills included proficiency in MS-DOS and creating scripts of batch commands. Those working with hardware would often need to upgrade memory and install or format disk drives and other components. In the early days of PCs, technical support involved working closely with hardware components, drivers, and arcane configuration procedures. The programming languages I used at the time included Pascal, Basic, and C.

The age of client/server computing brought together servers (more powerful than the mainframes of the previous generation) with high-speed networks and PCs. Useful skills included Windows or UNIX server administration, configuration of network protocols such as IPX or TCP/IP, and configuring and troubleshooting desktop client software. Client/server computing required a broad range of skills related to the hardware and supporting infrastructure components throughout the distributed environment.

Various types of web-based computing have become a growing part of the technology infrastructure for the last 2 decades. In the early phase of web applications, common approaches included components such as Linux servers, the MySQL relational database, and scripting languages such as Perl and PHP. This basic model of computing remains pervasive, but with much more sophistication and complexity, as well as a vast array of supporting frameworks, toolkits, and specialized infrastructure components. A web developer today would be able to take advantage of JavaScript libraries (such as Bootstrap or jQuery) and might write code for user interfaces in Ruby on Rails or Python, while creating a server-side application in Java or C++.

Working Higher in the Stack

Cloud computing stands out as one of the most important technology trends of the current era. This broad umbrella covers specific types of computer infrastructure deployment such as infrastructure-as-a-service (IaaS), software-as-a-service (SaaS), or application service providers. These cloud technologies share the key precept of relying on computing infrastructure housed and managed by a service provider rather than operating servers on premises. Service providers can take advantage of the efficiencies of managing large data centers, allocating only the specific level of computing power needed for any given customer application. These data centers can have very high levels of availability and security, deploying redundant hardware and software components, as well as specialized engineers to design and manage storage, computing, security, and networking.

Cloud computing providers invest in all these layers of hardware and software so that their customers do not have to. This trend means that specialists in server infrastructure are gravitating toward the organizations operating large data centers. Libraries, as consumers of cloud services, will progressively move away from the need to house and manage local servers and related infrastructure. Most libraries today are in a transition phase in which they might subscribe to some aspects of their supporting technical environment via SaaS and continue to operate other applications through software running on servers they manage in-house. If the current trends persist, it seems likely that in the next 5 years few libraries will find it viable to operate local servers.

Even for scenarios in which the library might want to manage an application directly, it is likely to be deployed on servers and storage residing on IaaS providers, such as Amazon Web Services (AWS). Libraries involved in local development of applications today are likely to work with instances of infrastructure from AWS, given the ease and low expense involved in launching servers, OSs, databases, and other supporting modules relative to procuring the same components in the physical world. As existing server equipment reaches the end of its life, it is much more likely that an organization will shift to some type of cloud or hosting arrangement rather than purchase new servers it will then need to manage for the next decade.

This trend away from local computing to cloud computing comes with major ramifications for people involved with computing in libraries. Those involved with hardware and OS administration will likely see less demand for that work in the future. This change can be positive for both the technologist and the library. Systems administrators will need to spend less time on behind-the-scenes tasks such as patching and tuning OSs and databases, work that is seldom understood or appreciated by others in the library outside the technology department. Freed from those chores, technologists will have more time for aspects of the technology environment that produce more conspicuous results. Rather than working in the lower tiers of the technology stack, activity moves upward. In most cases, this results in more interesting and rewarding work for the technologists. The library benefits because routine systems administration tasks no longer compete for its technologists' attention. Instead, efforts can be focused on more tangible activities (such as improving efficiency and workflows for library personnel) and on continually improving the interfaces and services offered to library patrons.

Most of the concepts and skills applicable to web programming remain important as a library moves the deployment of its services from local hosting to cloud services. However, the overall portfolio of skills needed changes shape substantially. While it is a moving target, there are some areas in which it would pay to invest time and attention to acquire or deepen skills. A few such topics for technologists to master in the area of cloud technologies are outlined below.

Deployment of IaaS. In the same way that library technology personnel would be expected to have current knowledge of the hardware arena and be able to size the technical specifications of a server appropriately for any given application, it is likewise important to understand how to deploy appropriate resources for a project via an IaaS provider. AWS has become the dominant provider, although some organizations may work with one or more of its competitors. While the specific offerings vary among IaaS providers, most of the concepts and service options apply one way or another. Some projects may require only a simple environment (such as a server instance with a specific OS, a preinstalled database management system, web services, and other software components) with a particular level of processing ability, memory, and storage capacity.
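To make this more concrete, the following is a minimal sketch of launching a small server instance on AWS, assuming the Python boto3 library and an account with credentials already configured; the AMI ID, key pair name, and sizing values are placeholders rather than recommendations.

```python
# A minimal sketch of launching a small server instance on AWS with boto3.
# The AMI ID, instance type, key pair, and volume size are placeholders;
# substitute values appropriate for your own account and region.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder AMI (e.g., a Linux image)
    InstanceType="t2.micro",           # modest CPU/memory for a small project
    KeyName="my-key-pair",             # placeholder SSH key pair name
    MinCount=1,
    MaxCount=1,
    BlockDeviceMappings=[{
        "DeviceName": "/dev/xvda",
        "Ebs": {"VolumeSize": 20},     # 20 GB of storage for this instance
    }],
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched instance:", instance_id)
```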

Other projects may require clusters of multiple servers to power production services. More complex environments may require independent containers be deployed across the server clusters, using technologies such as Docker managed through environments such as Kubernetes. You will often need load balancers for applications with hightransaction loads to route requests to available containers supporting the requested service. For most libraries, the deployment of the IaaS environment can be easily managed by one of its technical staff members. Others involved with very complex environments may need to hire a specialist or invest in training existing technologists in the nuances of IaaS deployment and orchestration.
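As an illustration of the container side of this work, here is a brief sketch using the Docker SDK for Python (the docker package); the image name, ports, and replica count are illustrative, and a production deployment would normally hand this orchestration to a platform such as Kubernetes rather than a single script.

```python
# A minimal sketch of starting replicated containers with the Docker SDK
# for Python. The image and port mappings are illustrative only.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run two replicas of a small web service image, each mapped to its own
# host port, roughly simulating the containers a load balancer would route to.
replicas = []
for host_port in (8081, 8082):
    container = client.containers.run(
        "nginx:alpine",                # placeholder service image
        detach=True,                   # run in the background
        ports={"80/tcp": host_port},   # map container port 80 to a host port
        name=f"demo-service-{host_port}",
    )
    replicas.append(container)

for c in replicas:
    print(c.name, c.status)
```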

APIs. For those involved with programming, especially with functionality or data exchange across different systems or services, it is essential to have a strong grasp of APIs. Although not specific to cloud technologies, APIs have become the basic glue for assembling services into coherent applications or interfaces. As technical infrastructure becomes increasingly abstract or virtual, access to data through native file access or even SQL commands becomes less feasible. Multitenant platforms, for example, cannot allow any user to gain access via low-level mechanisms, since they may support multiple organizations that each expect their data to remain private.

However, APIs are able to provide controlled access to data in multitenant platforms. This issue is representative of the shift from a local environment to cloud-based services, in which mechanisms closely tied to specific hardware or software components are superseded by higher-level APIs that perform similar tasks but are agnostic to the underlying infrastructure. As programmers work with APIs, they will also need to be skilled in working with serializations such as JSON or protocol buffers. XML, once a favored syntax for data exchange, sees less use in current applications as more compact and efficient formats gain favor.
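A short sketch of this style of access, using the Python requests library, appears below; the endpoint URL, API key, and response fields are hypothetical placeholders rather than the API of any particular library platform.

```python
# A minimal sketch of calling a REST API and working with its JSON response.
# The endpoint, credential, and response fields are hypothetical examples.
import requests

BASE_URL = "https://api.example.org/v1"      # hypothetical API endpoint
API_KEY = "replace-with-your-key"            # credential issued by the provider

response = requests.get(
    f"{BASE_URL}/items",
    params={"query": "cloud computing", "limit": 5},
    headers={"Authorization": f"Bearer {API_KEY}"},  # controlled, per-tenant access
    timeout=10,
)
response.raise_for_status()                  # surface HTTP errors explicitly

# The payload arrives as JSON rather than through file- or SQL-level access.
for item in response.json().get("items", []):
    print(item.get("id"), item.get("title"))
```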

Microservices architecture. Especially in large-scale and complex environments, the microservices architecture has rapidly gained favor as the preferred approach for creating web-based applications. Rather than bundling code and libraries into a single executable module, this approach breaks functionality into many separate services. Each one resides on its own technology infrastructure, including OS, data stores, and programming environment. A microservice is programmed to perform a specific task, and multiple services may be chained together to perform complex operations. Since a microservice may not require the full resources of a physical or virtual server, microservices are usually deployed via Docker containers, which can easily be initiated or replicated as needed. Applications may include an API gateway that routes requests to the appropriate container offering a needed service.
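The following is a minimal sketch of what a single microservice might look like, written with the Flask framework; the route and sample data are illustrative only, and in practice the service would be packaged into a container and reached through an API gateway alongside other services.

```python
# A minimal sketch of one narrowly scoped microservice using Flask.
# The route and sample data are illustrative placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

# Each microservice manages only the data within its own scope of functionality.
ITEM_STATUS = {"b1001": "available", "b1002": "checked out"}

@app.route("/status/<item_id>")
def item_status(item_id):
    """Return the availability status for a single item."""
    status = ITEM_STATUS.get(item_id)
    if status is None:
        return jsonify({"error": "unknown item"}), 404
    return jsonify({"item": item_id, "status": status})

if __name__ == "__main__":
    # Listen on all interfaces so the service is reachable inside a container.
    app.run(host="0.0.0.0", port=5000)
```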

Although conceptually similar to the service-oriented architecture that previously (and often still) prevails for building complex enterprise applications, microservices differ in substantial ways. Rather than relying on an enterprise service bus, which performs many supporting tasks for services, microservices usually operate without the assistance of complex middleware. Microservices also break apart the complex databases often associated with applications created under the service-oriented architecture. While there may be occasions in which data is derived from shared data stores spanning the application, in most cases, each microservice independently manages any data needed within its scope of functionality. Microservices, each built by a small team, enable flexible and rapid software development: new functionality can be created in new services, or existing services can be enhanced independently, eliminating the need to rebuild and retest the entire application for each minor enhancement.

In general, cloud computing calls for far more abstract methods of working with technology than any previous era. These concepts are just some general areas that technologists preparing for more involvement in cloud computing might consider as they update their portfolio of skills. This isn't meant as a comprehensive list, but rather a set of initial suggestions, which will vary according to the products and technologies that the library may have implemented or is considering for the future. Each new cycle of technology requires technologists to update their mental toolkit. The trend toward cloud computing has become well-established and will reshape the roles of technical personnel in libraries. Those who take a proactive approach to gaining appropriate skills should thrive as libraries increasingly shift their technical infrastructure to cloud-based services.

Publication Year: 2017
Published in: Computers in Libraries, Volume 37, Number 8 (October 2017)
Publisher: Information Today, Medford, NJ
Series: Systems Librarian column
ISSN: 1041-7915