Written by Bradley Morgan

Founder, Eight Bit Software Co.



Introduction

 

Computer networking has profoundly impacted the business world. Barriers that existed between companies for most of the 20th century are giving way to the partnership-based opportunities afforded by e-business. As the need for interoperability and flexibility increases, IT administrators are scrambling to clean up the debris from the patchwork client-server systems that were pieced together hastily in the storm of the 1980s and 90s. The push for “universal computing” has begun, and much work has been done in the last few years to promote this new initiative. There have been breakthroughs in technologies such as distributed processing (a.k.a. grid computing), “plug-and-play” or “blade” server farms, network-attached storage devices, and many others that support the need for more adaptable technology. In this same mold, the rise of virtualization within this new IT model will drastically change the way information systems are run, allowing a degree of flexibility in both hardware and software systems that the industry has never seen. Through the adoption of virtualization, the CIO will be able to empower the enterprise with systems that adapt quickly to business demands and new opportunities, a capability that will prove invaluable in today’s volatile commercial landscape.

 

The roots of virtualization are best seen in the computer “time-sharing” practices of the late 1950s and early 1960s. Time-sharing[1] was necessary in these early computing environments because the hardware was extremely expensive. It was not practical to dedicate a computer system to a single user, so schemes for dividing the resources among many users were developed. These schemes often relied on “executive programming,” which employed a combination of software and hardware to determine (based on a specified time interval) which user would receive the attention of the central processing unit at a particular moment (Popell, 1966). As we will see, this process is similar to what we know today as virtualization, in that a layer of abstraction is created in order to logically assign the use of a computer asset[2]. A related concept, multiprogramming[3], also garnered much attention in this time frame.
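To make the “executive” idea concrete, the following sketch shows a minimal round-robin scheduler in Python. It is purely illustrative and not drawn from any historical system; the user names, time slice, and workloads are invented for the example.

from collections import deque

# Minimal sketch of a time-sharing "executive": a single simulated CPU is
# granted to each user's job for a fixed time slice, round-robin style.
TIME_SLICE = 3  # simulated work units per turn


def run_executive(jobs):
    """jobs: dict mapping user name -> total work units required."""
    queue = deque(jobs.items())
    clock = 0
    while queue:
        user, remaining = queue.popleft()
        burst = min(TIME_SLICE, remaining)
        clock += burst
        print(f"t={clock:3d}  CPU given to {user} for {burst} unit(s)")
        if remaining - burst > 0:
            queue.append((user, remaining - burst))  # back of the line


if __name__ == "__main__":
    run_executive({"alice": 5, "bob": 8, "carol": 2})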

 

In the mid-1960s, IBM announced the System/360 Model 67, which brought “virtual memory” to its product line: a technique in which disk space is used to extend the apparent main memory of a machine. Later, in the 1970s, the “virtual machine” emerged, allowing an entire system (software and hardware) to be emulated within a contained environment. The virtual machine is perhaps the first form of virtualization as we know it today, and these last two developments clearly influenced the nomenclature of what we now call virtualization. With Intel’s recent release of its “Vanderpool” technology (and AMD’s subsequent “Pacifica” technology), which builds support for server virtualization directly into the processor hardware, the concept has taken one more step toward becoming a high-impact, common practice in the enterprise.
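As a rough illustration of the virtual-memory concept described above, the toy Python model below keeps only a few “pages” in simulated RAM and spills the rest to a dictionary standing in for disk, swapping a page back in when it is referenced. The frame count and page contents are hypothetical; this is not how any particular system implements paging.

from collections import OrderedDict

# Toy model of virtual memory: a small "RAM" holds a few pages; everything
# else lives on simulated "disk" and is swapped in when referenced.
RAM_FRAMES = 2


class ToyVirtualMemory:
    def __init__(self, pages):
        self.disk = dict(pages)          # page number -> contents
        self.ram = OrderedDict()         # pages currently resident

    def read(self, page):
        if page not in self.ram:         # page fault: bring the page in from disk
            if len(self.ram) >= RAM_FRAMES:
                evicted, contents = self.ram.popitem(last=False)
                self.disk[evicted] = contents   # write the victim page back out
            self.ram[page] = self.disk[page]
            print(f"page fault on {page}")
        self.ram.move_to_end(page)       # mark as most recently used
        return self.ram[page]


if __name__ == "__main__":
    vm = ToyVirtualMemory({0: "a", 1: "b", 2: "c"})
    for p in [0, 1, 0, 2, 1]:
        vm.read(p)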

 

 

 

Defining Virtualization

 

Because of the wide variety of ways in which the technology is deployed, it is difficult to define virtualization in a general manner. Singh (2005), in his article An Introduction to Virtualization, attempts to do so, stating that virtualization is “a framework or methodology of dividing the resources of a computer into multiple execution environments, by applying one or more concepts or technologies such as hardware and software partitioning, time-sharing, partial or complete machine simulation, emulation, quality of service, and many others.” However, the author concedes that this definition leaves out cases in which computer resources are combined rather than separated, such as storage networks, and it does not successfully pin down the specific sub-technologies used. In order to find the common thread that ties all virtualization concepts together, it is first necessary to look at the various practices in use today.

 

Server Virtualization

Server virtualization enables multiple virtual operating systems to run on a single physical machine, yet remain logically distinct, each with a consistent hardware profile (Burry & Nelson, 2004). Working in the opposite direction, server virtualization can also take the place of the costly practice of manual server consolidation by combining many physical servers into one logical server. "The idea is to present the illusion of one huge machine that's infinitely powerful, reliable, robust and manageable - whether it's one machine that looks like many, or multiple machines tied together to look like a single system" (Brandel, 2004). The focus of server virtualization is on maximizing the efficiency of server hardware in order to increase the return on that investment.

 

Figure 1: Server Virtualization
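As a rough sketch of the one-machine-to-many direction of server virtualization, the Python model below carves a single physical host’s CPU and memory into logically distinct virtual machines, each with its own fixed hardware profile. The host capacities and VM names are hypothetical; a real hypervisor does far more (scheduling, device emulation, isolation), but the resource-accounting idea is the same.

# Rough sketch of carving one physical host into several logically distinct
# virtual machines, each with its own fixed hardware profile.
class PhysicalHost:
    def __init__(self, cpus, ram_gb):
        self.cpus, self.ram_gb = cpus, ram_gb
        self.vms = {}

    def create_vm(self, name, cpus, ram_gb):
        used_cpus = sum(v["cpus"] for v in self.vms.values())
        used_ram = sum(v["ram_gb"] for v in self.vms.values())
        if used_cpus + cpus > self.cpus or used_ram + ram_gb > self.ram_gb:
            raise RuntimeError(f"not enough physical capacity for {name}")
        self.vms[name] = {"cpus": cpus, "ram_gb": ram_gb}
        print(f"{name}: {cpus} vCPU / {ram_gb} GB carved from the host")


if __name__ == "__main__":
    host = PhysicalHost(cpus=8, ram_gb=32)       # hypothetical hardware
    host.create_vm("web-vm", cpus=2, ram_gb=8)
    host.create_vm("db-vm", cpus=4, ram_gb=16)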

 

Operating System Virtualization

Through virtualization of operating systems, a single computer can accommodate multiple platforms and run them simultaneously. This may sound like the server virtualization described above, but server virtualization alone does not necessarily provide the ability to run multiple platforms on a single server. In contrast, the goal of OS virtualization is focused more on flexibility than on efficiency. OS virtualization can be used to facilitate what is known as universal computing, where software and hardware work together seamlessly regardless of the architecture or language for which they are designed[4].

Figure 2: Operating System Virtualization and Application Virtualization

Application Virtualization

While most of the prevalent virtualization strategies focus on hardware infrastructure, an important and often overlooked method is application virtualization. With application virtualization (also commonly referred to as service virtualization), end-user software is “packaged,” stored, and distributed in an on-demand fashion across a network. This strategy goes hand in hand with the standardized web services initiative that is making waves in the IT industry today. Virtualized applications use a common abstraction layer, which defines a protocol that allows them to communicate with one another in a standard messaging format. Thus, applications can invoke one another to perform requested functions. A virtualized application is not only capable of remotely invoking requests and returning results, but also of ensuring that the application’s state and other data are available and consistent on all resource nodes executing the application across a grid (DataSynapse, 2005).
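A minimal sketch of the “common abstraction layer” idea follows, assuming a simple JSON message format invented for this example rather than any particular web-services standard: each virtualized application registers a handler with the layer, and callers invoke it by service name instead of addressing a specific machine.

import json

# Minimal sketch of an abstraction layer through which virtualized
# applications invoke one another using a standard message format.
_registry = {}   # service name -> handler function


def register(name, handler):
    _registry[name] = handler


def invoke(message_json):
    """Route a JSON request {"service": ..., "args": {...}} to its handler."""
    message = json.loads(message_json)
    result = _registry[message["service"]](**message["args"])
    return json.dumps({"service": message["service"], "result": result})


if __name__ == "__main__":
    register("tax", lambda amount, rate: round(amount * rate, 2))
    request = json.dumps({"service": "tax", "args": {"amount": 100.0, "rate": 0.07}})
    print(invoke(request))   # {"service": "tax", "result": 7.0}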

 

Storage Virtualization

Perhaps the most widely deployed and highly regarded virtualization practice, storage virtualization allows separate storage devices to be combined into what is perceived as a single unit. Storage virtualization attempts to maximize the efficiency of the storage devices in an information architecture.

Figure 3: Storage Virtualization
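The toy Python sketch below illustrates the combining direction of storage virtualization: several backing “devices” of different sizes are concatenated into one logical volume, and a logical block address is translated behind the scenes to the device that actually holds it. The device names and sizes are hypothetical.

# Toy sketch of storage virtualization: several physical "devices" are
# concatenated into one logical volume, and logical block addresses are
# translated to (device, local block) pairs behind the scenes.
class LogicalVolume:
    def __init__(self, devices):
        self.devices = devices            # list of (name, size_in_blocks)
        self.total_blocks = sum(size for _, size in devices)

    def locate(self, logical_block):
        if not 0 <= logical_block < self.total_blocks:
            raise IndexError("block outside logical volume")
        offset = logical_block
        for name, size in self.devices:
            if offset < size:
                return name, offset       # device actually holding this block
            offset -= size


if __name__ == "__main__":
    volume = LogicalVolume([("disk-a", 100), ("disk-b", 250), ("disk-c", 50)])
    print(volume.total_blocks)            # 400 blocks presented as one unit
    print(volume.locate(120))             # ('disk-b', 20)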

 

Data / Database Virtualization

Data virtualization allows users to access data from various, disparately located sources without knowing or caring where the data actually resides (Broughton). Database virtualization allows multiple instances of a DBMS, or different DBMS platforms, to be used simultaneously and transparently, regardless of their physical location. These practices are often employed in data mining and data warehousing systems.

Figure 4: Data / Database Virtualization
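A minimal sketch of data virtualization follows, assuming two in-memory “sources” invented for the example: a thin facade routes each query by table name, so the caller never learns where the data actually resides.

# Minimal sketch of data virtualization: a facade routes queries by table
# name to whichever backing source actually holds the data.
class DataFacade:
    def __init__(self):
        self.sources = {}      # table name -> (source label, rows)

    def attach(self, source_label, tables):
        for table, rows in tables.items():
            self.sources[table] = (source_label, rows)

    def query(self, table):
        _, rows = self.sources[table]   # caller never sees the source label
        return rows


if __name__ == "__main__":
    facade = DataFacade()
    facade.attach("warehouse-dbms", {"orders": [{"id": 1, "total": 42.0}]})
    facade.attach("legacy-dbms", {"customers": [{"id": 7, "name": "Ada"}]})
    print(facade.query("orders"))       # the location of the data is transparent
    print(facade.query("customers"))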

 

Network Virtualization

By virtualizing a network, multiple networks can be combined into a single network, or a single network can be logically separated into multiple parts. A common practice is to create virtual LANs, or VLANs, in order to manage a network more effectively.

Figure 5: Network Virtualization
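As a simple illustration of logically splitting one physical network, the sketch below assigns switch ports to VLAN IDs and checks whether two ports may exchange traffic. The port names and VLAN numbers are invented for the example.

# Simple sketch of splitting one physical switch into virtual LANs:
# ports on different VLANs cannot exchange traffic even though they share
# the same physical hardware.
vlan_membership = {
    "port-1": 10,   # finance VLAN (hypothetical)
    "port-2": 10,
    "port-3": 20,   # engineering VLAN (hypothetical)
    "port-4": 20,
}


def can_communicate(port_a, port_b):
    return vlan_membership[port_a] == vlan_membership[port_b]


if __name__ == "__main__":
    print(can_communicate("port-1", "port-2"))   # True  (same VLAN)
    print(can_communicate("port-1", "port-3"))   # False (logically separated)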

 

The above descriptions all contain some allusion to either “combining” or “multiplying” a computer asset. By inserting abstraction layers between hardware and/or software components, more control can be exercised over one or both of the separated assets. This middle layer is usually software that allows the assets to be manipulated through logical separation or combination. Therefore, this paper will define virtualization as the process of applying a software layer of abstraction between computing resources in order to logically separate or combine them.

 

The Impact of Virtualization


Adaptability is an increasing focus of modern business management. With new opportunities and threats always lurking on the horizon, businesses must be able to react to their dynamic environment quickly, efficiently, and effectively. With regard to IT infrastructure, virtualization is perhaps the most effective tool for facilitating this adaptability. In virtualized systems, technical resources can be expanded or reduced seamlessly. Because physical devices and applications are represented logically in a virtual environment, administrators can manipulate them with more flexibility and with fewer detrimental effects than in the physical environment. Through the use of virtualization “tools[5],” workloads can be dynamically provisioned across servers, storage usage can be adjusted, and, should a problem occur, administrators can easily roll back to a working configuration. In general, the addition (or removal) of hardware can be easily managed with virtualization tools.
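To make the rollback idea concrete, the following sketch (purely illustrative and not tied to any vendor’s tool) snapshots a virtual machine’s logical configuration before a change and restores it if the change causes problems.

import copy

# Illustrative sketch of configuration rollback in a virtualized environment:
# because a VM's "hardware" is just data, a known-good copy can be kept and
# restored if a change causes problems.
vm_config = {"name": "app-vm", "vcpus": 2, "ram_gb": 8, "network": "vlan-10"}

snapshot = copy.deepcopy(vm_config)      # capture the working configuration

vm_config["vcpus"] = 8                   # attempt a change
vm_config["ram_gb"] = 64

change_caused_problems = True            # e.g., the host could not honor the request
if change_caused_problems:
    vm_config = copy.deepcopy(snapshot)  # roll back to the known-good state

print(vm_config)                         # {'name': 'app-vm', 'vcpus': 2, ...}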

 

The deployment of new applications across the enterprise is easily performed through varied combinations of application, operating system, and server virtualization. Within virtualization-induced “containers,” applications can be isolated from the hardware and from one another, preventing the configuration conflicts that often complicate their introduction into IT systems.

 

Increased demand for data or database capabilities can be met easily with data and database virtualization, by using virtualization tools to manage new DBMSs or the underlying physical infrastructure. All of these examples illustrate the adaptable nature of the virtual enterprise.

 

In addition to adaptability, the CIO can lower operating costs by implementing virtualization within his or her infrastructure. A great deal of efficiency is inherent in this type of system, because much of its focus is on optimizing the use of resources, thus reducing maintenance overhead. Any element of the current infrastructure can be leveraged more fully with virtualization. Switching costs for new operating systems or applications are lowered by the ability to install and implement them more flexibly. The consolidation of servers and storage obviously increases the return on investment in that hardware by maximizing its efficiency.

 

Lowering costs will enable the CIO to reallocate the IT budget toward initiatives not related to the maintenance of current systems, such as research and development, partnerships, and the alignment of IT with business strategy. The case of Welch Foods illustrates this well. Through virtualization, Welch’s IT management was able to increase server utilization from a range of 5 to 10 percent to a range of 50 to 60 percent, and to increase the ratio of servers per manager to 70-to-1[6], thus reducing expensive labor costs for day-to-day operations (Connor, 2005).
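A back-of-envelope reading of those ratios, using a hypothetical fleet size chosen only so the division comes out evenly, shows where the labor savings come from:

# Back-of-envelope illustration of the Welch figures. The fleet of 210
# managed servers is hypothetical; the 30-to-1 and 70-to-1 ratios come
# from the article (Connor, 2005).
servers = 210
industry_ratio = 30            # servers per manager, industry average
post_virtualization_ratio = 70 # servers per manager after virtualization

admins_before = servers // industry_ratio              # 7 administrators
admins_after = servers // post_virtualization_ratio    # 3 administrators
print(admins_before, admins_after)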

 

IT managers will be able to increase the productivity of employees across the entire organization through a properly implemented virtualization system. Businesses that rely on in-house application development will see gains in productivity and ease of implementation. Developers within a platform-virtualized environment can program in the languages with which they are most proficient. Debugging and testing applications becomes second nature with the ability to create contained virtual environments; application and systems testing can be performed on a single workstation running a variety of virtual machines, without the need to transfer code to, and debug it on, external computers. Enterprise-wide testing can be performed in isolated virtual machines that do not interact with or compromise the resources actually being used on the network. Users in a virtualized environment do not know or care how their use of IT resources is being optimized. They are able to access the information they need and perform their work effectively and simply, without regard to the complexities behind the scenes.

 

In virtualized environments in which resource segmentation takes place, security can increase because of the added complexity facing attackers who are not familiar with the configuration of the system they wish to compromise. For example, with application virtualization, virtual applications can run on multiple servers, preventing attackers from determining which physical resource has been compromised (Lindstrom, 2004). Virtual machines, which emulate hardware systems, add a similar layer of confusion for would-be attackers: “It’s hard to accomplish much by cracking a system that doesn’t exist” (Yager, 2004). Another security benefit brought about by virtualization relates to disaster recovery. In virtualized server systems, it is not necessary to maintain identical configurations on backup servers, as it is with non-virtualized systems. Because the virtualization layer separates the hardware from the operating system environment, a lost server can be restored on a machine with a different hardware configuration. It is also possible to back up several servers to one secondary server, providing a less expensive method for high availability and disaster recovery.

 

With any benefit there is always associated risk, and virtualization is no exception. The first problem IT managers must be aware of arises in the planning and implementation of virtualization. CIOs and their staff must decide whether virtualization is in fact right for their organization. The short-term costs of an ambitious virtualization project can be high, given the need for new infrastructure and the reconfiguration of current hardware. In businesses where cost reduction and IT flexibility are not currently aligned with the business strategy, other initiatives will be better suited. That is not to say that virtualization is wrong for such environments; nearly any organization can reap the benefits of a properly planned virtualization initiative. It is the timing and scope of such initiatives that must be scrutinized.

 

Another risk associated with virtualization arises in businesses that do not have adequate redundancy in their systems. Because virtualization often converges resources, especially in server virtualization, the physical failure of one piece of hardware will affect every virtual element it hosts. It is therefore necessary to ensure that backup systems are in place to deal with such failures. Fortunately, because of the isolation inherent in virtualized systems, backup processes can be greatly simplified.

 

A final problem that can occur in virtualized systems is increased overhead. The software layers inserted between resources consume processor cycles, sometimes reaching double-digit percentages: users and vendors report overhead ranging from 2% or 3% to as high as 20%, depending on the product and application (Mitchell, 2005). However, when properly implemented, the efficiency that virtualization brings to hardware infrastructure should outweigh any problems associated with this overhead.
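A rough numerical illustration, with figures invented for the example, of why consolidation gains can absorb that overhead: even a pessimistic 20% overhead leaves a single well-utilized host with more usable capacity than several lightly loaded dedicated servers were actually delivering.

# Rough, hypothetical illustration of why consolidation gains can absorb
# virtualization overhead. All figures are invented for the example.
dedicated_servers = 5
utilization_per_server = 0.10      # 10% average utilization before consolidation
useful_work_before = dedicated_servers * utilization_per_server   # 0.5 "servers" of work

overhead = 0.20                    # pessimistic 20% virtualization overhead
host_capacity_after_overhead = 1 * (1 - overhead)                 # 0.8 "servers" of capacity

print(useful_work_before <= host_capacity_after_overhead)   # True: one host suffices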

 

With the positives far outweighing the negatives, virtualization is a technology that will soon be a universal practice. “Ultimately, virtualization will become just a standard layer of the infrastructure stack” (Mitchell, 2005). As the costs of virtualization technology decline, and as more hardware manufacturers such as Intel and AMD build virtualization functionality into their products, it will become increasingly difficult to justify not using virtualization in an IT system. The unmatched effectiveness of virtualization in providing adaptability and reducing costs will empower IT managers and position their organizations for growth. Because of the inevitable incorporation of virtualization into the standard architecture stack, CIOs of all types of businesses should begin sketching their path to a virtualized future.


References

 

Brandel, Mary. (2004). Wired over server virtualization. Network World Fusion. Retrieved March 24, 2005 from http://www.nwfusion.com.

Broughton, Eric. Periscope: Access to Enterprise Data. Retrieved March 24, 2005 from http://www.tusc.com.

Burry, Christopher M. and Nelson, Craig. (2004). Plan on server virtualization. Computerworld. Retrieved March 24, 2005 from http://www.computerworld.com.

Connor, Deni. (2005). Welch’s reaps benefits from server virtualization. Network World Fusion. Retrieved April 15, 2005 from http://www.nwfusion.com.

Lindstrom, Pete. (2004). Security That’s Virtually There. Information Security. Retrieved April 20, 2005 from http://infosecuritymag.com.

Mitchell, Robert L. (2005). Ghosts in the Machine. Computerworld. Retrieved April 25, 2005 from http://www.computerworld.com.

Popell, Steven D. (1966). Computer Time Sharing: Dynamic Information Handling for Business. Englewood Cliffs, N.J.: Prentice Hall, Inc.

Singh, Amit. (2005). An Introduction to Virtualization. Retrieved March 14, 2005 from http://www.kernelthread.com.

Virtualization: The Transformation of Enterprise IT. (2005). DataSynapse. Retrieved March 24, 2005 from http://www.datasynapse.com.

Yager, Tom. (2004). The reality of virtual servers. InfoWorld. Retrieved April 15, 2005 from http://www.infoworld.com.

 

 

 



[1] Two major perspectives of computer “time-sharing” exist.  The one illustrated in this text is more user-oriented, while the other is more closely related to what is known today as “multitasking.”  However, either perspective can be used to illustrate early concepts of virtualization.

[2] The process of disk partitioning, a practice that has been used for many years, illustrates a simple form of virtualization.  In this process, a physical device (hard drive) is logically separated into many devices and then abstracted from the drive-controller via software.

[3] Multiprogramming is quite similar to time-sharing; however, it is based on idle CPU cycles rather than specified time intervals.

[4] OS virtualization can, however, also increase efficiency for enterprises that employ highly heterogeneous information architectures.

[5] Administrative software that resides in the virtualization layer.

[6] The industry average of servers per manager is 30-to-1 (Connor, 2005).