The whys and wherefores of virtualization

Welcome to Virtually Everything. This is my first entry, so I thought I would start at the beginning, back when we knew what virtualization was for: it allowed multiple server workloads to be consolidated onto a single physical server. That simplicity quickly changed. This blog will track the latest news and developments in all forms of virtualization and attempt to make sense of the rapid innovation in this space -- helping you navigate the hows, whys and wherefores of virtualization. I hope you enjoy the read. I welcome topics, discussion and feedback in the comments anytime.

In the data center, virtualization is becoming the underlying technology for many different IT needs, from high availability to disaster recovery. That shift takes it from being a way of controlling costs for under-utilized servers to being a strategic part of how IT is managed.

Server virtualization's initial use in consolidating multiple workloads onto a single server had a profound impact on how we think about data centers, but it introduced a number of management challenges. Not least of these was that it became more difficult to identify where all the workloads in the business were actually running. In the past you could point at the physical machine; with virtualization, more than one workload can live on a single machine, and the virtual machine you are interested in may have been moved to another server. The solution to these challenges actually led to a whole new way to manage virtual machine images and to a far more strategic role for server virtualization as the operational infrastructure of the data center.
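To make the tracking problem concrete, here is a minimal sketch of one way an administrator might answer "where is this workload running right now?" It is my own illustration, not anything a vendor ships: it assumes hypervisor hosts that expose the libvirt API, the libvirt-python bindings installed locally, and purely made-up host names.

    # Sketch: build a VM -> host inventory across several hypervisor hosts.
    # Assumes libvirt-python is installed and each host is reachable over
    # ssh; the host names below are hypothetical examples.
    import libvirt

    HOSTS = ["hv01.example.com", "hv02.example.com"]  # illustrative hosts

    def build_inventory(hosts):
        """Return a dict mapping each running VM's name to its current host."""
        inventory = {}
        for host in hosts:
            conn = libvirt.open(f"qemu+ssh://{host}/system")
            try:
                for dom in conn.listAllDomains():
                    if dom.isActive():
                        inventory[dom.name()] = host
            finally:
                conn.close()
        return inventory

    if __name__ == "__main__":
        for vm, host in sorted(build_inventory(HOSTS).items()):
            print(f"{vm} is running on {host}")

Run against a fresh snapshot each time, a report like this is only ever a point-in-time answer -- which is exactly why live migration made VM inventory a management discipline rather than a sticky note on a rack.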

From a client computing perspective, virtualization is also changing how we think about delivering desktops to users, and, much as server virtualization's focus changed, desktop virtualization is going through a rapid evolution of its own.

Early desktop virtualization implementations focused solely on hosting multiple client operating systems on server-side hypervisors. This was great where the requirement was to support small groups of remote users -- keeping the client image in the data center drastically reduced the need for desk-side support. For the more typical corporate user, though, it had little benefit. Our story might have ended there but for one of the fundamental characteristics of virtualization: its ability to isolate software components and keep them separate.

We have long understood that a PC image consists of three fundamentally different types of data: operating system, applications and user environment. We were also aware that we could not manage each in isolation, as we would wish, because they would each modify the others. This is where virtualization's ability to isolate comes in: by separating the operating system, the applications and the user environment from one another, it lets us manage each in the most appropriate way. Ultimately, this will deliver a far more manageable PC platform, something that has been needed for a long time.
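As a toy illustration of that layering -- my own sketch, not how any particular product implements it -- think of the three layers as independently versioned components that are composed into an effective desktop at logon, with later layers overriding earlier ones where they overlap:

    # Toy model: a desktop composed from three independently managed layers.
    # Layer names and contents are illustrative, not from any real product.
    OS_LAYER = {"os.version": "Windows 7 SP1", "shell": "explorer.exe"}
    APP_LAYER = {"app.office": "14.0", "app.browser": "9.0"}
    USER_LAYER = {"wallpaper": "beach.jpg", "app.browser.homepage": "intranet"}

    def compose(*layers):
        """Merge layers in order; later layers win on conflicting keys."""
        desktop = {}
        for layer in layers:
            desktop.update(layer)
        return desktop

    # Each layer can be patched, upgraded or rolled back on its own;
    # the user's effective desktop is simply the composition at logon.
    print(compose(OS_LAYER, APP_LAYER, USER_LAYER))

The point of the sketch is the management win: replace the OS layer and the applications and user settings come through untouched, which is precisely what the old monolithic PC image could never give us.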

Next up ... understanding changes to the operating system.

Martin Ingram is the VP of strategy at AppSense, a leading provider of user virtualization technology to enterprise organizations.
