The Virtual Desktop Is Here
Streaming virtual applications to users can greatly reduce IT support costs, but there are some caveats.
Computerworld - Parsons Corp., a $3 billion construction and engineering company based in Pasadena, Calif., once had hundreds of fat clients on the desktops of its engineers. That spelled nothing but trouble for the IT staff. “We had cadres of IT folks who would go around with CDs, and they’d push the user aside and say, ‘Hey, go have a smoke while I download this application,’” says CIO Joe Visconti.
That was only the beginning. “If it was something like AutoCAD, it could take an hour to load, then the IT guy would have to configure it,” he recalls. “Then he’d get a call a few minutes later saying, ‘Hey, this is not running. Help me.’ Then, as soon as there was a patch or new release, someone would go through all the desktops again.”
Keeping track of which users had which versions of an application, which patches they had applied and so on was a nightmare, Visconti says. And if a user needed multiple versions of software for different engineering projects, those versions had to be installed and uninstalled as the user's needs changed.
That was the lay of the land in most IT shops as the century turned, and it’s the way things still are today at many companies. But new models of computing are taking hold as IT looks to reduce the cost and complexity of managing PCs. Among these are the virtualization and streaming of desktop applications, with the goal of moving the management of desktops to the data center, where it can be done more easily, more securely and often more cheaply.
The virtualization and streaming of applications evolved from a long heritage. In the 1970s, dumb terminals connected to mainframes. The big desktop boxes were aptly named; all they did was collect keystrokes and deliver boring green text. Then in the 1980s came minicomputers and PCs, connected in a paradigm-busting arrangement called client/server computing. These desktop machines were far from dumb; they were called “fat clients” because they were fully loaded with processors, memory, disk drives, I/O devices, operating systems and application software.
In the 1990s, things got a bit more complicated. IT managers discovered that still more tiers could bring even better performance, flexibility and scalability. Applications could be broken into presentation, business logic, data access and data storage layers, each residing where it worked best.
At the same time, there was a backlash against the cost and complexity of fat clients, and some IT managers turned to “thin clients” and “network computers,” basically dumb terminals with a grade-school education.
But these days, operating system upgrades, new applications, bug fixes and security patches have escalated in frequency. Users are more likely to install their own applications and even demand that IT install special software for them. Substantial portions of IT staffs travel from desktop to desktop, keeping PCs running properly.
Enter virtualization — which isolates the application from the operating system and other applications — and streaming, which delivers the application to the user.
By moving the management of desktops to the data center, this combination can reduce hundreds of desktop environments to one that’s under lock and key, while giving the user the illusion that he still has a fat client. Or a server can hold multiple desktop images, each tailored to a specific user’s work based on profiles stored in a directory.
Then, when the user needs them, those applications — and sometimes complete operating environments — can be “streamed” over the network to the desktop, where they execute locally, without the server and communications overhead that comes from traditional client/server or thin-client computing. Some products allow the streaming of just those pieces of software actually needed for that session — perhaps just 20% of an application’s code — minimizing the demand for bandwidth, memory and disk.
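The on-demand model described above can be sketched in miniature: a client-side loader that pulls application blocks from the server only when they are first touched, so a session that exercises a fraction of the code streams only that fraction over the network. This is an illustrative toy, not any vendor's actual API; all names here are hypothetical.

```python
# Minimal sketch of on-demand application streaming (illustrative only).
# The app image is split into blocks held server-side; the client fetches
# a block the first time it is needed and caches it locally afterward.

class StreamingLoader:
    def __init__(self, server_blocks):
        self.server_blocks = server_blocks   # full app image, held on the server
        self.cache = {}                      # blocks already streamed to the client

    def fetch(self, block_id):
        """Return a block, streaming it over the network on first use."""
        if block_id not in self.cache:
            # In a real product this would be a network transfer.
            self.cache[block_id] = self.server_blocks[block_id]
        return self.cache[block_id]

    def streamed_fraction(self):
        """What portion of the full app has actually crossed the wire."""
        return len(self.cache) / len(self.server_blocks)


# A session that touches only 2 of the app's 10 blocks streams 20% of it,
# mirroring the "perhaps just 20% of an application's code" figure above.
app = {i: f"code-block-{i}" for i in range(10)}
loader = StreamingLoader(app)
loader.fetch(0)
loader.fetch(3)
print(loader.streamed_fraction())  # 0.2
```

The point of the sketch is the cache check: because unfetched blocks never leave the server, bandwidth, memory and disk use scale with what the session actually runs, not with the installed size of the application.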