
Planning for virtualization? Beware of server overload

Vendors claim you can pack dozens of virtual machines inside one physical server. But that's a bad idea for heavy-duty applications.

By Sandra Gittlen
February 8, 2010 06:00 AM ET

Computerworld - As virtualization stretches deeper into the enterprise to include mission-critical and resource-intensive applications, IT executives are learning that double-digit physical-to-virtual server ratios are a thing of the past.

Virtualization vendors may still be touting the potential of putting 20, 50 or even 100 virtual machines (VMs) on a single physical machine. But IT managers and industry experts say those ratios are dangerous in production environments and can cause performance problems or, worse, outages.

"In test and development environments, companies could put upwards of 50 virtual machines on a single physical host. But when it comes to mission-critical and resource-intensive applications, that number tends to plummet to less than 15," says Andi Mann, vice president of research at Enterprise Management Associates Inc. in Boulder, Colo.

In a 2009 study of 153 organizations, each with more than 500 end users, EMA found that enterprises were achieving, on average, 6:1 consolidation ratios for applications such as ERP, CRM, e-mail and databases.

The gap between expectations and reality, whether it's due to vendor hype or internal ROI issues, could spell trouble for IT teams. That's because the consolidation rate affects just about every aspect of a virtualization project -- budget, capacity and executive buy-in. "If you go into these virtualization projects with a false expectation, you're going to get in trouble," Mann says.

Indeed, overestimating physical-to-virtual ratios can leave a company needing more server hardware, rack space, cooling and power -- all of which cost money. Worse yet, users could be affected by poorly performing applications. "If a company thinks they're only going to need 10 servers at the end of a virtualization project and they actually need 15, it could have a significant impact on the overall cost of the consolidation and put them in the hole financially. Not a good thing, especially in this economy," says Charles King, president and principal analyst at consultancy Pund-IT Inc. in Hayward, Calif.
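The arithmetic King describes is simple to model. The sketch below (not from the article) compares a budget built on an optimistic consolidation ratio against EMA's reported 6:1 average; the 90-workload count, the 9:1 planned ratio and the per-host cost are assumptions chosen purely for illustration.

```python
import math

def hosts_needed(vm_count: int, vms_per_host: int) -> int:
    """Physical hosts required to run vm_count VMs at a given consolidation ratio."""
    return math.ceil(vm_count / vms_per_host)

# Illustrative figures: 90 workloads to virtualize, a planned 9:1 ratio
# versus the 6:1 average EMA reported for resource-intensive applications.
VM_COUNT = 90
PLANNED_RATIO = 9       # assumption: the ratio used in the project budget
REALISTIC_RATIO = 6     # EMA's reported average for ERP, CRM, e-mail, databases
COST_PER_HOST = 7_500   # assumption: hardware, rack space, power and cooling per host, USD

planned = hosts_needed(VM_COUNT, PLANNED_RATIO)      # 10 hosts
realistic = hosts_needed(VM_COUNT, REALISTIC_RATIO)  # 15 hosts
shortfall = realistic - planned

print(f"Budgeted for {planned} hosts, actually need {realistic}")
print(f"Unplanned spend: ~${shortfall * COST_PER_HOST:,}")
```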

Why is there a disconnect between virtualization expectations and reality? King says that up to this point, many companies have focused on virtualizing low-end, low-use, low-I/O applications such as test, development, log, file and print servers. "When it comes to edge-of-network, non-mission-critical applications that don't require high availability, you can stack dozens on a single machine," he says.

Bob Gill, an analyst at TheInfoPro Inc., agrees. "Early on, people were virtualizing systems that had a less-than-5% utilization rate. These were the applications that, if they went down for an hour, no one got upset," he says.
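A rough way to see why such lightly used systems stack so well is to divide the share of a host's capacity reserved for guests by each VM's average demand. The sketch below is illustrative only: the headroom and utilization percentages are assumptions, and it ignores memory, I/O and peak load, which real capacity-planning tools must account for.

```python
def max_vms_per_host(guest_headroom_pct: float, avg_vm_cpu_pct: float) -> int:
    """Crude density ceiling: how many VMs' average CPU demand fits within
    the slice of host capacity reserved for guest workloads.
    Ignores memory, I/O and peak load, so treat the result as an upper bound."""
    return int(guest_headroom_pct // avg_vm_cpu_pct)

# Assumed figures for illustration: 90% of the host is left for guests.
print(max_vms_per_host(90, 3))   # ~30 lightly used servers (3% average CPU)
print(max_vms_per_host(90, 30))  # only 3 resource-intensive workloads (30% average CPU)
```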

That's not the case when applying virtualization to mission-critical, resource-intensive applications -- and virtualization vendors, on the whole, have been slow to explain this reality to customers, according to some analysts.


