Data center density hits the wall
Why the era of packing more servers into the same space may have to end.
Computerworld - Industrial Light & Magic has been replacing its servers with the hottest new IBM BladeCenters -- literally, the hottest. Each new rack ILM brings in lets it retire older hardware, cutting power use in the data center by 140 kilowatts -- a staggering 84% drop in energy use for the workloads those racks handle.
But power density in the new racks is much higher: Each consumes 28 kW of electricity, versus 24 kW for the previous generation. Every watt of power consumed is transformed into heat that must be removed from each rack -- and from the data center.
The new racks are equipped with 84 server blades, each with two quad-core processors and 32GB of RAM. They are powerful enough to displace seven racks of older BladeCenter servers that the special-effects company purchased about three years ago for its image-processing farm.
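The savings quoted above follow from the consolidation ratio: seven old racks at 24 kW each give way to one new rack at 28 kW. A quick sketch of that arithmetic (all figures are from the article; only the variable names are mine):

```python
# One new 28 kW rack displaces seven 24 kW racks.
old_racks = 7
old_rack_kw = 24.0   # previous-generation BladeCenter rack
new_rack_kw = 28.0   # new BladeCenter rack

old_total_kw = old_racks * old_rack_kw      # 168 kW before consolidation
saved_kw = old_total_kw - new_rack_kw       # 140 kW saved per swap
pct_drop = 100 * saved_kw / old_total_kw    # ~83-84%, depending on rounding

print(f"Saved: {saved_kw:.0f} kW ({pct_drop:.0f}% drop)")
```

The 83% computed here lands within rounding distance of the article's 84%, which suggests the published percentage was derived from slightly more precise per-rack figures.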
To cool each new 42U rack, ILM's air conditioning system must remove more heat than would be produced by nine household ovens running at the highest temperature setting.
These days, most new data centers have been designed to support an average density of 100 to 200 watts per square foot, and the typical cabinet is about 4 kW, says Peter Gross, vice president and general manager of Hewlett-Packard Co.'s Critical Facilities Services. A data center designed for 200 watts per square foot can support an average rack density of about 5 kW. With carefully engineered airflow optimizations, a room air conditioning system can support some racks at up to 25 kW, he says.
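Gross's two figures -- 200 watts per square foot of floor space and about 5 kW per rack -- are consistent if each rack, together with its share of aisle and service clearance, occupies roughly 25 square feet. That footprint is my assumption for illustration, not a number from the article:

```python
# Relate floor-space design density to average supportable rack power.
watts_per_sqft = 200   # design density cited by Gross
sqft_per_rack = 25     # assumed footprint per rack incl. aisles (hypothetical)

avg_rack_kw = watts_per_sqft * sqft_per_rack / 1000
print(f"Average supportable rack load: {avg_rack_kw:.0f} kW")
```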
At 28 kW per rack, ILM is at the upper limit of what can be cooled with today's computer room air conditioning systems, says Roger Schmidt, an IBM fellow and chief engineer for data center efficiency. "You're hitting the extreme at 30 kW. It would be a struggle to go a whole lot further," he says.
Is This Sustainable?
The question is, what happens next? "In the future, are watts going up so high that clients can't put that box anywhere in their data centers and cope with the power and cooling? We're wrestling with that now," Schmidt says. High-density computing beyond 30 kW will have to rely on water-based cooling, he says. But other experts say that data center economics may make it cheaper for many organizations to spread out servers rather than concentrate them in racks with ever-higher energy densities.
Kevin Clark, director of information technologies at ILM, likes the gains in processing power and energy efficiency he has achieved with the new BladeCenters, which follow the industry trend of delivering more bang for the buck. According to IDC, the average server price has dropped 18% since 2004, while the cost per core has dropped by 70%, to $715.
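IDC's cost-per-core figure implies a 2004 baseline that the article does not state; backing it out is straightforward arithmetic (the $715 and 70% are from IDC as quoted above, the derived 2004 figure is my calculation):

```python
# Back out the implied 2004 cost per core from IDC's figures.
cost_per_core_now = 715.0   # dollars per core, per IDC
drop = 0.70                 # 70% decline since 2004

cost_per_core_2004 = cost_per_core_now / (1 - drop)
print(f"Implied 2004 cost per core: ${cost_per_core_2004:,.0f}")
```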
But Clark wonders whether continually doubling compute density is sustainable. "If you double the density on our current infrastructure, from a cooling perspective, it's going to be difficult to manage," he says.
He's not the only one who's concerned. For more than 40 years, the computer industry's business model has been built on the assumption that Moore's Law will prevail and that compute density will double every two years in perpetuity. Now some engineers and data center designers question whether that's feasible -- and whether a threshold has been reached.
The threshold isn't just about whether chip makers can overcome the technical challenges of packing transistors even more densely, but whether it will be economical to run large numbers of extremely high density server racks in modern data centers.