Data center density hits the wall

Why the era of packing more servers into the same space may have to end

Page 6 of 6

But Belady says running data center gear even hotter than 81 degrees Fahrenheit could yield enormous efficiency gains.

"Once you start going to higher temperatures, you open up new opportunities to use outside air and you can eliminate a lot of the chillers ... but you can't go as dense," he says. Some parts of the country already turn off chillers in the winter and use economizers, which use outside air and air-to-air or air-to-water heat exchangers, to provide "free cooling" to the data center.

If IT equipment could operate at 95 degrees, most data centers in the U.S. could be cooled with air-side economizers almost year-round, he argues. And, he adds, "if I could operate at 120 degrees ... I could run anywhere in the world with no air conditioning requirements. That would completely change the game if we thought of it this way." Unfortunately, there are a few roadblocks to getting there. (See "The case for, and against, running servers hotter.")
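The economics Belady describes come down to a simple count: for a given server inlet-temperature limit, how many hours of the year is the outside air cool enough that the chillers can stay off? The sketch below illustrates that arithmetic with a purely synthetic temperature profile for a hypothetical temperate-climate site; the data, the sinusoidal model, and the function names are illustrative assumptions, not figures from the article.

```python
import math

def free_cooling_fraction(hourly_temps_f, inlet_limit_f):
    """Fraction of hours when outside air alone can cool the gear,
    i.e. the outdoor temperature is at or below the allowed inlet limit."""
    usable = sum(1 for t in hourly_temps_f if t <= inlet_limit_f)
    return usable / len(hourly_temps_f)

# Toy year of hourly dry-bulb temperatures (degrees F): a crude seasonal
# swing plus a daily swing, roughly 20 F to 90 F. Illustrative only.
year = [55
        + 25 * math.sin(2 * math.pi * h / 8760)   # seasonal cycle
        + 10 * math.sin(2 * math.pi * h / 24)     # daily cycle
        for h in range(8760)]

for limit in (81, 95, 120):
    print(limit, round(free_cooling_fraction(year, limit), 2))
```

With this toy profile, a 95-degree limit already covers every hour of the year, echoing Belady's point that raising allowable inlet temperatures turns economizer cooling from a winter-only trick into a near-year-round strategy.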

Belady wants equipment to be tougher, but he also thinks servers are more resilient than most administrators realize. He believes that the industry needs to rethink the kinds of highly controlled environments in which distributed computing systems are hosted today.

The ideal strategy, he says, is to develop systems that optimize each rack for a specific power density and manage workloads to ensure that each cabinet hits that number all the time. In this way, both power and cooling resources would be used efficiently, with no waste from under- or overutilization. "If you don't utilize your infrastructure, that's actually a bigger problem from a sustainability standpoint than overutilization," he says.
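The rack-optimization strategy above is, at bottom, a placement problem: distribute workloads so that every cabinet runs close to its designed power density, with none stranded below it and none pushed over it. The sketch below shows one simple greedy heuristic for that idea; the function, its names, and the failure behavior are assumptions for illustration, not Belady's or Microsoft's actual method.

```python
def place_workloads(workload_watts, rack_count, target_watts):
    """Greedily assign each workload to the rack with the most remaining
    headroom, keeping every rack's total draw near (and under) its target
    power density. Raises if the workloads cannot fit within the targets."""
    racks = [[] for _ in range(rack_count)]   # workloads placed per rack
    loads = [0] * rack_count                  # running total draw per rack
    for w in sorted(workload_watts, reverse=True):  # place biggest first
        i = min(range(rack_count), key=lambda r: loads[r])  # most headroom
        if loads[i] + w > target_watts:
            raise ValueError("capacity exceeded; add racks or raise target")
        racks[i].append(w)
        loads[i] += w
    return racks, loads

# Example: six workloads across two racks budgeted at 800 W each.
racks, loads = place_workloads([400, 300, 300, 200, 200, 100], 2, 800)
print(loads)  # both racks end up near the 800 W target
```

Placing the largest workloads first is the standard trick for this kind of balancing: it leaves the small, flexible items for last, where they can top off whichever rack is furthest from its target.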

What's next

Belady sees a bifurcation coming in the market. High-performance computing will go to water-based cooling while the rest of the enterprise data center -- and Internet-based data centers like Microsoft's -- will stay with air but move into locations where space and power costs are cheaper so they can scale out.

Paul Prince, CTO of the enterprise product group at Dell, doesn't think most data centers will hit the power-density wall anytime soon. The average power density per rack is still manageable with room air, and he says hot aisle/cold aisle designs and containment systems that create "super-aggressive cooling zones" will help data centers keep up. Yes, densities will continue their gradual upward arc. But, he says, it will be incremental. "I don't see it falling off a cliff."

At ILM, Clark sees the move to water, in the form of closely coupled cooling, as inevitable. Clark admits that he, and most of his peers, are uncomfortable with the idea of bringing water into the data center. But he thinks that high-performance data centers like his will have to adapt. "We're going to get pushed out of our comfort zone," he says. "But we're going to get over that pretty quickly."

Robert L. Mitchell writes technology-focused features for Computerworld. Follow Rob on Twitter at http://twitter.com/rmitch, send e-mail to rmitchell@computerworld.com or subscribe to his RSS feed.

Copyright © 2010 IDG Communications, Inc.
