The business value arising from Moore’s Law, which says the number of transistors on a chip will double about every two years, is being turned on its head by the rising costs of providing power, cooling and other facility support for servers. Those costs now exceed the price of the computing hardware, says Ken Brill, founder and executive director of The Uptime Institute Inc., a consortium of corporations that run very large data centers. In an interview with Computerworld last week, he talked about those escalating costs and outlined what IT managers can do to improve data center energy efficiency.
What’s the biggest threat facing data centers? The economic breakdown of Moore’s Law.
What do you mean by that? Historically, facilities costs have been 3% of IT’s total budget, but the economic breakdown of Moore’s Law means that facilities costs [including power consumption] are going to be climbing to 5%, 10% and higher. That will change the economics of IT. The business question becomes, Will IT get more money so the increasing portion of the budget that facilities represents doesn’t crowd out other IT initiatives? Or will the increasing facilities [costs] result in curtailing other things? That’s the economic truncation of Moore’s Law.
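Brill's budget-squeeze argument can be made concrete with back-of-the-envelope arithmetic. The sketch below assumes a hypothetical fixed annual IT budget; the 3%, 5% and 10% facilities shares are the figures from the interview:

```python
# Back-of-the-envelope version of the budget squeeze Brill describes.
# The total budget is an assumed, illustrative figure; the facilities
# shares (3%, 5%, 10%) come from the interview.
it_budget = 10_000_000  # assumed annual IT budget, USD

for facilities_share in (0.03, 0.05, 0.10):
    remaining = it_budget * (1 - facilities_share)
    print(f"facilities at {facilities_share:.0%}: "
          f"${remaining:,.0f} left for other IT initiatives")
```

If the overall budget does not grow, every point facilities gains comes directly out of other IT work, which is the "economic truncation" he describes.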
What’s the business cost of the breakdown of Moore’s Law? The business cost is that the return on investment that people think they are going to get is not going to be there.
How can business and facilities representatives work to adapt to increasing facilities costs? The application justification process needs to change so it includes all the costs. Typically, you are looking at just the IT cost of the hardware and the cost of running that hardware.
Companies can’t eliminate the use of larger and denser servers, so how can they change the economics? First, when buying equipment, look not only at performance per dollar, but [also] look at performance per watt. Be sharper on buying. IT has to become conscious of energy efficiency and put pressure on the manufacturers to be more energy-aware. That’s going to benefit everybody in the long term. A second thing is to kill dead servers — servers that are still running but not actively doing anything.
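Comparing performance per watt rather than performance per dollar is simple division; the sketch below uses made-up server names and throughput/power figures purely for illustration:

```python
# Illustrative comparison of "performance per watt" as a buying metric.
# Each entry is (name, throughput in transactions/sec, power draw in watts).
# All numbers are hypothetical, not vendor data.
servers = [
    ("Server A", 12_000, 400),
    ("Server B", 15_000, 650),
]

for name, perf, watts in servers:
    print(f"{name}: {perf / watts:.1f} transactions/sec per watt")
```

In this made-up example the faster box is actually the less efficient one per watt, which is exactly the distinction Brill says raw performance-per-dollar shopping misses.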
How much of an issue are dead servers? From 10% to 30% of the load in a data center is represented by servers that aren’t doing anything. By turning off those servers, you can cut your energy consumption. The problem is there is no incentive — there is risk but no incentive — to turn them off.
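The savings from killing dead servers scale directly with that 10%-to-30% share of the load. A minimal estimate, assuming an illustrative IT load and electricity price (and ignoring the extra cooling savings for simplicity):

```python
# Rough estimate of the savings from turning off dead servers.
# The 10%-30% dead-server range comes from the interview; the load and
# electricity price are assumed, illustrative values. Cooling overhead,
# which would add further savings, is ignored here.
total_load_kw = 500      # assumed data center IT load
annual_hours = 8760      # hours in a year
price_per_kwh = 0.10     # assumed electricity price, USD

for dead_fraction in (0.10, 0.30):
    saved_kwh = total_load_kw * dead_fraction * annual_hours
    print(f"{dead_fraction:.0%} dead servers: "
          f"${saved_kwh * price_per_kwh:,.0f}/year saved")
```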
Why isn’t the incentive to turn off unused servers apparent? Who has to turn the server off? The data center manager. He’s measured on availability, not on costs. You discover the 10% to 30% of dead servers whenever you move a data center, because that’s the only time you have to turn stuff off.
What else can companies do to cut data center costs? Consolidate multiple servers onto a bigger platform, which will be more energy-efficient. [And] IT can enable the power-saving features that are now built into many new servers. Finally, IT managers can reduce “bloatware” — software with inefficient code requiring a bigger processor to get through it.
Are companies wasting money on cooling systems? Most data centers are consuming from 20% to 40% more energy than they should because the cooling systems are not well optimized. For instance, here is a common issue in a computing room with multiple cooling units: You may see that one unit is dehumidifying and the unit immediately adjacent to it is humidifying, so you have dueling cooling units.
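The dueling-units problem is essentially a control-loop mismatch: each unit chases its own humidity setpoint, so when setpoints or sensor calibrations disagree, neighbors fight each other. A minimal sketch of that logic, with assumed setpoint and deadband values:

```python
# Sketch of why adjacent cooling units "duel": each unit independently
# controls to its own relative-humidity setpoint. If setpoints (or sensor
# calibrations) disagree by more than the deadband, one unit humidifies
# while its neighbor dehumidifies. All numbers are assumed for illustration.
def action(rh, setpoint, deadband=2.0):
    """Return what a unit does at measured relative humidity `rh` (%)."""
    if rh < setpoint - deadband:
        return "humidify"
    if rh > setpoint + deadband:
        return "dehumidify"
    return "idle"

room_rh = 48.0
# Two adjacent units with mismatched setpoints duel:
print(action(room_rh, 52.0))                 # -> humidify
print(action(room_rh, 44.0))                 # -> dehumidify
# A coordinated setpoint and wider deadband lets both idle:
print(action(room_rh, 48.0, deadband=5.0))   # -> idle
```

Coordinating setpoints across units (or widening the deadband) is one way such a room stops paying for two machines to cancel each other out.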
What can be done to cost-efficiently cool systems in data centers? In 2000, at 500 watts to 1,000 watts per cabinet, you could do anything and successfully cool it. You could be totally incompetent, [and] you could successfully cool it. You may not have done it energy-efficiently, but that was never measured, so nobody knew how badly it was done. As the density per cabinet increases, the mask is ripped off and a user’s responsibility becomes apparent. For computer rooms with raised floors, the institute has promoted hot aisles and cold aisles for over 10 years. It’s accepted as an optimal solution. But you go into computer room after computer room and you see that the equipment is lined up facing in one direction. As a result, people have hot spots. And if you have hot spots, you go out and buy more air conditioning.