Turning Up the Heat to Save Energy

Scottrade found that its overcooled data center was a resource drain.

The temperature's rising in online brokerage Scottrade Inc.'s data center -- and that's a good thing. The move has allowed the St. Louis-based company to reap enormous energy savings while increasing reliability.

Six months ago, CIO Ian Patterson hired the engineering firm Glumac to construct a computational fluid dynamics (CFD) model of Scottrade's data center. The model provided a complete picture of thermal airflows.

Samuel Graves, chief data center mechanical engineer at Glumac, oversaw the effort. "Much can be learned from a thermal CFD model, and going forward, the model becomes an excellent tool to help determine the effectiveness of potential solutions," he says.

As is the case in many large data centers, Scottrade was overcooling the room. The solution: Fix the airflow problems and hot zones in its hot aisle/cold aisle configuration and turn up the computer room air conditioning (CRAC) unit's thermostat. That sounds scary, but Patterson says implementing the recommendations cut power consumption by 8% and improved equipment reliability -- all without affecting the performance of the data center.

Power and cooling infrastructures are a large piece of the data center's overall operating cost. The hard dollar savings from some fairly straightforward changes were "significant," Patterson says.
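
To put that 8% figure in rough dollar terms, here is a minimal back-of-envelope sketch in Python. The facility load and electricity rate are hypothetical assumptions chosen purely for illustration; only the 8% reduction comes from the article.

    # Back-of-envelope estimate of annual savings from an 8% cut in power draw.
    # The load and rate below are hypothetical, not Scottrade's actual figures.
    facility_load_kw = 1000      # assumed average facility draw (hypothetical)
    electricity_rate = 0.08      # assumed $ per kWh (hypothetical)
    hours_per_year = 24 * 365

    baseline_cost = facility_load_kw * hours_per_year * electricity_rate
    savings = baseline_cost * 0.08   # the 8% cut Patterson reports

    print(f"Baseline annual energy cost: ${baseline_cost:,.0f}")
    print(f"Estimated annual savings at 8%: ${savings:,.0f}")

At those assumed numbers, the reduction works out to roughly $56,000 a year, and the result scales linearly with the actual load and electricity rate.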

Scottrade didn't reap those savings by retrofitting an old, poorly designed facility. Quite the contrary: Patterson achieved the efficiency gains in a state-of-the-art, 34,000-square-foot data center that Scottrade had rolled out in 2007. Nor were the cost benefits limited to power and cooling bills: Scottrade also cut the load on its backup power systems and reduced the number of backup batteries it needed.

The savings that Scottrade achieved are actually on the low side, says Graves. "Scottrade was already doing a lot of things right," he adds, noting that Glumac has seen some data centers that achieve a 25% decrease in cooling costs when tuned properly.

The CFD model identified three key areas for improving efficiency. First, it found that a "thermocline," or plane of warmer air, was floating in the upper half of the data center space. That hot layer started at a height of about five and a half to six feet and extended all the way to the 10-foot ceiling. Thus, the equipment at the top of Scottrade's racks sat in the hot-air cloud.

The second issue was the configuration of the racks themselves. Not all racks were fully populated, but equipment was always concentrated at the top of the racks, where it was subject to those higher temperatures. In fact, says Patterson, the hottest-running servers tended to be mounted at the top, where cooling efficiency was lowest. To address that, Scottrade had lowered the CRAC system temperature settings, overchilling the rest of the room.
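
A short sketch, under assumed rack dimensions, shows how much of a rack sat inside that warm layer: with standard 1.75-inch rack units and a hypothetical 45U cabinet, roughly the top eight units fall at or above the 5.5-foot thermocline the CFD model identified.

    # Which rack units sit inside the warm-air layer?
    # 1.75 in per rack unit is standard; the 45U cabinet is a hypothetical example.
    # The 5.5-foot thermocline height comes from the CFD findings cited above.
    U_HEIGHT_IN = 1.75
    RACK_UNITS = 45
    THERMOCLINE_FT = 5.5

    warm_units = [u for u in range(1, RACK_UNITS + 1)
                  if (u * U_HEIGHT_IN) / 12 >= THERMOCLINE_FT]

    print(f"Units at or above the warm layer: U{warm_units[0]}-U{warm_units[-1]}")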
