IT systems run in exposed shed to prove reliability point

Through winter, spring and summer temperatures, the equipment keeps running. It even survives a sawdust challenge.

In an experiment that began in January, servers, networking gear and storage systems have been running in a simple shed without failure.

This experiment is giving David Filas, a data center engineer at healthcare provider Trinity Health, the ammunition he needs to argue that IT equipment is a lot tougher than most people think.

Through winter, spring and summer, these decommissioned systems have continued to run despite large variations in temperature and humidity. And the uptime of the systems has been better than what Google and Amazon.com have delivered so far this year.

Filas wants to convince IT administrators at his company, which runs 47 hospitals and other healthcare facilities, that it's OK to raise the temperature in data centers. But the IT staff has been reluctant to do so, he said.

The project was inspired by something Microsoft did a few years back. From November 2007 to June 2008, Microsoft employees ran five Hewlett-Packard servers in a tent and reported "zero failures, or 100% uptime."

Filas is running his equipment in a generator shed at the healthcare provider's headquarters in Novi, Mich., a suburb of Detroit.

A block heater on the generator provides some warmth, but otherwise, the equipment is "more or less exposed to the same temperature and humidity conditions as the outdoors," said Filas, who presented his work at the Afcom data center conference last week in Orlando.

The temperature inside the shed has ranged from 31 degrees Fahrenheit to nearly 105 degrees, and the relative humidity from nearly 8% to about 83%. The door of the shed has also been accidentally left open a few times, including once when the outside temperature reached 5 degrees below zero.
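The article doesn't say how those shed conditions were tracked. Purely as an illustration, a minimal logging sketch in Python might look like the following, with read_shed_sensor() standing in as a hypothetical placeholder for whatever probe is actually in use.

    # Minimal environmental-logging sketch. read_shed_sensor() is a
    # hypothetical placeholder; swap in the real probe's API.
    import csv
    import random
    import time
    from datetime import datetime

    def read_shed_sensor():
        # Placeholder returning simulated (temp_F, relative_humidity_%).
        return random.uniform(31.0, 105.0), random.uniform(8.0, 83.0)

    def log_conditions(path="shed_conditions.csv", interval_s=300):
        lo_t, hi_t = float("inf"), float("-inf")
        with open(path, "a", newline="") as f:
            writer = csv.writer(f)
            while True:  # runs until interrupted, like a logger daemon
                temp_f, rh = read_shed_sensor()
                lo_t, hi_t = min(lo_t, temp_f), max(hi_t, temp_f)
                writer.writerow([datetime.now().isoformat(), temp_f, rh])
                f.flush()
                print(f"Temperature range so far: {lo_t:.1f}F to {hi_t:.1f}F")
                time.sleep(interval_s)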

Filas even tossed sawdust in the shed to make a point about the ability of these systems to handle dust. The dust issue pops up when arguments are made for using outside air to cool data centers, he said.

"I'm trying to dispel the myth that the data center has to be a clean room, because it doesn't," said Filas. "Today's electronics are extremely resilient."

The equipment running in the experiment was pulled out of production three to four years ago. There are about a dozen pieces of equipment in the test, including HP servers, Cisco switches and an IBM disk array.

The plan had been to keep the systems running until January, but Filas said he may extend the experiment and add some workloads to address criticism that idle systems don't make for a true test. He is considering networking the equipment and putting it under a heavy load.
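He hasn't described what that load would look like. One common way to stress decommissioned servers is simply to saturate every CPU core for a sustained period, sketched below in Python; this is an illustrative approach, not Filas's actual test plan.

    # Load-generation sketch: keep every core busy until a deadline.
    import multiprocessing as mp
    import time

    def burn(stop_at):
        # Busy-loop on throwaway integer arithmetic until the deadline.
        x = 0
        while time.time() < stop_at:
            x = (x * 31 + 7) % 1_000_003
        return x

    def load_all_cores(seconds):
        stop_at = time.time() + seconds
        with mp.Pool(mp.cpu_count()) as pool:
            pool.map(burn, [stop_at] * mp.cpu_count())

    if __name__ == "__main__":
        load_all_cores(seconds=60)  # short demo; extend for a soak test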

Filas didn't expect the systems to fail, but he said he has nonetheless been surprised by how well the mechanical components have held up -- the hard drives in particular. There hasn't been a single hard-drive failure, he noted.

The Cisco gear has been especially resilient, Filas added. Some of it has a manufacturer's upper temperature limit of 104 degrees, but Filas said he knows from experience that the systems can handle much more.

"Through unfortunate cooling outages, we have even had everything in a data center shut down except the Cisco equipment. The temperature was 117 degrees, and the Cisco equipment was purring away," Filas said.

Filas argues that there is no reason why the inlet temperatures on equipment cannot be between 80 and 82 degrees, which has been his goal in the main data center. He said he considers that an ideal range for a data center, adding that it even includes a little bit of a safety margin.

The American Society of Heating, Refrigerating and Air-Conditioning Engineers has also been raising its recommended temperatures as data center equipment has improved; its current recommended upper limit is 81 degrees, and it is expected to raise the ranges again.
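By way of illustration, checking inlet readings against that kind of envelope is a short script in most languages. The sketch below compares hypothetical sample readings with the 81-degree figure cited above.

    # Inlet-temperature check against a recommended upper limit.
    # The readings dict is hypothetical sample data.
    ASHRAE_UPPER_F = 81.0

    def over_limit(inlet_temps_f, limit_f=ASHRAE_UPPER_F):
        # Return only the sensors reading above the limit.
        return {name: t for name, t in inlet_temps_f.items() if t > limit_f}

    readings = {"rack01-inlet": 78.4, "rack02-inlet": 82.1, "rack03-inlet": 80.9}
    for name, temp in over_limit(readings).items():
        print(f"ALERT: {name} at {temp:.1f}F exceeds the {ASHRAE_UPPER_F:.0f}F limit")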

"I want my IT staff to be more comfortable with the higher temperatures. They are accustomed to having it being 65 degrees in the data center, and they get very nervous when I dial up the temperature, even to the mid-70s," Filas said. "I'm trying to dispel the myth among my own staff that it has to be that cold, because it doesn't."

Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is pthibodeau@computerworld.com.
