
Facebook, eBay share energy tips for data centers

By James Niccolai
October 15, 2010 02:53 PM ET

IDG News Service - Facebook and eBay shared some tips this week for improving efficiency and cutting energy bills in data centers.

Some of their tricks will be applicable only for very large data centers -- Facebook persuaded a server vendor to provide it with custom firmware to control fan speeds, for example -- but others are relevant for mere mortals who don't buy servers by the ton.

"What we did here isn't rocket science," said Jay Park, Facebook's director of data center engineering. "We applied very basic thermal theory." He was on a panel discussing "holistic approaches to reducing energy use" at the Silicon Valley Leadership Group's Data Center Efficiency Summit, which took place Wednesday at Brocade's headquarters in Silicon Valley.

Facebook was able to reduce its costs by $230,000 a year at a 56,000-square-foot data center in Santa Clara, Calif., largely through better air flow management, Park said. It also got a $294,000 rebate from its electric utility, Silicon Valley Power, to offset the cost of investments it made in energy efficiency.

The cooling systems that prevent IT gear from overheating are among the biggest costs for a data center, and managing the flow of warm and cold air is key to their efficiency. Facebook collected temperature and humidity readings around the Santa Clara facility and used a computational fluid dynamics program to study the movement of air. A CFD analysis can reveal areas of the data center that are too hot or too cold, and run "what if" scenarios to show what changes might be effective.
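A real CFD study simulates full airflow physics, but the basic check it enables -- flagging spots that are too hot or too cold against a target band -- can be illustrated with a toy sketch. The sensor readings and the acceptable band below are made-up examples, not figures from Facebook's study:

```python
# Toy hot/cold-spot detection from point temperature readings.
# Readings and the target band are hypothetical examples; a real CFD
# analysis models airflow throughout the room, not just sensor points.
readings = {                    # location -> temperature (deg F)
    "aisle-1-top": 88,
    "aisle-1-floor": 64,
    "aisle-2-top": 75,
    "aisle-2-floor": 70,
}
low, high = 68, 81              # hypothetical acceptable band, deg F

for spot, temp_f in readings.items():
    if temp_f > high:
        print(f"{spot}: {temp_f}F -- too hot, possible hot-air recirculation")
    elif temp_f < low:
        print(f"{spot}: {temp_f}F -- overcooled, cooling capacity wasted")
```

In a "what if" scenario, the same kind of comparison is rerun against simulated temperatures after a proposed change, such as containing the cold aisles.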

Facebook had warm air mixing with cold air above and around server aisles, so it did a cold aisle containment project -- closing the tops and ends of server aisles with fire retardant plastic to prevent cold air blowing up through perforated floor tiles from escaping.

The cold aisle containment made its cooling systems more efficient, which allowed it to shut down 15 of its Computer Room Air Handlers (CRAHs), reducing its energy draw by 114 kW. It also enabled it to raise the air temperature at the CRAH inlets from 72 to 81 degrees Fahrenheit.
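As a back-of-the-envelope check, a continuous 114 kW reduction works out to roughly a million kilowatt-hours a year. The electricity rate below is a hypothetical placeholder -- the article does not state what Facebook pays per kWh:

```python
# Rough annual savings from shutting down 15 CRAHs (114 kW reduction).
# The electricity rate is a hypothetical example, not from the article.
power_saved_kw = 114            # reduction reported in the article
hours_per_year = 24 * 365       # continuous, year-round operation
rate_per_kwh = 0.10             # hypothetical rate, $/kWh

energy_saved_kwh = power_saved_kw * hours_per_year   # 998,640 kWh/year
annual_savings = energy_saved_kwh * rate_per_kwh     # ~$99,864/year

print(f"Energy saved: {energy_saved_kwh:,} kWh/year")
print(f"Estimated savings: ${annual_savings:,.0f}/year")
```

At that assumed rate, the CRAH shutdowns alone would account for a large share of the $230,000 in annual savings the article cites.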

It then looked at its servers and decided the fans inside each machine were spinning faster than they needed to. Manufacturers set fan speeds to accommodate maximum server loads, but that's often faster than needed and wastes energy, Park said.

Because it's such a big customer, Facebook could work with its vendor to reduce the fan speed through a firmware update. That saved 3 watts per server, which adds up fast in a large data center.
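To see how quickly 3 watts per server adds up, here is a hedged sketch; the fleet size and electricity rate are hypothetical, since the article gives neither:

```python
# How a 3 W per-server saving scales across a fleet.
# Server count and electricity rate are hypothetical examples.
watts_per_server = 3                      # saving reported in the article
server_count = 10_000                     # hypothetical fleet size
rate_per_kwh = 0.10                       # hypothetical rate, $/kWh

total_kw = watts_per_server * server_count / 1000   # 30 kW continuous
annual_kwh = total_kw * 24 * 365                    # 262,800 kWh/year
annual_dollars = annual_kwh * rate_per_kwh          # ~$26,280/year

print(f"{total_kw:.0f} kW continuous, {annual_kwh:,.0f} kWh/year, "
      f"${annual_dollars:,.0f}/year")
```

The per-server figure is tiny, but at fleet scale it also reduces the heat the cooling plant must remove, so the true saving is somewhat larger than the raw fan power alone.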

Facebook has some advantages because of its size. It has long-term leases for most of its data centers and is often the sole occupant, so it can negotiate with owners and doesn't have to worry about other tenants' equipment.

Reprinted with permission from IDG.net. Story copyright 2014 International Data Group. All rights reserved.