When it comes to storage, Facebook is learning to do more with less.
For backup copies of older content, the social network is building "cold storage" facilities that are designed to keep data available without some of the expensive, power-sucking features found in a traditional data center. Facebook says it's built in strong protection against data loss while reducing the overhead of additional storage.
These are data centers designed to hold more than an exabyte of data -- 1,000 petabytes -- with no redundant electrical systems, while consuming less than one-sixth as much power as a conventional facility. And they store all that data on cheap, consumer-grade media.
At the heart of the system is software, including a program that keeps most disks idle most of the time and another that can reconstruct a lost or corrupted file without a full duplicate stored in the data center, the company says.
When Facebook builds a user's News Feed or Timeline, it gets the photos, videos and other elements in it from so-called "hot" storage in data centers located around the world. If a data center with hot storage fails, another one can take over and deliver the bits without users noticing, said Kestutis Patiejunas, a Facebook software engineer.
Cold storage comes in when that failed data center has to be brought back online and all its data restored. In the past year, Facebook has opened two such facilities, in Prineville, Oregon, and Forest City, North Carolina. Facebook had a chance to design and build them with new technology from the ground up, and it implemented new ways of saving space and power.
The centers initially have been built around racks of 4TB hard disk drives, though Facebook could use the same techniques with Blu-ray discs and cheap flash, two other kinds of media that it's exploring for cold storage, Patiejunas said.
Facebook designed the hardware using the Open Vault specification from its Open Compute Project but modified it for the new centers.
For one thing, the company set up the systems so that in each tray, only one hard drive can be running at any given time, and wrote software that orders data access to enforce that limit, Patiejunas said. With fewer disks spinning, the system can get by on less power and stay cool with fewer fans, while the task set out for the cold storage -- replenishing data to the hot storage that feeds users directly -- still runs fast enough.
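The one-disk-per-tray rule amounts to a scheduling constraint. A minimal sketch of such a scheduler is below; the tray/disk request model and the `schedule` function are illustrative assumptions, not Facebook's actual software:

```python
# Sketch of tray-serialized disk access. Assumption: requests are
# (tray_id, disk_id) pairs; real cold-storage scheduling is more complex.
from collections import defaultdict, deque

def schedule(requests):
    """Group requests into rounds so that each round touches at most
    one disk per tray -- only that disk needs to be spinning."""
    queues = defaultdict(deque)
    for tray, disk in requests:
        queues[tray].append((tray, disk))
    rounds = []
    while any(queues.values()):
        # In each round, serve the next pending request from every tray.
        current = []
        for tray in sorted(queues):
            if queues[tray]:
                current.append(queues[tray].popleft())
        rounds.append(current)
    return rounds
```

Two requests for the same tray land in different rounds, while requests for different trays can proceed in parallel, which is why throughput stays acceptable even with most disks idle.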
Each storage node has just four fans, down from the usual six. Power supplies, which are concentrated on their own shelves in the rack, are also cut down: There's just one power shelf in a rack instead of three, and that rack has just five power supplies instead of seven.
Facebook also found a way to protect against data loss without having to keep multiple extra copies of each file. Instead of storing full copies of all the data, it adds redundancy mathematically, using Reed-Solomon coding, a decades-old technique also used in RAID systems, which breaks data into pieces plus parity so that the whole can be reconstructed from just some of the parts.
Facebook implemented this across multiple systems, so a failure that takes a drive offline in one part of the facility can be corrected with data from another area. Reconstructing the data takes compute cycles, and doing it across many systems takes network capacity, but Facebook wanted those options in its toolkit in addition to just adding more storage, Patiejunas said.
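The core idea can be shown with a toy Reed-Solomon-style erasure code over a small prime field. This is a sketch for illustration only -- the field choice, shard counts, and function names are assumptions, not Facebook's implementation:

```python
# Toy Reed-Solomon-style erasure code over the prime field GF(257).
# Each byte position is encoded independently: the k data bytes define a
# polynomial, and parity shards are that polynomial evaluated at extra points.
P = 257  # prime, so every nonzero element has a modular inverse

def _interp(points, x):
    # Lagrange interpolation: evaluate, at x, the unique polynomial
    # through the given (xi, yi) points, mod P.
    total = 0
    for i, (xi, yi) in enumerate(points):
        num = den = 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def encode(data_shards, n):
    # data_shards: k equal-length lists of byte values; returns n shards:
    # the k originals plus n - k parity shards.
    k = len(data_shards)
    length = len(data_shards[0])
    parity = [[_interp([(i, data_shards[i][pos]) for i in range(k)], x)
               for pos in range(length)]
              for x in range(k, n)]
    return data_shards + parity

def reconstruct(available, k):
    # available: dict shard_index -> shard; ANY k shards suffice.
    pts = sorted(available.items())[:k]
    length = len(pts[0][1])
    return [[_interp([(x, s[pos]) for x, s in pts], i)
             for pos in range(length)]
            for i in range(k)]
```

Encoding two data shards into four total shards, then losing both originals, still allows full recovery from the two parity shards alone -- which is the property that lets a drive failure in one part of the facility be corrected with data from another.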
With this so-called erasure coding, the company can provide the equivalent of seven or eight extra copies of each bit of data while using just 1.4 times the capacity that a single copy of the data would take up. In other words, Facebook calculates it can protect the content as well as it could with multiple full backups, without keeping even one complete extra copy to turn to in case of failure.
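The 1.4x figure is consistent with, for example, a layout of 10 data shards plus 4 parity shards; the exact shard counts are an assumption here, since the article doesn't give them:

```python
# Hypothetical 10 data + 4 parity erasure-coding layout; the shard
# counts are an assumed example chosen to match the stated 1.4x overhead.
data_shards = 10
parity_shards = 4
overhead = (data_shards + parity_shards) / data_shards  # 1.4x raw capacity
tolerated_losses = parity_shards  # any 4 of the 14 shards can fail
```

Full replication with the same fault tolerance would instead need five complete copies, or 5x the capacity.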
Meanwhile, because cold storage holds older content that users may not be looking at much anymore, Facebook runs software in the background to scan all data for "bit rot," a kind of corruption that can happen while bits sit unused.
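A bit-rot scan of this kind boils down to recomputing each file's checksum and comparing it with one recorded at write time. The sketch below assumes a SHA-256 checksum store; the article doesn't describe Facebook's actual mechanism:

```python
# Sketch of a background bit-rot scan: flag files whose bytes no longer
# match the checksum recorded when they were written. The checksum
# algorithm and data model are assumptions for illustration.
import hashlib

def checksum(data):
    return hashlib.sha256(data).hexdigest()

def scan(files, stored_checksums):
    """files: dict name -> bytes. Returns names whose contents have
    silently changed and should be rebuilt from erasure-coded parity."""
    corrupted = []
    for name, data in files.items():
        if checksum(data) != stored_checksums.get(name):
            corrupted.append(name)
    return corrupted
```

Because cold data is rarely read by users, this scan is the only thing that would notice corruption before the data is actually needed for a restore.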
The scale of all this is big, and getting bigger. The two cold storage centers already hold hundreds of petabytes of data, and just one "data hall" -- one of the big rooms within each center -- ultimately can hold as much as one exabyte. The system is designed to stay just as efficient as it grows to that scale, Facebook says.