
Facebook's big data plans include warehouses, faster analytics

At an industry conference, an engineer revealed how the site is working to make its back-end data processing more efficient

By Zach Miners
April 30, 2013 09:06 PM ET

IDG News Service - Facebook may treasure the data it holds on its one billion-plus users for the advertising returns it brings, but the analysis the site performs on that data is expected to keep posing challenges over the coming year, an engineer said.

The problems, which Facebook has been forced to grapple with "much sooner than the broader industry," include finding more efficient ways to process user behavior on the site, better ways to access and consolidate different types of data across Facebook's multiple data centers, and new open source software systems to process that data, Ravi Murthy, who manages Facebook's analytics infrastructure, said Tuesday.

"Facebook is a data company, and the most obvious thing people think of on that front is ads targeting," he said at an industry conference in San Francisco, during a talk on Facebook's back-end infrastructure, data analytics and open source projects.

"But it goes deeper than this," he said.

One major area of behind-the-scenes work is Facebook's analytics infrastructure, which is designed to accelerate product development and improve the user experience through deep analysis of all the available data, whether that means the actions users take on the site, such as posting status updates, or the applications they use within Facebook on different devices.

Facebook uses several open source software systems, known as Hadoop, Corona and Prism, to process and analyze its data. The company will focus on making those systems faster and more efficient over the next six to twelve months, Murthy said.
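Murthy didn't walk through any code, but the batch-processing model behind Hadoop is easy to illustrate. The following Python sketch mimics the map and reduce phases of a Hadoop job over a few made-up activity-log records; the records, field layout and counts here are hypothetical, not Facebook's.

    from collections import defaultdict

    # Hypothetical activity-log records; Facebook's real inputs are not public.
    log_lines = [
        "u1 status_update",
        "u2 photo_tag",
        "u1 app_open",
        "u2 photo_tag",
    ]

    def map_phase(line):
        """Emit (action, 1) pairs, one per record, MapReduce-style."""
        user, action = line.split()
        yield (action, 1)

    def reduce_phase(pairs):
        """Sum the counts for each key, as a Hadoop reducer would."""
        counts = defaultdict(int)
        for key, value in pairs:
            counts[key] += value
        return dict(counts)

    pairs = [pair for line in log_lines for pair in map_phase(line)]
    print(reduce_phase(pairs))  # {'status_update': 1, 'photo_tag': 2, 'app_open': 1}

In a real Hadoop job the map and reduce functions run in parallel across many machines, with the framework shuffling the intermediate pairs between them.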

Many of the company's challenges are tied to what Facebook calls its data warehouse, which combines data from multiple sources into a single database where user activity can be analyzed in the aggregate, such as through a daily report on the number of photos tagged in a specific country, or a look at how many users in a certain area have engaged with pages that were recommended to them.
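Facebook's warehouse queries run on Hive, a SQL-like layer over Hadoop that the company originally developed. A toy version of that photos-tagged-per-country report can be sketched with Python's built-in sqlite3 module; the table name, schema and rows below are invented for illustration.

    import sqlite3

    # Hypothetical, simplified schema; the real warehouse runs similar
    # aggregations over petabytes with Hive rather than SQLite.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE photo_tags (day TEXT, country TEXT, photo_id INTEGER)")
    db.executemany(
        "INSERT INTO photo_tags VALUES (?, ?, ?)",
        [("2013-04-30", "US", 1), ("2013-04-30", "US", 2), ("2013-04-30", "BR", 3)],
    )

    # A daily report on the number of photos tagged in each country.
    query = "SELECT day, country, COUNT(*) FROM photo_tags GROUP BY day, country"
    for row in db.execute(query):
        print(row)  # e.g. ('2013-04-30', 'BR', 1) and ('2013-04-30', 'US', 2)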

The analysis is designed to optimize the user experience and find out what users like and don't like, but it is also becoming more taxing as Facebook gains access to more and more data about its users, Murthy said. Currently, the warehouse takes in 500 terabytes of new data every day, or 500,000 gigabytes. It has grown nearly 4,000-fold over the last four years, "way ahead of Facebook's user growth," Murthy said.
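Taken at face value, those figures imply steep compound growth. A quick back-of-the-envelope check (using only the approximate numbers Murthy quoted) puts the implied growth rate at roughly eightfold per year:

    # Figures quoted in the talk, not independent measurements.
    daily_intake_tb = 500                   # new data per day, in terabytes
    print(daily_intake_tb * 1000)           # 500,000 gigabytes per day

    growth_factor = 4000                    # ~4,000-fold growth over four years
    print(round(growth_factor ** 0.25, 1))  # ~8.0x per year, compounded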

To deal with these issues, Facebook has developed its Prism software system, which is designed to perform key analysis functions across the company's data centers worldwide and to split analyses into "chunks," Murthy said. That way, running an analysis on, say, a metric related to users' news feeds won't bog down the warehouse as a whole.
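Facebook hasn't published Prism's internals, so any code can only gesture at the idea. The sketch below shows one generic way to split workloads into chunks across data centers: hash a logical namespace (say, news-feed metrics) to pick which cluster runs it, keeping one heavy analysis from monopolizing the whole warehouse. All names here are hypothetical.

    import hashlib

    CLUSTERS = ["dc-east", "dc-west", "dc-europe"]  # hypothetical data centers

    def assign_cluster(namespace: str) -> str:
        """Deterministically route a warehouse 'chunk' to one cluster."""
        digest = int(hashlib.md5(namespace.encode()).hexdigest(), 16)
        return CLUSTERS[digest % len(CLUSTERS)]

    def run_analysis(namespace: str, query: str) -> None:
        cluster = assign_cluster(namespace)
        # A real system would hand the job to that cluster's scheduler.
        print(f"Running {query!r} for chunk {namespace!r} on {cluster}")

    run_analysis("news_feed", "daily engagement metric")
    run_analysis("photo_tags", "tags per country")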
