Facebook's big data plans include warehouses, faster analytics
An engineer reveals at an industry conference how the site is working to make its back-end data processing more efficient
IDG News Service - Facebook may treasure the data it has on its one billion-plus users for the advertising returns it brings, but the analysis the site performs on that data will continue to pose numerous challenges over the coming year, an engineer said.
The problems, which Facebook has been forced to grapple with "much sooner than the broader industry," include processing user behavior on the site more efficiently, accessing and consolidating different types of data across Facebook's multiple data centers, and devising new open source software systems to process that data, Ravi Murthy, who manages Facebook's analytics infrastructure, said Tuesday.
"Facebook is a data company, and the most obvious thing people think of on that front is ads targeting," he said at an industry conference in San Francisco, during a talk on Facebook's back-end infrastructure, data analytics and open source projects.
"But it goes deeper than this," he said.
One major area of behind-the-scenes work relates to Facebook's analytics infrastructure, which is designed to accelerate product development and improve the user experience through deep analysis of all the available data, whether that data consists of actions users take on the site, such as posting status updates, or of which applications they use within Facebook on different devices.
Facebook uses several open source software systems, known as Hadoop, Corona and Prism, to process and analyze its data; the company will focus on making these systems faster and more efficient over the next six to twelve months, Murthy said.
Many of the company's challenges are tied to what Facebook refers to as its data warehouse, which combines data from multiple sources into a database where user activity can be analyzed in the aggregate, such as by giving a daily report on the number of photos that have been tagged in a specific country, or looking at how many users in a certain area have engaged with pages that were recommended to them.
The analysis is designed to optimize the user experience and find out what users like and don't like, but it is also becoming more taxing as Facebook gains access to more and more data about its users, Murthy said. Currently, the Facebook warehouse takes in 500 terabytes of new data every day, or 500,000 gigabytes. The warehouse has grown nearly 4,000-fold in size over the last four years, "way ahead of Facebook's user growth," Murthy said.
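The kind of aggregate report described above, such as a daily count of photos tagged per country, boils down to a group-and-count over event records. A minimal sketch in Python illustrates the idea (the event records and field names here are hypothetical; Facebook's actual warehouse runs such aggregations as queries over Hadoop at vastly larger scale):

```python
from collections import Counter

# Hypothetical sample of one day's photo-tagging events; each record
# carries the country of the user who tagged the photo.
tag_events = [
    {"photo_id": 1, "country": "US"},
    {"photo_id": 2, "country": "BR"},
    {"photo_id": 3, "country": "US"},
    {"photo_id": 4, "country": "IN"},
]

def photos_tagged_per_country(events):
    """Aggregate tagging events into a per-country count for the day."""
    return Counter(e["country"] for e in events)

print(photos_tagged_per_country(tag_events))
# Counter({'US': 2, 'BR': 1, 'IN': 1})
```

The same shape of computation covers the other example Murthy gives, counting how many users in an area engaged with recommended pages: only the grouping key and the event type change.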
To deal with these issues, Facebook has developed its Prism software system, which is designed to perform key analysis functions across the company's data centers worldwide, and split up the analyses into "chunks," Murthy said. That way, performing an analysis on, say, some metric related to users' news feeds won't clog up the warehouse more generally.
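The chunking idea behind Prism is essentially scatter-gather: run the analysis separately on each data center's slice of the data, then merge the partial results, so no single query monopolizes the central warehouse. A rough sketch, with hypothetical data center names, event data, and function names (this is an illustration of the general pattern, not Prism's actual interface):

```python
from collections import Counter

# Hypothetical per-data-center slices of a news-feed activity log.
# In the Prism model, each chunk is analyzed where the data lives.
data_centers = {
    "dc-east": ["feed_view", "feed_view", "comment"],
    "dc-west": ["feed_view", "like"],
}

def analyze_chunk(events):
    """Run the metric locally on one data center's chunk of events."""
    return Counter(events)

def run_split_analysis(chunks_by_dc):
    """Merge per-chunk partial results into one global answer."""
    total = Counter()
    for events in chunks_by_dc.values():
        total += analyze_chunk(events)
    return total

print(run_split_analysis(data_centers))
# Counter({'feed_view': 3, 'comment': 1, 'like': 1})
```

Because each partial result is computed independently, a heavy analysis on one metric stays confined to its chunks instead of contending with every other job in the warehouse.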