Google claims MapReduce sets data-sorting record, topping Yahoo, conventional databases
Computerworld - Google Inc. late last week said that results of in-house data-sorting tests bolster its claim that its MapReduce technology can manipulate more data faster than any conventional database.
According to a Friday afternoon blog post by Grzegorz Czajkowski, a member of Google's systems infrastructure team, MapReduce recently sorted 1 terabyte of data in 68 seconds, about a third of the time Yahoo Inc. needed in a similar test this summer.
Sorting or rearranging data is one of the most basic functions of a spreadsheet, database or other data-manipulation software.
Google used 1,000 servers running MapReduce in parallel to sort the data, versus 910 for Yahoo, according to Czajkowski.
Google also tested MapReduce's ability to sort 1 petabyte, or 1,000 TB, of data. That is equivalent to 12 times the amount of archived Web data in the U.S. Library of Congress as of May 2008, according to Google.
Using 4,000 servers, which is likely a small fraction of Google's entire worldwide server infrastructure, MapReduce took 6 hours, 2 minutes to sort 1PB, according to Czajkowski.
"We're not aware of any other sorting experiment at this scale and are obviously very excited to be able to process so much data so quickly," he wrote.
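The general pattern behind a distributed sort of this kind can be illustrated on a small scale. The sketch below is a hypothetical single-machine simulation, not Google's actual implementation: a map phase range-partitions keys into buckets so that every key in bucket *i* is smaller than every key in bucket *i+1*, each "reducer" then sorts its own bucket independently (on a real cluster, in parallel across machines), and concatenating the buckets yields a globally sorted result.

```python
import random

def map_phase(records, num_buckets, max_key):
    # Range-partition: each key lands in the bucket owning its key range,
    # so bucket i holds only keys smaller than any key in bucket i+1.
    buckets = [[] for _ in range(num_buckets)]
    for key in records:
        idx = min(key * num_buckets // (max_key + 1), num_buckets - 1)
        buckets[idx].append(key)
    return buckets

def reduce_phase(buckets):
    # Each "reducer" sorts only its own bucket; because the buckets are
    # range-partitioned, simple concatenation produces a global sort.
    result = []
    for bucket in buckets:
        result.extend(sorted(bucket))
    return result

data = [random.randrange(1_000_000) for _ in range(10_000)]
out = reduce_phase(map_phase(data, num_buckets=8, max_key=999_999))
assert out == sorted(data)
```

Because each bucket is sorted independently, adding machines shrinks the per-bucket work, which is why throughput scales with cluster size in experiments like the ones described above.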
Czajkowski did not say when the tests were done. He did reveal that as of early January this year, Google was processing an average of 20 PB total per day.
Google's announcement appeared to be deliberately timed to coincide with a speech by noted database expert and MapReduce critic David DeWitt.
A former computer science professor at the University of Wisconsin, Madison, DeWitt joined Microsoft this spring to run a new research lab being created on the Madison campus.
The lab will focus on helping Microsoft's SQL Server "scale out" in order to run on hundreds or thousands of servers at a time. That will allow customers to run parallel database clusters technically similar to Google's, though nowhere near the latter's scale.
Early this year, DeWitt, along with database industry legend Michael Stonebraker, co-wrote a blog post arguing that MapReduce was a "sub-optimal ... not novel" type of database that lacked many features modern database administrators and developers take for granted, and that it was unworthy of the hype it has received.
In an interview last week with Computerworld, DeWitt praised MapReduce's scalability and hardiness.
But DeWitt also stood firm on MapReduce's shortcomings. He and Stonebraker are submitting a paper to the Association for Computing Machinery (ACM) that compares the performance of several databases, including IBM's DB2 and Stonebraker's Vertica, with MapReduce and Apache Hadoop, a similar nonrelational data engine. That paper may be publicly available as early as late January, said DeWitt.
DeWitt gave a keynote speech on Friday at the Professional Association for SQL Server's (PASS) conference in Seattle.
He did not directly criticize MapReduce during his PASS keynote speech, according to blog reports.