Hadoop pitched for business intelligence

Originally written as Web-indexing software, Hadoop is now being marketed as a tool for complex business intelligence tasks

While it began life as a tool for indexing Web pages, the open source Hadoop framework is being marketed as a platform that can house and analyze vast amounts of data, at volumes that would quickly overwhelm traditional database systems and data warehouses.

Tuesday in New York at the Hadoop World 2010 conference, a number of organizations plan to discuss how the framework could be used within the enterprise. Among the possible uses being discussed: business intelligence (BI).

"Hadoop is a phenomenal number-crunching engine," said Jake Cornelius, who heads up product management at Pentaho, a BI software provider. He admits it wouldn't be used in all cases of BI, but for really large or complex ones, it could come in handy.

"There really is a small subset of scenarios that we think of as big data problems, where you really have to start looking at Hadoop to solve these big problems," Cornelius said.

Others agree. "If you look at large corporations today, they are dropping data on the floor because they don't have a place to put it," said Eric Baldeschwieler, Yahoo's vice president of Hadoop software development. Running on commodity hardware, a Hadoop cluster could provide a low-cost expansive platform for just such data.

A growing number of software companies are offering support for the technology, which could attract more business users. For instance, Yahoo has just released a number of enhancements to make the technology more palatable for enterprise use. On Tuesday, Pentaho released an integration suite for enterprise BI users, called Pentaho for Hadoop.

Yahoo, in fact, is one of Hadoop's biggest users. The company uses the technology in a variety of ways, including as a sort of very large data warehouse, Baldeschwieler said. Hadoop clusters hold massive log files of which stories and sections users click on. Advertisement activity is also stored on Hadoop clusters, as is a listing of all the content and articles Yahoo publishes.

"It is a hugely varied set of stuff, and the challenge is that when you try to build new products it often makes a lot of sense to ask questions that combine all those different things," Baldeschwieler said.

Recently, Yahoo released a number of enhancements to Hadoop to make it more of an enterprise-ready BI platform. For instance, Yahoo has added security features in its own distribution that would allow Hadoop to span across multiple firewalls.

"Before our engineering, the only way you could put sensitive data onto a Hadoop cluster would be to firewall the cluster and control access," Baldeschwieler said.

The company's engineers have also updated Oozie, a Hadoop workflow scheduler, and Pig, a high-level programming environment for running MapReduce jobs.
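Pig scripts are compiled down to MapReduce jobs, the programming model at the heart of Hadoop. The model itself can be illustrated with a toy, single-process word-count sketch in Python; this is a simplified stand-in for what Hadoop distributes across thousands of machines, and the phase names follow the standard map/shuffle/reduce terminology rather than any specific Hadoop API:

```python
from collections import defaultdict

# Map phase: turn each input record into (key, value) pairs.
def map_phase(lines):
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

# Shuffle: group all values by key. On a real cluster, Hadoop
# performs this grouping across the network between map and reduce.
def shuffle(pairs):
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

# Reduce phase: aggregate the grouped values for each key.
def reduce_phase(groups):
    return {key: sum(values) for key, values in groups.items()}

logs = ["hadoop indexes the web", "the web is big"]
counts = reduce_phase(shuffle(map_phase(logs)))
print(counts["the"])  # prints 2
```

Because each phase operates on independent records or keys, the work can be split across many commodity machines, which is what lets Hadoop scale to the log volumes Yahoo describes.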

Yahoo started dedicating large amounts of engineering work to refining Hadoop around 2006, said Doug Cutting, a search specialist who created Hadoop. "Yahoo had lots of interesting data across the company that could be correlated in various ways, but it existed in separate systems," said Cutting, who now works for Hadoop distribution provider Cloudera. Hadoop promised an easy way for Yahoo to do cross-system analysis of data.

Cutting built Hadoop as part of a side project to create open source search software (he named the technology after his son's stuffed elephant). He knew he needed distributed servers, given the scale of the task.

"We realized that doing reliable distributed software was really hard," Cutting said. "Every step in which anything is distributed, there are myriad ways in which it could fail. You have to think about how to handle each failure successfully."

Hadoop offers a unique tool in some circumstances, said Curt Monash of Monash Research. "Hadoop is a great tool for organizing and condensing large amounts of data before it is put into a relational database," he said.
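The "condense before loading" pattern Monash describes can be sketched in a few lines of Python: raw click events are aggregated down to one row per date and page, and only the condensed rows would be loaded into the relational database. The field names and values here are hypothetical, stand-ins for the huge log files a real cluster would process:

```python
from collections import Counter

# Hypothetical raw click events: far too granular to load wholesale
# into a data warehouse. In practice these would be massive log files
# spread across a Hadoop cluster.
events = [
    ("2010-10-12", "/news/hadoop"),
    ("2010-10-12", "/news/hadoop"),
    ("2010-10-12", "/sports"),
    ("2010-10-13", "/news/hadoop"),
]

# Condense: one aggregate row per (date, page) pair.
daily_counts = Counter(events)

# The condensed rows, not the raw events, go into the relational database.
for (date, page), clicks in sorted(daily_counts.items()):
    print(date, page, clicks)
```

The payoff is volume reduction: millions of raw events collapse into a compact summary table that a traditional warehouse can store and query comfortably.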

It is also a good tool for companies to analyze relationships between people or things, a practice often known as "social graph analysis," Monash said. "Traditional relational databases have a difficult time with this, because each hop along the graph exponentially increases the amount of work that needs to be done," he said.
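The hop-by-hop blow-up Monash refers to can be seen in a toy breadth-first traversal. Each hop expands the frontier by every neighbor of every node found so far; in SQL, each extra hop means one more self-join over the relationship table, which is why relational databases struggle as hop counts grow. The graph below is hypothetical:

```python
# Toy social graph (hypothetical): each person's direct connections.
graph = {
    "ann": ["bob", "cara"],
    "bob": ["ann", "dan", "eve"],
    "cara": ["ann", "eve"],
    "dan": ["bob"],
    "eve": ["bob", "cara", "fay"],
    "fay": ["eve"],
}

def reachable_within(graph, start, hops):
    """Return everyone reachable from `start` in at most `hops` hops.

    Each iteration replaces the frontier with the union of all its
    members' neighbors, so the work per hop grows with the fan-out
    of the graph rather than staying constant.
    """
    seen = {start}
    frontier = {start}
    for _ in range(hops):
        frontier = {n for node in frontier for n in graph[node]} - seen
        seen |= frontier
    return seen - {start}

print(sorted(reachable_within(graph, "ann", 2)))
# prints ['bob', 'cara', 'dan', 'eve']
```

On a graph with millions of users, distributing this frontier expansion across a cluster is the kind of job Hadoop's batch model handles naturally.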

But there are tradeoffs with the technology. For one, it may not be suited to real-time data analysis.

Cornelius admits Hadoop has latency issues. Because of its distributed nature, Hadoop is not as fast as other BI systems. But Cornelius and others argue that Hadoop should not be considered an alternative to a transactional database system or a data warehouse, but rather a complement that can handle tasks those technologies would struggle to execute.

"It's not a database. It's a different kind of data storage and analytics platform. If you have a relational database problem, you should go buy Oracle or DB2," agreed Mike Olson, Cloudera CEO. To better pursue the BI market, Cloudera has forged partnerships with Pentaho and data warehouse provider Teradata.

"If you want to combine complex unstructured data from multiple sources, if you want to do sophisticated pattern detection, then Hadoop is your only choice," Olson said.

IDG News Service U.S. correspondent Chris Kanaracus contributed to this report.

Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com

Copyright © 2010 IDG Communications, Inc.
