With the Hadoop Summit taking place this week in San Jose, California, vendors supporting the open-source data-analysis platform are rushing new products to market.
Over 2,100 attendees are expected at the conference, which is sponsored by IT heavyweights such as Cisco, Facebook, IBM, Microsoft, Splunk and VMware. This is the fifth annual summit, and while last year's conference felt like it was aimed at developers, this year's event is far more customer-focused, observed Jack Norris, vice president of marketing for Hadoop distributor MapR, speaking from the conference show floor.
Created in 2005 to analyze large volumes of Web traffic logs, Hadoop is increasingly being used to analyze swaths of unstructured data too large and unwieldy to be crammed into a relational database or enterprise data warehouse -- data often referred to as big data. In survey results released Tuesday by IT consulting company Capgemini, 58 percent of 600 senior business and IT executives said they plan to invest in big data systems, such as Hadoop, over the next three years.
Apache Hadoop itself is an open-source project, so many of the new vendor-led enhancements focus on making the software easier to use and deploy, as well as more compatible with other software.
Updates have arrived this week for all three major Hadoop distributions -- Cloudera, Hortonworks and MapR -- as well as for adjacent data analysis technologies from companies such as Teradata and Pentaho.
MapR Technologies has released the second major version of its Hadoop distribution. The release comes in two editions: the basic M3 edition, which supports the Network File System (NFS) for easier deployment, and the M5 edition, configured for high-availability (HA) use.
Version 2 of the distribution is MapR's first to support multitenancy, which can offer a number of benefits. The management software can now support multiple clusters, giving administrators the ability to logically partition a physical cluster for different tasks. "When you start to expand the number of uses, being able to logically separate those becomes a big requirement," Norris said.
Multitenancy also lets administrators specify which nodes a particular job can run on, Norris said. "Certain data would benefit from being on certain hardware," such as solid-state disks, he said.
The software also consolidates the log data from every node onto a single node, which can aid troubleshooting, either through customized analysis tools or through a set of histograms and bar charts the software now offers.
"If a job is taking a lot longer to run, the administrator wants to see why," Norris said.
MapR also announced that its Hadoop distribution is now available through Amazon Web Services' Elastic MapReduce (EMR) service. MapR's is the first third-party distribution that Amazon has offered on EMR, Norris said. Organizations could use the company's M5 edition on Amazon as a way of backing up an internal Hadoop deployment in the cloud, or of moving internal jobs to Amazon for additional bursts of processing power.
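Amazon's EMR API lets callers request third-party products when a cluster is launched. As a rough sketch of how an M5 burst cluster might be started programmatically, the Java snippet below uses the AWS SDK for Java's Elastic MapReduce client; the "mapr-m5" product identifier, the instance types and the S3 log path are assumptions for illustration, not values confirmed by MapR or Amazon.

    // A minimal sketch using the AWS SDK for Java; the supported-product
    // identifier, instance types and S3 path below are assumptions.
    import com.amazonaws.services.elasticmapreduce.AmazonElasticMapReduceClient;
    import com.amazonaws.services.elasticmapreduce.model.JobFlowInstancesConfig;
    import com.amazonaws.services.elasticmapreduce.model.RunJobFlowRequest;
    import com.amazonaws.services.elasticmapreduce.model.RunJobFlowResult;

    public class LaunchMapRCluster {
        public static void main(String[] args) {
            AmazonElasticMapReduceClient emr = new AmazonElasticMapReduceClient();

            RunJobFlowRequest request = new RunJobFlowRequest()
                    .withName("mapr-m5-burst")                // hypothetical cluster name
                    .withSupportedProducts("mapr-m5")         // assumed MapR M5 identifier
                    .withLogUri("s3://my-bucket/emr-logs/")   // hypothetical S3 log path
                    .withInstances(new JobFlowInstancesConfig()
                            .withMasterInstanceType("m1.large")
                            .withSlaveInstanceType("m1.large")
                            .withInstanceCount(5)
                            .withKeepJobFlowAliveWhenNoSteps(true));

            RunJobFlowResult result = emr.runJobFlow(request);
            System.out.println("Started job flow: " + result.getJobFlowId());
        }
    }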
Cloudera and Hortonworks have also released new Hadoop packages this week. Cloudera has extended its package to handle data-processing algorithms beyond the default MapReduce now in use. For its first full commercial release, Hortonworks emphasized a full set of lifecycle-management tools, as well as a metadata catalog that should ease interoperability with other data-analysis software.
Those interested in running Hadoop in a virtual environment now have some help from VMware. The company has released open-source software, called Serengeti, that lets administrators deploy Hadoop nodes in virtual containers, which can then be managed through VMware's vCenter, said Fausto Ibarra, senior director of product management for VMware. Using Serengeti eliminates the need to configure the network settings of each node by hand, he said.
Also updating its Hadoop package is DataStax, which pairs Hadoop with its Cassandra nonrelational database. DataStax Enterprise (DSE) 2.1 runs 20 percent faster than the previous version, the company claims. Such speed is important for systems that mix time-sensitive transactional work with analysis. The software also adds the ability to span a Hadoop cluster across multiple data centers.
New packages are also being introduced that tie Hadoop to other types of data-analysis platforms.
By the end of the year, data warehouse vendor Teradata will release a new query language, called SQL-H, for its Aster Database. SQL-H will allow users of its Aster MapReduce Appliance to query data stored in the Hadoop Distributed File System (HDFS) without having to work with MapReduce or HDFS directly. The software relies on metadata compiled by the open-source Apache HCatalog project.
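Teradata has not yet published the SQL-H syntax, so the following is purely a hypothetical sketch of what querying HCatalog-registered HDFS data from Java over JDBC might look like; the driver class, connection URL and load_from_hcatalog call are illustrative assumptions, not documented Aster interfaces.

    // Hypothetical sketch: standard JDBC calls, but the driver class,
    // URL and load_from_hcatalog query below are illustrative assumptions.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SqlHExample {
        public static void main(String[] args) throws Exception {
            Class.forName("com.asterdata.ncluster.Driver");  // hypothetical driver class
            try (Connection conn = DriverManager.getConnection(
                         "jdbc:ncluster://aster-host/beehive", "user", "password"); // hypothetical URL
                 Statement stmt = conn.createStatement();
                 // Hypothetical query: read rows from an HCatalog-registered table backed by HDFS
                 ResultSet rs = stmt.executeQuery(
                         "SELECT * FROM load_from_hcatalog(" +
                         " SERVER('hcatalog-host')" +
                         " DBNAME('default')" +
                         " TABLENAME('weblogs'))")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }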
Business intelligence software provider Pentaho has announced that its analysis software has been certified by Dell to run on the Dell Apache Hadoop Solution, a package of servers pre-installed with Hadoop and Dell's Crowbar management software. Users can explore Hadoop data through Pentaho's GUI (graphical user interface).
Joab Jackson covers enterprise software and general technology breaking news for The IDG News Service. Follow Joab on Twitter at @Joab_Jackson. Joab's e-mail address is Joab_Jackson@idg.com