EMC adds unstructured big-data analytics to Greenplum platform
Offers a 1,000-plus-node test bed for Hadoop developers
Computerworld - EMC's Greenplum subsidiary today announced a new capability for its Data Computing Appliance (DCA) that allows users to mix and match structured and unstructured data analytics platforms.
EMC also announced its Greenplum Analytics Workbench, a 1,000-plus-node test bed for software integration tests of Apache Hadoop software.
The test bed provides the Hadoop open-source community with the testing resources to quickly identify bugs, stabilize new releases and optimize hardware configurations, in an effort to accelerate Hadoop innovation. All testing and results will be given back to the Apache Software Foundation and the open-source community, and EMC's testing will be planned in coordination with the Apache Hadoop project. Hadoop is an open-source software platform for analyzing large quantities of data; it was created at Yahoo, based on techniques described in Google's MapReduce and Google File System research papers.
On its Greenplum appliance product front, EMC introduced the Modular Data Computing Appliance, which allows users to combine a massively parallel processing relational database with enterprise-class Apache Hadoop in a single, unified appliance to achieve structured and unstructured data processing.
Greenplum introduced the DCA in October 2010. An updated version of the DCA that included a Hadoop appliance was released this past May.
The Greenplum HD (Hadoop) DCA is built on top of Intel x86 servers and combines a structured database built by Greenplum, which EMC acquired last year, with the Apache open-source version of Hadoop. The older version of the appliance is based on Sun Fire x64 servers.
According to Scott Yara, co-founder of Greenplum and vice president of products for EMC's Data Computing Division, administrators can read and write files in parallel between Greenplum and HDFS (the Hadoop Distributed File System), enabling rapid data sharing. Cross-platform analysis can be performed using Greenplum SQL and advanced analytic functions that access data on HDFS.
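The cross-platform access Yara describes works along the lines of a database external table that points at files on HDFS, so SQL queries can read Hadoop data in place. As a rough illustration only (the table name, columns, HDFS path and port here are hypothetical, and Greenplum's exact external-table syntax varies by version), a minimal Python sketch that builds such a DDL statement:

```python
def hdfs_external_table_ddl(table, columns, hdfs_url, fmt="TEXT"):
    """Build a CREATE EXTERNAL TABLE statement of the kind a
    Greenplum-style database uses to query HDFS files in place.
    Illustrative only; real syntax differs by product version."""
    cols = ", ".join(f"{name} {ctype}" for name, ctype in columns)
    return (
        f"CREATE EXTERNAL TABLE {table} ({cols}) "
        f"LOCATION ('{hdfs_url}') FORMAT '{fmt}'"
    )

# Hypothetical clickstream log stored on HDFS, exposed to SQL:
ddl = hdfs_external_table_ddl(
    "clicks_ext",
    [("user_id", "bigint"), ("url", "text")],
    "gphdfs://namenode:8020/data/clicks",
)
print(ddl)
```

Once such a table is defined, ordinary SQL joins can combine it with structured tables in the relational database, which is the "cross-platform analysis" the appliance is aiming at.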
The new Modular DCA adds high-performance computing modules in the form of SAS Institute's In-Memory Analytics software, allowing it to serve up both structured data, such as databases, and unstructured file data, according to Yara.
"The main change is that it can perform parallel processing using server memory through the use of business analytics software [from SAS]," Yara said. "We wanted to offer a Lego-building-block-type architecture."
Through the SAS software, structured and unstructured data can reside across multiple x86 hosts, allowing users to perform computations in memory on each server node in a clustered configuration.
"The power of the appliance is that it can perform all these complex problems in parallel," Yara said.
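The per-node, in-memory parallelism Yara describes follows a familiar scatter-gather pattern: each node computes over its local slice of the data in memory, and the partial results are combined. A generic Python sketch of that pattern (not EMC or SAS code; a single-machine stand-in using worker processes for cluster nodes):

```python
from multiprocessing import Pool


def partial_sum(chunk):
    # Each worker computes over its own in-memory slice of the data,
    # standing in for one node of the cluster.
    return sum(chunk)


def parallel_sum(data, workers=4):
    # Scatter: split the data into one chunk per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    # Compute partials in parallel, then gather and combine them.
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))


if __name__ == "__main__":
    print(parallel_sum(list(range(1_000_000))))  # 499999500000
```

The appliance's claim is essentially this pattern at rack scale: because each node holds its slice in memory, the combine step is bounded by computation rather than disk I/O.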
The new Modular DCA is undergoing product trials and is expected to be available by the end of this year, Yara said.
Lucas Mearian covers storage, disaster recovery and business continuity, financial services infrastructure and healthcare IT for Computerworld. Follow Lucas on Twitter at @lucasmearian or subscribe to Lucas's RSS feed. His e-mail address is email@example.com.