EMC adds unstructured big-data analytics to Greenplum platform
Offers a 1,000-plus-node test bed for Hadoop developers
Computerworld - EMC's Greenplum subsidiary today announced a new capability in its Apache Hadoop Data Computing Appliance (DCA) that allows users to mix and match unstructured and structured data analytics platforms.
EMC also announced its Greenplum Analytics Workbench, a 1,000-plus-node test bed for software integration tests of Apache Hadoop software.
The test bed provides the Hadoop open-source community with the testing resources to quickly identify bugs, stabilize new releases and optimize hardware configurations in an effort to speed up the innovation of Hadoop. All testing and results will be given back to the Apache Software Foundation and the open-source community. EMC's testing will be planned in coordination with the Apache Hadoop project. Hadoop is an open-source software platform for analyzing large quantities of data; it is based on MapReduce and distributed file system technologies originally described in research papers published by Google.
On the Greenplum appliance product front, EMC introduced the Modular Data Computing Appliance, which combines a massively parallel processing relational database with enterprise-class Apache Hadoop in a single, unified appliance, allowing users to process both structured and unstructured data.
Greenplum introduced the DCA in October 2010. An updated version of the DCA that included a Hadoop appliance was released this past May.
The Greenplum HD (Hadoop) DCA is built on top of Intel x86 servers and uses both a structured database built by Greenplum, which EMC acquired last year, and the Apache open-source version of Hadoop. The older version of the appliance is based on Sun Fire x64-based servers.
According to Scott Yara, co-founder of Greenplum and vice president of products for EMC's Data Computing Division, administrators can read and write files in parallel between Greenplum and HDFS (the Hadoop Distributed File System), enabling rapid data sharing. Cross-platform analysis can be performed using Greenplum SQL and advanced analytic functions that access data on HDFS.
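In practice, Greenplum exposes HDFS-resident files to SQL through external tables. The sketch below is illustrative only: the table names, column layout and HDFS path are hypothetical, and the `gphdfs` protocol shown here assumes the appliance's Hadoop connectivity is configured.

```sql
-- Hypothetical external table mapping raw files on HDFS into Greenplum.
-- Host, port, path and schema are illustrative assumptions.
CREATE EXTERNAL TABLE clickstream_ext (
    user_id  bigint,
    url      text,
    ts       timestamp
)
LOCATION ('gphdfs://namenode:8020/data/clickstream/*.txt')
FORMAT 'TEXT' (DELIMITER '|');

-- Cross-platform analysis: join unstructured HDFS data with a
-- structured warehouse table in a single SQL query.
SELECT c.url, count(*) AS hits
FROM clickstream_ext c
JOIN customers cu ON cu.id = c.user_id
GROUP BY c.url;
```

Because the external table is read in parallel by the database segments, the HDFS data never has to be bulk-loaded before it can be queried alongside structured tables.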
The new Modular DCA adds high-performance computing modules in the form of SAS Institute's In-Memory Analytics software, allowing it to serve up both structured data, such as databases, and unstructured file data, according to Yara.
"The main change is that it can perform parallel processing using server memory through the use of business analytics software [from SAS]," Yara said. "We wanted to offer a Lego-building-block-type architecture."
Through the use of the SAS software, structured and unstructured data can be distributed across multiple x86 hosts, allowing users to perform in-memory computations on each server node in a clustered configuration.
"The power of the appliance is that it can perform all these complex problems in parallel," Yara said.
The new Modular DCA is undergoing product trials and is expected to be available by the end of this year, Yara said.
Lucas Mearian covers storage, disaster recovery and business continuity, financial services infrastructure and healthcare IT for Computerworld. Follow Lucas on Twitter at @lucasmearian or subscribe to Lucas's RSS feed. His e-mail address is firstname.lastname@example.org.