Explosive growth expected for Hadoop, MapReduce-related revenues: IDC
The analyst firm predicts banner days ahead for the 'big data' programming frameworks
IDG News Service - The market for software related to the Hadoop and MapReduce programming frameworks for large-scale data analysis will jump from $77 million in 2011 to $812.8 million in 2016, a compound annual growth rate of 60.2%, according to a new report released Monday by analyst firm IDC.
Hadoop is an open-source implementation of the MapReduce framework. It is hosted at the Apache Software Foundation along with a number of supporting software projects, including the Hadoop Distributed File System (HDFS) and the Pig programming language.
MapReduce and Hadoop are based on the principle of splitting up large amounts of data and then processing the chunks in parallel across large numbers of nodes. The pair is closely associated with the industry buzzword "big data," which refers to the ever-larger volumes of information, particularly in unstructured form, being generated by websites, social media, sensors and other sources.
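The split-then-merge principle the article describes can be sketched in a few lines of Python. This is an illustrative toy, not Hadoop itself: a word count where each chunk of input is "mapped" to a partial count and the partials are then "reduced" into one result. On a real cluster, each chunk would be processed on a separate node and the data would live in HDFS.

```python
from collections import Counter
from functools import reduce

def map_chunk(chunk):
    """Map step: emit a partial word count for one chunk of lines."""
    counts = Counter()
    for line in chunk:
        counts.update(line.split())
    return counts

def reduce_counts(a, b):
    """Reduce step: merge two partial counts into one."""
    a.update(b)
    return a

def word_count(lines, n_chunks=4):
    # Split the input into chunks; on a real cluster each chunk
    # would be mapped in parallel on a different node.
    chunks = [lines[i::n_chunks] for i in range(n_chunks)]
    partials = [map_chunk(c) for c in chunks]
    return reduce(reduce_counts, partials, Counter())

if __name__ == "__main__":
    print(word_count(["big data big", "data analysis"]))
```

The function names and chunking scheme here are invented for the sketch; the point is only that the map work is independent per chunk, which is what lets frameworks like Hadoop scale it across many machines.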
Overall, Hadoop has enjoyed a steady stream of interest in recent years from commercial analytics and database vendors, which have begun offering commercial products and services for it.
While "fantastic and largely unsupportable claims have been made" regarding Hadoop and MapReduce's use cases and benefits, "there can be no doubt that it does provide a relatively low-cost means of deriving considerable value from very large collections of unorganized data," IDC analysts Carl Olofson and Dan Vesset wrote in the report.
Therefore, the conditions are right for significant growth in the Hadoop-MapReduce "ecosystem," according to IDC.
This year, "Leading adopters in the mainstream IT world will move from 'proof of concept' to real value," the report states.
However, lack of qualified talent will limit the technology's rise during the next two to three years, it adds.
The coming years will also see a "battle between open source purists," who believe that the core of a Hadoop deployment must be based purely on the Apache project code, and those who favor commercial distributions, according to IDC. However, most IT organizations will use a mix of commercial and open-source components in their Hadoop environments, the report adds.
Still, "competition between open source vendors and their closed source counterparts may force lower license fees from the latter group, resulting in somewhat slower software revenue growth than would be the case if open source projects did not represent so large a component of this market space."
IDC is a subsidiary of IDG News Service's parent company, International Data Group.
Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris's e-mail address is Chris_Kanaracus@idg.com