Hadoop gets real

Robust data processing and storage capabilities make Hadoop both wildly popular and wildly complex. Here's how four IT leaders managed to bring Hadoop systems from the sandbox into production.


Simplification Cuts Out the Middleman

Even the most unconventional proofs of concept are becoming easier to perform, thanks to the simplification of the Hadoop ecosystem. Children's Healthcare of Atlanta, a pediatric hospital, began testing Cloudera's Hadoop distribution in the summer of 2013. Unlike many organizations that turn to Hadoop, Children's Healthcare has a modest data store of only 2.5TB, though it is growing at a rate of 75GB per week. In a partnership with the Georgia Institute of Technology, the hospital wanted to examine how light, sound and alarm data from bedside monitoring equipment affects patients' physiology. Other projects include analyzing how medical procedures affect patients' vital signs and outcomes.
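Those two figures put the hospital's "modest" store in perspective: at 75GB per week, the data more than doubles within a year. A minimal back-of-the-envelope sketch (using the article's numbers and decimal units, 1TB = 1,000GB) makes the growth concrete:

```python
def weeks_to_double(current_tb: float, weekly_growth_gb: float) -> float:
    """Weeks until the store doubles at a constant weekly growth rate."""
    return (current_tb * 1000) / weekly_growth_gb

def annual_growth_tb(weekly_growth_gb: float) -> float:
    """Data added over a 52-week year, in TB."""
    return weekly_growth_gb * 52 / 1000

# Figures quoted in the article: 2.5TB store, growing 75GB/week.
print(round(weeks_to_double(2.5, 75), 1))   # ~33.3 weeks until the store hits 5TB
print(round(annual_growth_tb(75), 2))       # 3.9TB of new data per year
```

In other words, the store was on pace to double in roughly eight months, which helps explain why even a small shop saw Hadoop as worth a weekend experiment.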

According to Tod Davis, manager of business intelligence and data warehousing at Children's Healthcare, performing a proof of concept on Hadoop was relatively pain-free. "We spent $600 on six 1TB drives, we had some workstations that were about to be thrown away because of a hardware refresh, and we had a weekend," he says. The cluster is nicknamed Frankendoop "because it was built from scavenged parts of other desktop workstations bought with a personal credit card," says Davis, adding that, in just four weeks, "we had an operational tool that proved our concept would work."

Migrating from a sandbox environment to production was just as simple. Children's Healthcare purchased eight Hewlett-Packard servers and licenses from Cloudera, tested the system for uptime, and validated its performance for several months before releasing it to production. Davis accomplished all of this without the help of high-priced data scientists. Instead, he says he "committed every minute of every day to learning this new technology and all the different pieces of the Hadoop ecosystem."

From Simplifying Hadoop to Organizing Humans

App galleries, new software tools and fresh approaches to testing Hadoop are key to its simplification. But "the biggest adoption challenge we see is organizational," says Ron Bodkin, founder of integration services and big data consultancy Think Big Analytics. "It's about getting people together to actually apply the right technology to solve a business problem."

Creating the position of a chief data officer, says Bodkin, places a single individual in charge of managing Hadoop rather than leaving it to a group of programmers with competing interests. A center of excellence team can also bring about important organizational changes by uniting business analysts with IT professionals, thereby encouraging greater collaboration and enterprise alignment.

Not all companies would be willing to revamp their organizational structures to accommodate Hadoop. Yet the more security capabilities, business intelligence tools and management layers vendors such as Hortonworks, Cloudera and MapR bake into their Hadoop distributions, the closer enterprises come to embracing Hadoop without worrying about security or the challenge of finding high-priced talent.

"Right now, many organizations really need subject-matter experts in Hadoop," says Smith. "But as companies build applications that make it easier to use Hadoop, it'll drive enterprise adoption. That's the model of the future."

Waxer is a Toronto-based freelance journalist. She has written articles for various publications and news sites, including The Economist, MIT Technology Review and CNNMoney.com.

Copyright © 2014 IDG Communications, Inc.
