IT must prepare for Hadoop security issues
Aggregating data from multiple sources can cause access control and data entitlement problems, analysts say
Computerworld - NEW YORK -- Corporate IT executives need to pay attention to numerous potential security issues before using Hadoop to aggregate data from multiple, disparate sources, analysts and IT executives said at the Hadoop World conference here this week.
Open source Hadoop technology lets companies collect, aggregate, share and analyze huge volumes of structured and unstructured data from enterprise data stores as well as from weblogs, online transactions and social media interactions.
A growing number of firms are using Hadoop and related technologies such as Hive, Pig and HBase to analyze data in ways that cannot easily or affordably be done using traditional relational database technologies.
JPMorgan Chase, for instance, is using Hadoop to improve fraud detection, IT risk management and self-service applications. The financial services firm is also using the technology to enable a far more comprehensive view of its customers than was possible previously, executives said.
Meanwhile, eBay is using Hadoop technology and the HBase open source database to build a new search engine for its auction site.
Analysts said that IT operations using Hadoop technology for such applications must be aware of potential security problems.
Using the technology to aggregate and store data from multiple sources can create a whole slew of problems related to access control and management as well as data entitlement and ownership, said Larry Feinsmith, managing director of JPMorgan Chase's IT operation.
Hadoop environments can include data of mixed classifications and security sensitivities, said Richard Clayton, a software engineer with Berico Technologies, an IT services contractor for federal agencies.
The challenge for enterprises is to ensure that they implement appropriate security controls for enforcing role-based access to the data, he added.
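The role-based controls Clayton describes can be sketched in a few lines of Python. The role names and classification labels below are hypothetical, chosen only to illustrate the idea of filtering records by a role's clearance set; they are not drawn from any particular Hadoop deployment.

```python
# Illustrative sketch of role-based record filtering. Role names and
# classification labels here are hypothetical, not from a real deployment.
ROLE_CLEARANCE = {
    "analyst": {"public"},
    "fraud_investigator": {"public", "restricted"},
}

def visible_records(role, records):
    """Return only the records the given role is cleared to see."""
    allowed = ROLE_CLEARANCE.get(role, set())
    return [r for r in records if r.get("classification") in allowed]

records = [
    {"id": 1, "classification": "public"},
    {"id": 2, "classification": "restricted"},
]
print([r["id"] for r in visible_records("analyst", records)])             # [1]
print([r["id"] for r in visible_records("fraud_investigator", records)])  # [1, 2]
```

In a real cluster this enforcement would live in the access layer (for example, in Hive views or an authorization service) rather than in application code, but the filtering logic is the same.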
Aggregating data into one environment also increases the risk of data theft and accidental disclosure, Clayton said.
And importantly, he noted, applications analyzing merged data in a Hadoop environment can result in the creation of new datasets that may also need to be protected.
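One simple rule for handling the new datasets Clayton mentions is to classify a derived result at least as sensitively as its most sensitive input. A minimal sketch of that rule, using a hypothetical ordering of sensitivity labels:

```python
# Hypothetical ordering of sensitivity labels, least to most sensitive.
LEVELS = ["public", "internal", "restricted", "secret"]

def derived_classification(*input_labels):
    """A derived dataset inherits the highest classification among its inputs."""
    return max(input_labels, key=LEVELS.index)

print(derived_classification("public", "restricted"))  # restricted
```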
Many government agencies are putting Hadoop-stored data into separate 'enclaves' to ensure that it can be accessed only by those with the clearance to view it, Clayton said.
Several agencies won't put sensitive data into Hadoop databases at all because of data access concerns, Clayton said. Others are simply building firewalls around their Hadoop environments, he added.
For many Hadoop users, the most effective security approach is to encrypt data at the individual record level, both while it is in transit and while it is stored in a Hadoop environment, Clayton said.
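Record-level encryption of the kind Clayton recommends can be sketched with an HMAC-SHA256 keystream used in a counter-mode construction, so each record carries its own nonce and can be decrypted independently. This is an illustration of the idea only; a production system would use a vetted authenticated cipher such as AES-GCM from a maintained cryptography library.

```python
import hashlib
import hmac
import os

def _keystream(key, nonce, length):
    """Derive a keystream from HMAC-SHA256 in counter mode (illustrative only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"),
                        hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_record(key, plaintext):
    """Encrypt one record; a fresh random nonce is prepended to the ciphertext."""
    nonce = os.urandom(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt_record(key, blob):
    """Split off the nonce and reverse the keystream XOR."""
    nonce, ciphertext = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

key = os.urandom(32)
blob = encrypt_record(key, b"acct=1234; amount=250.00")
assert decrypt_record(key, blob) == b"acct=1234; amount=250.00"
```

Because every record is encrypted under its own nonce, individual records can be decrypted for authorized queries without exposing the rest of the dataset.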
In general, Clayton advises enterprises to be cautious in their use of Hadoop technologies. He notes that built-in Hadoop Distributed File System (HDFS) security features such as Access Control Lists and Kerberos, used alone, are not adequate for enterprise needs.
"Security and access control are part of the reason why Hadoop is not ready to replace relational databases" in the enterprise, said David Menninger, an analyst with Ventana Research.
Sid Probstein, the chief technology officer at Attivio, a vendor of unified information access technology for big data environments, added that "Hadoop is a great technology but there is a whole series of enterprise readiness problems."
Jaikumar Vijayan covers data security and privacy issues, financial services security and e-voting for Computerworld. Follow Jaikumar on Twitter at @jaivijayan, or subscribe to Jaikumar's RSS feed . His e-mail address is email@example.com.