Life with a large cloud: Lessons learned

Integration, security and service levels can be problematic

Many companies are just getting started with cloud computing, but others are already well entrenched. Indeed, some organizations in the latter camp have built massive private clouds that support significant portions of their operations.

Those giant megaclouds create unique management challenges for CIOs and business leaders. Among other things, administrators need to maintain service levels, ensure that systems hosted in the cloud are secure, and position the new offerings in a way that makes internal customers want to use them. And because so much about cloud computing is relatively new, organizations must learn to deal with hurdles like those while they're in the process of deploying these multi-petabyte IT architectures.

"To me, the biggest challenge in implementing private clouds is the massive culture and operating model shift from a 'do it for the users' model to a user self-service model," says Frank Gens, an analyst at research firm IDC in Framingham, Mass. "The entire IT service delivery model -- from design through deployment and operation, and on through support -- needs to be overhauled."

Here's a look at some of the hurdles associated with big clouds that organizations are dealing with as they implement and use service-based computing infrastructures.

Integration with legacy systems

Enterprises aren't moving to all-cloud environments overnight, so the integration of private clouds and existing IT systems is a key issue.

BAE Systems, an Arlington, Va.-based defense and security company, operates a multi-tenant private cloud on behalf of its government and military customers. The system encompasses multiple petabytes of storage -- though BAE declined to specify exactly how large its cloud is.

BAE also uses a smaller-scale private cloud internally to develop and test systems that it's building for customers, says Jordan Becker, a BAE vice president.


The cloud infrastructure that BAE is building will gradually replace data centers that the company and its customers currently operate. Integration and migration between the older and newer computing environments are therefore issues BAE has had to address.

"The private cloud deployment is relatively new and has not yet displaced the existing data centers," Becker says. However, the private cloud has helped "slow the growth of the legacy data centers," he explains. "As the legacy data center infrastructure approaches its natural capital refresh cycle, the infrastructure will be displaced incrementally with new cloud infrastructure. This process will take several years."

During this transition, "we need to elastically extend that legacy data center to enable the applications already running to scale across the cloud transparently," Becker says. "It should look to the user as though it's one virtual infrastructure" from both an applications and management standpoint, he explains.

To achieve that integration, BAE Systems has created a common global namespace -- a heterogeneous, enterprisewide abstraction of all file information -- for all of its image data. The data includes two-dimensional images, files with stereo sound and files that include full-motion video. There is also metadata that goes along with these images, Becker says.

"The global common namespace is unique to each particular customer group," he says. "One such customer group that shares a common namespace [is made up of] users of geospatial information across several defense and intelligence agencies."

Now that BAE employs this common namespace across its private cloud, legacy data centers and file archives originally developed for stand-alone applications, customers can seamlessly access and federate information with the peers they want to collaborate with, Becker says.
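The global namespace Becker describes can be thought of as a mapping layer that resolves one logical path space to data scattered across legacy and cloud back ends. As a rough, hypothetical sketch (none of these class names, paths or back-end labels come from BAE):

```python
# Hypothetical sketch of a global common namespace: a single logical
# path space that resolves to files held in legacy data centers or
# newer cloud storage. All names here are illustrative assumptions.

class GlobalNamespace:
    def __init__(self):
        # Maps a logical path prefix to the back end that holds the data.
        self._mounts = {}

    def mount(self, prefix, backend):
        self._mounts[prefix] = backend

    def resolve(self, logical_path):
        # Longest-prefix match, so users need not know where data lives.
        for prefix in sorted(self._mounts, key=len, reverse=True):
            if logical_path.startswith(prefix):
                return self._mounts[prefix], logical_path[len(prefix):]
        raise FileNotFoundError(logical_path)

ns = GlobalNamespace()
ns.mount("/imagery/archive/", "legacy-datacenter")   # older stand-alone archive
ns.mount("/imagery/", "cloud-object-store")          # new private cloud tier

backend, path = ns.resolve("/imagery/archive/2011/scene01.tif")
# Resolves to the legacy archive, though the user sees one namespace.
```

In this kind of design, moving a collection from a legacy archive into the cloud is just a change to the mount table; users keep the same logical paths, which is what lets the two environments look like "one virtual infrastructure."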

Security and service continuity

The University of Southern California operates a 4-petabyte private cloud that supports the USC Digital Repository. The IT professionals responsible for it have found that cloud security is one of their biggest concerns.

The Digital Repository provides clients with digital archives of content such as high-definition videos and high-resolution photos. Services include converting physical or electronic collections to standard digital formats for preservation and online access. The Repository also features high-bandwidth file management capabilities for accessing, managing and manipulating the large digital collections.
