The news that AT&T Inc. has joined the rapidly growing ranks of cloud computing providers reinforces the argument that the latest IT outsourcing model is well on its way to becoming a classic disruptive technology.
By enabling data center operators to "publish" computing resources -- such as servers, storage and network connectivity -- cloud computing provides a pay-by-consumption scalable service that's usually free of long-term contracts and is typically application- and operating system-independent. The approach also eliminates the need to install any on-site hardware or software.
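The pay-by-consumption model can be sketched as a toy metering calculation. The rates and usage figures below are hypothetical, not any provider's actual pricing:

```python
# Toy usage-based billing: charge only for resources actually consumed.
# Rates and usage figures are illustrative, not any vendor's real pricing.
RATES = {
    "instance_hours": 0.10,    # dollars per server-hour
    "storage_gb_month": 0.15,  # dollars per GB stored per month
    "transfer_gb": 0.18,       # dollars per GB transferred
}

def monthly_bill(usage):
    """Sum cost across metered dimensions; no base fee, no long-term contract."""
    return round(sum(RATES[k] * v for k, v in usage.items()), 2)

usage = {"instance_hours": 720, "storage_gb_month": 50, "transfer_gb": 100}
print(monthly_bill(usage))  # 72.00 + 7.50 + 18.00 = 97.5
```

The point of the model: a customer who shuts servers down or deletes data simply stops accruing charges on those line items.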
Currently dominated by Amazon.com Inc. and several small start-ups, cloud computing is increasingly attracting the interest of industry giants, including Google, IBM and now AT&T. "Everyone and their dog will be in cloud computing next year," predicts Rebecca Wettemann, an analyst at Nucleus Research, a technology research firm.
Yet James Staten, an analyst at Forrester Research Inc., warns that prospective adopters need to tread carefully in a market that he describes as both immature and evolving. Staten notes that service offerings and service levels vary widely among cloud vendors. "Shop around," he advises. "We're already seeing big differences in cloud offerings." To help cut through the confusion, here's a rundown of some major cloud providers -- both current and planned -- all offering resources that go beyond basic services such as software-as-a-service applications and Web hosting:
3Tera Inc.: Appliance-driven virtual servers
3Tera's AppLogic is a grid engine that has evolved over time into a full-fledged cloud computing environment. The company said its offering is designed to enable data centers to replace expensive and hard-to-integrate IT infrastructure -- such as firewalls, load balancers, servers and storage-area networks -- with virtual appliances. Each appliance runs in its own virtual environment.
AppLogic combines servers into a scalable grid that's managed as a single system via a browser or secure shell. According to 3Tera, data centers can add or remove servers on the fly, monitor hardware, manage user credentials, reboot servers, install software, build virtual appliances, back up the system, repair damaged storage volumes, inspect logs and perform every other management task from a single point of control, all while the system is running.
Amazon.com: As-you-need-them basic IT resources
Amazon was an early cloud computing proponent, and the company now offers one of the market's longest menus of services. Amazon's core cloud offering, the Elastic Compute Cloud (EC2), is a virtualized infrastructure designed to deliver scalable compute, storage and communication facilities.
Amazon's cloud computing arsenal also includes the Simple Storage Service (S3), a persistent storage system; SimpleDB, a remotely accessible database; and the Simple Queue Service (SQS), a message queue that also serves as the glue tying together distributed applications built on the EC2, S3 and SimpleDB combination.
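The role a hosted queue plays in decoupling distributed components can be shown with a minimal in-process stand-in, using Python's standard `queue` module rather than any vendor API. A real queuing service does the same job between machines, over the network:

```python
import queue
import threading

# Toy stand-in for a hosted message queue: a producer enqueues work items,
# and a decoupled worker consumes them. Neither side calls the other directly.
q = queue.Queue()
results = []

def worker():
    while True:
        msg = q.get()
        if msg is None:                  # sentinel: no more work
            break
        results.append(msg.upper())      # stand-in for real processing

t = threading.Thread(target=worker)
t.start()
for item in ["resize image", "send email"]:
    q.put(item)                          # producer only knows the queue
q.put(None)
t.join()
print(results)  # ['RESIZE IMAGE', 'SEND EMAIL']
```

Because the producer and worker share only the queue, either side can be scaled, replaced or restarted independently, which is the property that makes queues useful for tying cloud components together.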
AT&T: Scalable hosting in a managed network
AT&T Synaptic Hosting aims to give data centers the ability to manage applications, server compute resources and data storage elastically, scaling up or down as needed. The hosted platform provides dynamic security and storage capabilities, as well as a Web portal for managing capacity, conducting maintenance, and monitoring network service and performance.
AT&T has long offered hosting services, but not ones that could scale up or down on demand. AT&T's resources and services run within its own network, rather than across data centers linked via the public Internet, which the company claims gives it more control over service levels.
Google Inc.: Resources for small businesses and developers
Google already offers cloud-based services such as e-mail and storage for consumers, as well as the AppEngine development and provisioning platform for individual developers. The company's logical next step, given its vast infrastructure resources, would be a move into the enterprise cloud market.
"There's not that much difference between the enterprise cloud and the consumer cloud," Google CEO Eric Schmidt said last May during an appearance in Los Angeles with IBM chief Sam Palmisano, as the companies announced a joint cloud computing initiative. Over the next year or so, Google and IBM plan to roll out a worldwide network of servers for a cloud computing infrastructure. The IBM-Google cloud runs on Linux-based machines using Xen virtualization and Apache Hadoop, an open-source implementation of Google's MapReduce framework and distributed file system. Provisioning is automatic, courtesy of IBM's Tivoli Provisioning Manager.
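The MapReduce model that Hadoop implements can be sketched in a few lines: a map phase turns each input record into key/value pairs, and a reduce phase groups and aggregates them by key. This single-process word-count sketch only illustrates the programming model; Hadoop's value is running the same two phases across many machines:

```python
from collections import defaultdict
from itertools import chain

# Minimal single-process sketch of the MapReduce model Hadoop implements.
def map_phase(line):
    """Map: emit a (word, 1) pair for every word in a record."""
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    """Reduce: group pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["the cloud", "the grid the cloud"]
pairs = chain.from_iterable(map_phase(l) for l in lines)
print(reduce_phase(pairs))  # {'the': 3, 'cloud': 2, 'grid': 1}
```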
IBM: A platform for your 'internal' cloud
Aside from its Google venture, IBM is focusing its cloud strategy on "Blue Cloud," a series of offerings that will enable computing across a distributed, globally accessible fabric of resources, rather than on local machines or remote server farms. Built on IBM's massive-scale computing initiatives, Blue Cloud aims to give data centers the ability to establish their own cloud computing architecture to handle the enormous data-processing power required for video, social networking and other Web 2.0 technologies.
Initially, the Blue Cloud technology must be deployed internally at each organization, essentially as the foundation for an "internal" cloud. The Blue Cloud platform, which runs Tivoli service management software on IBM BladeCenter servers with Power and x86 processors, dynamically provisions and allocates resources as an application's workload fluctuates. Based on Hadoop, Blue Cloud is being billed as a more distributed computing architecture than what's typically found in enterprise data centers. Over time, IBM expects to offer Blue Cloud resources on demand, in the provisioned style of Amazon.com and AT&T.
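Provisioning "as workloads fluctuate" generally comes down to a scaling policy. This toy policy, with made-up capacity and bounds rather than anything IBM has published, keeps just enough servers running to absorb the current request rate:

```python
import math

# Toy autoscaling policy: hold per-server load under a capacity target.
# All figures are illustrative assumptions, not any vendor's real numbers.
CAPACITY_PER_SERVER = 100  # requests/sec one server can absorb (assumed)
MIN_SERVERS, MAX_SERVERS = 2, 20

def servers_needed(load):
    """Return the server count for a given load, clamped to pool bounds."""
    wanted = math.ceil(load / CAPACITY_PER_SERVER)
    return max(MIN_SERVERS, min(MAX_SERVERS, wanted))

for load in (50, 450, 5000):
    print(load, "->", servers_needed(load))
# 50 -> 2 (floor), 450 -> 5, 5000 -> 20 (ceiling)
```

The floor keeps the application available during lulls; the ceiling caps spend during spikes. Real provisioning systems add hysteresis so the fleet doesn't thrash around the threshold.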
IBM also provides hosting services for SaaS providers, including SAP AG and SuccessFactors.
Sun Microsystems Inc.: An on-demand grid, and perhaps more
With its "the network is the computer" mantra, Sun provided much of the inspiration for the cloud computing movement. And its Sun Grid compute utility was one of the first on-demand offerings, providing access to compute and storage resources optimized for parallel-processing applications.
The company also has a research venture dubbed "Project Caroline" meant to provide a configurable pool of virtualized compute, storage and networking resources to small and midsize SaaS providers, so they don't need to develop their own infrastructure. There have been recent reports that Sun is planning to turn Project Caroline into a full-blown business, but there's been no official word from the company yet.
Terremark Worldwide Inc.: Resource pool for on-demand servers
The Terremark Enterprise Cloud is designed to give data centers an Internet-optimized computing infrastructure. Enterprise Cloud clients buy a dedicated pool of processing, memory, storage and networking resources, from which they can deploy servers on demand. A Web portal allows servers to be dynamically provisioned from the pre-allocated pool of dedicated computing resources. Terremark promises that its cloud servers behave exactly like their physical counterparts, allowing applications to run without modification.
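The dedicated-pool model differs from per-instance billing: servers are carved out of pre-purchased capacity, and deployment fails once the pool is exhausted. A toy sketch, with illustrative units rather than Terremark's actual resource model:

```python
# Toy dedicated resource pool: servers draw from pre-purchased capacity
# rather than being billed per instance. Units are illustrative only.
class ResourcePool:
    def __init__(self, cpus, ram_gb):
        self.free = {"cpus": cpus, "ram_gb": ram_gb}
        self.servers = []

    def deploy(self, name, cpus, ram_gb):
        """Carve a server out of the pool; refuse if capacity is exhausted."""
        if cpus > self.free["cpus"] or ram_gb > self.free["ram_gb"]:
            return False  # pool exhausted; client must buy more capacity
        self.free["cpus"] -= cpus
        self.free["ram_gb"] -= ram_gb
        self.servers.append(name)
        return True

pool = ResourcePool(cpus=8, ram_gb=32)
print(pool.deploy("web1", 2, 8))   # True
print(pool.deploy("db1", 4, 16))   # True
print(pool.deploy("big1", 4, 16))  # False: only 2 CPUs / 8 GB remain
```

The appeal for buyers is predictable cost and guaranteed headroom; the trade-off is paying for the whole pool whether or not it is fully used.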
XCalibre Communications Ltd.: Self-provisioned virtual servers
Described by some observers as Europe's answer to Amazon's EC2, Scotland-based XCalibre's FlexiScale provides self-provisioning of virtual dedicated servers via a control panel or API. Persistent storage is based on a fully virtualized high-end SAN/NAS back end.
This story, "Who provides what in the cloud," was originally published by InfoWorld.