Cloud has proven to be a sticky marketing term, even (especially?) among non-technical consumers. In 2013, the average consumer began to understand just what the cloud was all about: an immense repository of data and services that are broadly accessible.
For business users, the cloud also became commonplace. Although most executives still don’t have a deep understanding of the choices and tradeoffs involved with a shift to cloud computing, there is a growing consensus that “the cloud” promises a quicker and cheaper way of purchasing IT than the traditional, build-it-yourself model. This in turn has led to internal debates between application developers and infrastructure teams on the best approach for implementing “the cloud” within IT organizations.
(And of course, IT architects know that when they hear management say, “We want to move to the cloud,” they will be the ones who have to figure out how to do it!)
The SPI cloud model is here to stay
In 2013, the SaaS/PaaS/IaaS (SPI) cloud model moved from concept to reality for the majority of organizations. According to North Bridge and GigaOM’s 2013 Future of Cloud Computing survey, 75 percent of those surveyed reported the use of at least one cloud instance in their environment.
Some cloud implementations remain relatively easy, such as using a SaaS provider to bring a new, stand-alone application online, assuming minimal integration is required. However, most cloud implementations are still difficult: for example, reconfiguring existing on-premise servers, networking, and storage equipment into a pool of shared IT resources (aka “building a private cloud”) remains an elusive goal for many organizations. And some cloud implementations appear darn near impossible, such as building a hybrid cloud that allows easy data migration between on-premise private clouds and off-premise public clouds.
Yet, despite the growing pains, the increased availability of cloud services in 2013 became a much-needed relief valve for IT organizations as the pressure to “do more with less” continued with no end in sight. In 2014, it is safe to assume that enterprises will spend a great deal of time determining which applications are the next candidates for a transition to the cloud.
Three criteria for assessing cloud-based workloads
While 2013 was the year of mainstream cloud adoption, 2014 looks to be the year of cloud immersion, with many companies following the lead of the U.S. Government and adopting a “cloud first” policy. From a data-storage perspective, each transition to cloud-based workloads means an organization must decide whether to use on-premise or off-premise cloud storage. This decision requires an assessment in three areas: performance, control, and cost.
Performance: Despite advances in carrier speeds, data transmission to and from off-premise public cloud providers is still inadequate for most transaction-based applications. Cloud throughput can be spotty and subject to bursts of activity from other tenants sharing your resources. Minimum performance levels can be enforced contractually but are difficult to verify. The penalty paid by the provider also fails to cover the true cost: unhappy users suffering from poor response times. Your best bet may be to make sure off-premise cloud applications (and users) can tolerate inconsistent performance.
Control: When you move any workload off-premise, in essence, you are giving up control of your storage infrastructure for that workload. This means that architectural decisions, performance tuning, maintenance cycles, and technology refreshes will be someone else’s domain. Off-premise storage also means that unauthorized access to your data carries a higher risk. Although cloud security breaches have been rare, this is still a consideration for applications with strict security or compliance requirements.
Cost: Fundamentally, nearly all decisions to move to the cloud involve projected cost savings. The choice between off-premise and on-premise storage will come down to how well a cloud service provider can emulate your IT environment, protect your data from prying eyes, and improve your budget’s bottom line.
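One lightweight way to make this three-way assessment concrete is a simple weighted scorecard. The sketch below is illustrative only: the 1–5 rating scale, the weights, and the example ratings are assumptions for demonstration, not a prescribed methodology.

```python
from dataclasses import dataclass

@dataclass
class WorkloadScores:
    """Hypothetical 1-5 ratings for a candidate workload.

    Higher means a better fit for off-premise cloud storage.
    """
    performance_tolerance: int  # can users tolerate inconsistent latency?
    control_tolerance: int      # can you cede tuning, maintenance, and security control?
    cost_advantage: int         # projected savings vs. staying on-premise

def off_premise_fit(scores: WorkloadScores,
                    weights: tuple[float, float, float] = (0.4, 0.3, 0.3)) -> float:
    """Weighted average across the three criteria; weights are illustrative."""
    w_perf, w_ctrl, w_cost = weights
    return (w_perf * scores.performance_tolerance
            + w_ctrl * scores.control_tolerance
            + w_cost * scores.cost_advantage)

# Example: a reporting app that tolerates variable latency (4), has few
# compliance constraints (4), and shows strong projected savings (5).
score = off_premise_fit(WorkloadScores(4, 4, 5))
print(f"{score:.1f}")  # on the same 1-5 scale; higher suggests off-premise candidacy
```

In practice the weights would be tuned per organization; a compliance-heavy shop might weight control tolerance far above cost.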