In my last post, we discussed some of the concerns companies and government agencies should look into before moving toward cloud computing. Now, let's dig a little deeper so we can determine whether cloud computing makes sense for your organization.
First, you'll need to do an internal assessment. Gather the necessary information by answering the following questions:
1) Are your applications custom and therefore difficult to learn and support, or are they off-the-shelf applications?
2) Will you eliminate your data centers in favor of outsourcing everything, or will you keep some internal services?
3) If you decide to outsource applications, will there be enough network bandwidth to ensure good performance? (This may not be an issue in many cases, but it depends on the application.)
4) Are your existing security, data protection, availability, and recoverability better or worse than the cloud providers'?
5) Can you provide the same services internally with the same service levels at lower costs?
That last question is really the first one you should consider. In fact, you should do an internal study to determine your current operating costs. Then speak with your vendors and put out a request for information (RFI) to find out whether implementing new technology within your current data center could lower your existing costs while letting you keep full control of all your data.
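To make that study concrete, here's a minimal Python sketch of how you might roll annual line items into a fully burdened cost per GB. Every line item and figure below is a hypothetical placeholder, not a benchmark:

```python
# Hypothetical internal cost study: roll annual line items into a
# fully burdened cost per GB. All figures are illustrative placeholders.

annual_costs = {
    "media_and_hardware": 120_000,    # tape media, library, disk shelves
    "maintenance_contracts": 45_000,
    "operations_staff": 180_000,      # burdened labor for the service
    "offsite_storage_and_transport": 30_000,
    "facilities_power_cooling": 25_000,
}

protected_capacity_gb = 400_000       # 400 TB under protection (assumed)

total = sum(annual_costs.values())
cost_per_gb = total / protected_capacity_gb

print(f"Total annual cost: ${total:,}")
print(f"Fully burdened cost: ${cost_per_gb:.2f} per GB")
```

With these made-up numbers, the study lands at $1.00 per GB, which is the per-unit figure you'd then shop against vendor and cloud quotes.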
For instance, by replacing traditional backup with deduplication (dedupe) and continuous data protection (CDP), you can reduce storage and WAN costs while providing fast recovery via snapshots. Furthermore, server and storage consolidation via storage virtualization and server virtualization can lower overall costs by up to 80 percent for both primary and disaster recovery (DR) data centers.
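For a rough feel of how those technologies compound, here's a back-of-the-envelope sketch. The 5:1 dedupe and consolidation ratios are assumptions for illustration only; real-world results depend heavily on your data and workloads:

```python
# Back-of-the-envelope savings from dedupe plus server consolidation.
# The ratios below are assumptions for illustration; real results
# depend heavily on your data and workloads.

raw_storage_tb = 500        # primary + DR capacity before optimization
dedupe_ratio = 5            # assumed 5:1 data reduction
physical_servers = 100
consolidation_ratio = 5     # assumed 5:1 server virtualization

storage_after_tb = raw_storage_tb / dedupe_ratio
servers_after = physical_servers / consolidation_ratio

print(f"Storage: {raw_storage_tb} TB -> {storage_after_tb:.0f} TB "
      f"({1 - storage_after_tb / raw_storage_tb:.0%} reduction)")
print(f"Servers: {physical_servers} -> {servers_after:.0f} "
      f"({1 - servers_after / physical_servers:.0%} reduction)")
```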
Your goal should be to get your internal costs down to the lowest price per unit you can, and then shop around to see if cloud providers can beat your costs. As an example, if you're currently doing traditional tape backup and shipping tapes offsite for DR, a cloud provider would most likely be able to provide a better service level agreement (SLA) at a lower cost. But if you implement a new disk-based dedupe solution that efficiently replicates data for fast disk recovery for DR, your internal costs could be lower than the cloud provider's quote.
As an example, let's say you are doing chargeback to your internal IT users for the services provided by the IT staff, and you are charging them $1 per GB for tape backup, which covers local backup and a tape copy of the data offsite. That $1 per GB is the fully burdened cost to provide the service, including the tape media, operations, tape library, maintenance, offsite tape storage, and so on.
- The service level for recovery point is 24 hours.
- The service level for local recovery of a lost file is four to eight hours (the tape may be offsite).
- The service level for DR is seven days or best effort (your server is one of many that will need to be recovered).
Now, let's say a cloud provider comes along and says it will manage the process and provide the same services for only 75 cents per GB. That's a savings of 25 percent over your existing costs.
This all looks great to the chief financial officer (CFO), who wants to save money. But being concerned about data security, you also put out an RFI for backup and DR enhancement solutions to see whether you can reduce your costs another way. You may find that some vendors can implement a better solution at a lower cost than the cloud provider.
Say one such vendor can implement a CDP solution with dedupe and encryption for 50 cents per GB.
- The service level for recovery point is one hour.
- The service level for local recovery of a lost file is 15 minutes.
- The service level for DR is four to eight hours.
The improvement in service levels would actually justify a much higher cost than the current tape-based solution, as the recovery objectives are more in line with tier-1 applications, which typically use much more expensive array-based recovery solutions.
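Pulling the three options together, a short sketch can lay the costs and service levels side by side. The figures come straight from the scenario above; the comparison logic is purely illustrative:

```python
# Side-by-side comparison of the three options in this example.
# Costs and service levels are taken from the scenario above.

options = [
    # (name, $/GB, recovery point, local file recovery, DR recovery)
    ("Internal tape",  1.00, "24 hours", "4-8 hours",  "7 days"),
    ("Cloud provider", 0.75, "24 hours", "4-8 hours",  "7 days"),
    ("CDP + dedupe",   0.50, "1 hour",   "15 minutes", "4-8 hours"),
]

baseline_cost = options[0][1]
for name, cost, rpo, local_rto, dr_rto in options:
    savings = (baseline_cost - cost) / baseline_cost
    print(f"{name:14s} ${cost:.2f}/GB ({savings:.0%} savings)  "
          f"RPO: {rpo}, local recovery: {local_rto}, DR: {dr_rto}")
```

Laid out this way, the CDP option wins on both axes in this example: half the cost of tape and recovery objectives measured in minutes and hours rather than days.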
The cloud may or may not make sense for you. Just do your homework and gather your internal costs so that if you do pull the trigger, you won't regret it later.
Side note: I put together a simple flowchart during my time as a deputy commissioner for the TechAmerica CLOUD² commission. It may help commercial companies and government agencies alike figure out which applications make sense to put in the cloud, which cloud partners to choose based on the maturity and robustness of the offering (solution provider classification), and when to go public versus building your own internal private cloud and doing it yourself.
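I can't reproduce the flowchart here, but a rough sketch of that style of decision logic might look like the following. The questions, ordering, and thresholds are simplified assumptions of mine, not the commission's actual criteria:

```python
# Simplified triage in the spirit of that flowchart. The questions,
# ordering, and thresholds are illustrative assumptions, not the
# commission's actual decision criteria.

def cloud_recommendation(app: dict) -> str:
    if app["custom"] and app["hard_to_support"]:
        return "Keep internal: custom apps rarely map cleanly to the cloud"
    if app["data_sensitivity"] == "high":
        return "Build a private cloud you control"
    if not app["bandwidth_adequate"]:
        return "Keep internal until network performance is addressed"
    if app["cloud_cost_per_gb"] < app["internal_cost_per_gb"]:
        return "Candidate for a public cloud provider"
    return "Keep internal: your unit costs already beat the cloud quote"

example_app = {
    "custom": False,
    "hard_to_support": False,
    "data_sensitivity": "low",
    "bandwidth_adequate": True,
    "internal_cost_per_gb": 1.00,
    "cloud_cost_per_gb": 0.75,
}
print(cloud_recommendation(example_app))  # -> public cloud candidate
```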
In any case, just following the NIST cloud guidelines may be helpful, or at least get you through the fog! Happy clouding!
Christopher Poelker is the author of Storage Area Networks for Dummies, the vice president of enterprise solutions at FalconStor Software, and former deputy commissioner of the TechAmerica Foundation Commission on the Leadership Opportunity in U.S. Deployment of the Cloud (CLOUD²).