In an era of sequestration, data storage optimization is key for government agencies

Today, many government agencies – civilian and defense – find themselves in a technology quandary: the volume of data that must be stored is growing rapidly, while shrinking budgets are limiting the capital expenditures (e.g., servers and storage devices) required to store all of this data.

Government agencies are eyeing not only existing storage demands but anticipated requirements as well. Gartner estimates the external controller-based (ECB) disk storage market will grow from $22.2 billion in 2012 to $31.1 billion in 2016 (a compound annual growth rate of 7.9 percent).

As a result, storage optimization becomes critical for agencies seeking to boost IT performance while improving utilization and infrastructure efficiency. For agency decision makers seeking to improve storage efficiency as a way to address growing data volumes and shrinking budgets, there are a handful of key strategies to consider.

Eliminate redundant storage costs

Flat or declining budgets, along with sequestration, mean that capital expenditures on storage must be carefully controlled even as the amount of data continues to explode. A significant share of expensive storage capacity is consumed by redundant data: full copies, slightly changed virtual machines, or simply pre-provisioned "empty space," all of which represents wasted money.

Deduplication, compression and similar technologies can help federal agencies reduce storage consumption. A 2011 CTOLabs survey report notes that “high-speed, inline deduplication removes redundant data—before it is saved to disk—resulting in a backup dataset that requires 40 – 60x less disk storage.”
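To make the idea concrete, here is a minimal sketch of content-based block deduplication in Python. The fixed 4 KB chunk size, SHA-256 fingerprints and in-memory index are simplifying assumptions for illustration, not a description of any particular product:

```python
import hashlib

CHUNK_SIZE = 4096  # fixed-size chunks for simplicity; real products often chunk variably

def deduplicate(data: bytes):
    """Split data into chunks and keep one physical copy of each unique chunk."""
    store = {}    # sha256 digest -> chunk bytes (one copy per unique chunk)
    recipe = []   # ordered digests from which the original stream is rebuilt
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # write the chunk only if unseen
        recipe.append(digest)
    return store, recipe

# Ten nightly "backups" of the same eight-chunk payload:
payload = b"".join(bytes([i]) * CHUNK_SIZE for i in range(8))
store, recipe = deduplicate(payload * 10)
logical = len(payload) * 10
physical = sum(len(c) for c in store.values())
print(f"{logical} logical bytes stored in {physical} physical bytes "
      f"({logical // physical}x reduction)")
```

Real systems layer variable-size chunking, persistent indexes and compression on top of this same core idea.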

At the same time, agencies can’t optimize storage they cannot see. Federal agencies in the midst of data center consolidation often overlook whole “islands of storage”; storage analytics tools can find and reclaim these unused assets.
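As a rough illustration of the kind of report such a tool produces, the following Python sketch flags underutilized arrays in a hypothetical inventory; the array names, capacities and threshold are all invented:

```python
# Hypothetical inventory: (array name, provisioned TB, actually used TB).
# In practice this data would come from the arrays' management interfaces.
inventory = [
    ("array-hq-01",     500, 410),
    ("array-field-02",  200,  30),  # a likely "island of storage"
    ("array-dc-03",     800, 640),
    ("array-legacy-04", 120,   5),  # nearly idle -> reclaim candidate
]

RECLAIM_THRESHOLD = 0.25  # flag arrays using under 25% of provisioned capacity

for name, provisioned, used in inventory:
    utilization = used / provisioned
    if utilization < RECLAIM_THRESHOLD:
        print(f"{name}: {utilization:.0%} utilized, "
              f"{provisioned - used} TB reclaimable")
```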

Flash storage evolving to all-flash

Government agencies feel acute pain from ever-increasing data storage requirements, driven by capacity growth, the need for agility, and the high sequential bandwidth and storage density that big data demands.

But for a select, yet growing, number of applications, the true pain point is performance: simply scaling out disk drives, or even deploying hybrid solutions, cannot deliver the mission-critical sub-millisecond response times required. All-flash array storage, which delivers enhanced performance by using solid-state drives (SSDs) built on flash memory rather than spinning hard disk drives, is being adopted more broadly in the public sector in 2013. It is becoming clearer that all-flash arrays are optimally suited for government agencies where reliability, scalability and non-disruptive operations are paramount.
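The performance gap comes down to mechanics, as a back-of-the-envelope Python sketch shows; the figures used are typical published values, assumed here for illustration rather than taken from any benchmark:

```python
# Rough service-time arithmetic for a single random 4 KB read.
# All figures are typical published values, used here as assumptions.

# 15K RPM enterprise HDD: average seek plus half-rotation latency.
seek_ms = 3.5
rotation_ms = (60_000 / 15_000) / 2      # half a revolution at 15,000 RPM = 2 ms
hdd_latency_ms = seek_ms + rotation_ms   # ~5.5 ms per random I/O
hdd_iops = 1000 / hdd_latency_ms         # ~180 IOPS per spindle

# Flash SSD: no moving parts; random-read latency on the order of 0.1 ms.
ssd_latency_ms = 0.1
ssd_iops = 1000 / ssd_latency_ms         # ~10,000 IOPS per device

print(f"HDD: ~{hdd_latency_ms:.1f} ms latency, ~{hdd_iops:.0f} IOPS")
print(f"SSD: ~{ssd_latency_ms:.1f} ms latency, ~{ssd_iops:.0f} IOPS")
# Adding spindles raises aggregate throughput, but no number of extra
# disks lowers the ~5 ms per-I/O latency floor of a mechanical drive.
```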

Hybrid storage arrays still valuable for storage optimization

While all-flash storage is a critical new tool in the storage infrastructure toolkit, it won't replace traditional and hybrid storage arrays overnight. Hybrid arrays still hold appeal for agencies because they provide the flexibility to balance performance, capacity and cost for any federal storage workload.
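Hybrid arrays strike this balance internally by placing frequently accessed data on flash and colder data on disk. The following Python sketch shows a toy version of such a placement policy; the threshold and workload numbers are invented for illustration:

```python
# Minimal sketch of a hot/cold tiering decision of the kind hybrid arrays
# make internally. Threshold and block granularity are illustrative only.

HOT_THRESHOLD = 100  # accesses per day before a block is promoted to flash

def place_block(block_id: int, accesses_per_day: int) -> str:
    """Return the tier a block should live on under this toy policy."""
    return "flash" if accesses_per_day >= HOT_THRESHOLD else "disk"

workload = {1: 1500, 2: 40, 3: 220, 4: 3}  # block id -> daily accesses
for block, rate in workload.items():
    print(f"block {block}: {rate}/day -> {place_block(block, rate)} tier")
```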

Balance storage optimization with non-disruptive operational needs

Today’s leading storage arrays can deliver improved optimization without any disruptive impact on operations. For agencies, the ability to leverage a single storage array that provides high input-output operations per second (IOPS) with sub-millisecond latency equates to consistently high performance at the lowest dollar cost per IOPS.
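The dollar-per-IOPS comparison is straightforward arithmetic, sketched below in Python with placeholder prices and performance figures rather than vendor data:

```python
# Cost-per-IOPS arithmetic with placeholder figures (not vendor pricing).
arrays = {
    #              (acquisition cost $, sustained IOPS)
    "hdd_array":   (150_000,    20_000),
    "flash_array": (400_000, 1_000_000),
}

for name, (cost, iops) in arrays.items():
    print(f"{name}: ${cost / iops:.2f} per IOPS")
# Even at a higher sticker price, the flash array can win on $/IOPS
# for performance-bound workloads.
```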

Flash arrays also support the federal government’s initiatives to consolidate data centers and reduce power requirements, allowing federal agencies to meet performance targets while requiring less space, power and cooling.
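Rack-level arithmetic makes the space-and-power point concrete. Every figure in this Python sketch is an illustrative assumption, not a measurement:

```python
# How many devices does a performance-bound workload need?
# All per-device figures below are illustrative assumptions.
target_iops = 100_000
hdd_iops, hdd_watts = 180, 10      # per 15K RPM spindle (assumed)
ssd_iops, ssd_watts = 10_000, 6    # per flash device (assumed)

hdd_drives = -(-target_iops // hdd_iops)   # ceiling division
ssd_drives = -(-target_iops // ssd_iops)
print(f"HDD: {hdd_drives} spindles, ~{hdd_drives * hdd_watts:,} W")
print(f"SSD: {ssd_drives} devices, ~{ssd_drives * ssd_watts:,} W")
# Meeting an IOPS target with spindles means buying hundreds of drives
# for their arms, not their capacity -- along with the racks, power and
# cooling they consume.
```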

Understand strengths, weaknesses of Hadoop

The volume, velocity and variety of data are making decision support a greater challenge than ever before. The fact that 80-90% of the data organizations consume is now unstructured exacerbates the problem: traditional structured databases are ineffective at processing the information contained in photos, videos, social media posts, emails and text documents.

Some organizations are beginning to leverage databases that run on the Apache™ Hadoop® open source framework. The challenge is that Hadoop is an emerging technology: it requires custom coding and relies on commodity hardware, which can reduce physical efficiency. Government agencies evaluating Hadoop must ensure their solutions (1) protect against the effects of disk failures; (2) eliminate the need for duplicate copies of data for data protection, driving utilization up; (3) provide redundancy for the NameNode, which is otherwise a single point of failure; and (4) allow data to be shared using logical rather than physical copies.
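Requirement (2) reflects how HDFS protects data by default: it stores three full replicas of every block (the dfs.replication setting defaults to 3), which caps utilization at roughly a third of raw capacity. A short Python sketch of the arithmetic, with the parity-scheme overhead assumed for illustration:

```python
# Utilization arithmetic behind requirement (2). HDFS defaults to three
# full replicas per block (dfs.replication = 3); parity-based protection
# (e.g., RAID-style schemes) carries far less overhead.
raw_capacity_tb = 1000

replication_factor = 3
usable_with_replication = raw_capacity_tb / replication_factor  # ~333 TB

parity_overhead = 0.25  # assumed 25% overhead for a parity scheme
usable_with_parity = raw_capacity_tb * (1 - parity_overhead)    # 750 TB

print(f"3x replication: {usable_with_replication:.0f} TB usable (~33% utilization)")
print(f"parity scheme:  {usable_with_parity:.0f} TB usable (75% utilization)")
```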

It is increasingly clear that data storage – and improving storage optimization – will play a critical role for agencies. For many organizations, data storage is the biggest IT expense, which means that every agency needs new strategies for optimizing storage to meet current and future data demands.

Copyright © 2013 IDG Communications, Inc.
