
27 billion gigabytes to be archived by 2010

IT executives clamor for ways to prune and centralize their mushrooming data stores.

By Jennifer McAdams
December 31, 2007 12:00 PM ET

Computerworld - In 2008, almost every sector will continue the battle with data overload. Entertainment powerhouses - from television stations to big-name amusement parks - will struggle to house huge media files and to manage the data necessary to track customer spending trends. Universities will need extra capacity to spur e-learning and to hold more detailed data on students. Hospitals will lean on enhanced storage projects to avoid buckling under onerous regulations and the prospect of storing massive image files.

These are just a smattering of scenarios that point to a now-staggering need for space. In fact, respondents to Computerworld's most recent Vital Signs survey ranked storage-related initiatives as their No. 2 project priority this year, up from No. 4 last year.

According to Milford, Mass.-based analyst firm Enterprise Strategy Group Inc., private-sector archive capacity will hit an eye-popping 27,000 petabytes by 2010. Skyrocketing rates of e-mail growth account for much of this figure.

For instance, the University of Pittsburgh now pegs monthly e-mail traffic at more than 30 million messages, vs. 17 million just one year ago.

Other new factors driving the need for capacity include the pervasiveness of large files, be they media-rich elements or specialized program data such as the computer-aided design drawings now used in building everything from cars to furniture. Cloned copies of the same information are also bogging down many corporate networks.

Ironically, the adoption of virtualization technology - billed as a way to centralize and simplify storage strategies - can also trigger an initial spike in data capacity demands.

Trim the Fat

To combat spiraling data overload, corporate IT leaders will scour the market for ways to centralize storage, and they will pursue options such as clustered architectures and unified storage-area networks (SANs). Data-pruning techniques, including the use of thin provisioning and data de-duplication tools, will also be high on 2008 corporate storage wish lists, according to Forrester Research Inc. analyst Andrew Reichman.
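De-duplication tools of this kind typically work by fingerprinting blocks of data and keeping only one copy of each unique block, replacing duplicates with references. A minimal sketch of the idea in Python follows; the fixed 4 KB chunk size and SHA-256 fingerprint are illustrative assumptions, not any particular vendor's implementation (real products often use variable-size chunking):

```python
import hashlib

CHUNK_SIZE = 4096  # illustrative fixed-size chunks


def deduplicate(data: bytes):
    """Split data into chunks and keep one copy of each unique chunk.

    Returns a block store (fingerprint -> chunk) and a recipe (ordered
    list of fingerprints) from which the original can be rebuilt.
    """
    store = {}
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store each unique chunk only once
        recipe.append(digest)
    return store, recipe


def reassemble(store, recipe):
    """Rebuild the original data from the block store and recipe."""
    return b"".join(store[h] for h in recipe)


# Ten identical 4 KB blocks collapse into a single stored chunk.
data = b"x" * (CHUNK_SIZE * 10)
store, recipe = deduplicate(data)
assert len(recipe) == 10 and len(store) == 1
assert reassemble(store, recipe) == data
```

The savings come from the ratio of recipe entries to stored chunks: highly repetitive data (backups, cloned files, mass-mailed attachments) shrinks dramatically, while already-unique data does not.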

Mounting interest in these approaches highlights a pronounced shift away from "big-iron storage" - traditional storage arrays typically built from custom application-specific integrated circuits (ASICs) and RAID controllers, with fixed ceilings on disk and cache scalability.

"The alternative is software-focused solutions that make more use of general-purpose hardware and advanced software," Reichman says.


