27 billion gigabytes to be archived by 2010
IT executives clamor for ways to prune and centralize their mushrooming data stores.
Computerworld - In 2008, almost every sector will continue the battle with data overload. Entertainment powerhouses - from television stations to big-name amusement parks - will struggle to house huge media files and to manage the data needed to track customer spending trends. Universities will need extra capacity to spur e-learning and to hold more detailed data on students. Hospitals will lean on enhanced storage projects to avoid buckling under onerous regulations and the prospect of storing massive image files.
These are just a smattering of scenarios that point to a now-staggering need for space. In fact, respondents to Computerworld's most recent Vital Signs survey ranked storage-related initiatives as their No. 2 project priority this year, up from No. 4 last year.
According to Milford, Mass.-based analyst firm Enterprise Strategy Group Inc., private-sector archive capacity will hit an eye-popping 27,000 petabytes by 2010. Skyrocketing rates of e-mail growth account for much of this figure.
Other new factors driving the need for capacity include the pervasiveness of large files, be they media-rich elements or specialized program data such as the computer-aided design drawings now used in building everything from cars to furniture. Cloned copies of the same information are also bogging down many corporate networks.
Ironically, the adoption of virtualization technology - billed as a way to centralize and simplify storage strategies - can also trigger an initial spike in data capacity demands.
Trim the Fat
To combat spiraling data overload, corporate IT leaders will scour the market for ways to centralize storage, pursuing options such as clustered architectures and unified storage-area networks (SANs). Data-pruning techniques, including thin provisioning and data de-duplication tools, will also be high on 2008 corporate storage wish lists, according to Forrester Research Inc. analyst Andrew Reichman.
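Data de-duplication, one of the pruning techniques Reichman cites, rests on a simple idea: split data into chunks, identify each chunk by a hash of its content, and store any given chunk only once, however many files reference it. A minimal sketch of that idea follows - a hypothetical fixed-size, hash-based scheme for illustration, not any vendor's implementation:

```python
import hashlib

def dedupe_store(files, chunk_size=4096):
    """Store files as chunks keyed by content hash.
    Identical chunks are kept only once; each file keeps
    a manifest of chunk hashes so it can be reassembled."""
    store = {}      # hash -> chunk bytes (each unique chunk stored once)
    manifests = {}  # filename -> ordered list of chunk hashes
    for name, data in files.items():
        hashes = []
        for i in range(0, len(data), chunk_size):
            chunk = data[i:i + chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            store.setdefault(digest, chunk)  # a duplicate chunk adds nothing
            hashes.append(digest)
        manifests[name] = hashes
    return store, manifests

# Two "files" with identical content: raw capacity doubles,
# but the deduplicated store holds each unique chunk once.
files = {"a.bin": b"x" * 8192, "b.bin": b"x" * 8192}
store, manifests = dedupe_store(files)
raw = sum(len(d) for d in files.values())       # 16384 bytes on disk, naively
deduped = sum(len(c) for c in store.values())   # 4096 bytes after de-duplication
```

Production systems differ in the details - variable-size chunking, collision handling, where the index lives - but the capacity savings come from exactly this kind of reference counting on repeated content.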
Mounting interest in these approaches highlights a pronounced shift away from "big-iron storage" - traditional storage arrays typically built from custom application-specific integrated circuits and RAID controllers, with fixed ceilings on disk and cache scalability.
"The alternative is software-focused solutions that make more use of general-purpose hardware and advanced software," Reichman says.