
World's data will grow by 50X in next decade, IDC study predicts

IT execs will likely have trouble finding enough people with the skills and experience to manage it, analysts say

June 28, 2011 01:23 PM ET

Computerworld - In 2011 alone, 1.8 zettabytes (or 1.8 trillion gigabytes) of data will be created, the equivalent of every U.S. citizen writing three tweets per minute for 26,976 years. And over the next decade, the number of servers managing the world's data stores will grow tenfold.
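The tweet comparison can be sanity-checked with back-of-the-envelope arithmetic. The figures below assume a 140-character (roughly 140-byte) tweet and about 302 million U.S. citizens in 2011; both are our assumptions for illustration, not numbers stated by IDC.

```python
# Rough check of IDC's "3 tweets per minute for 26,976 years" comparison.
ZETTABYTE = 10**21                        # bytes in a zettabyte (decimal)
total_bytes = 1.8 * ZETTABYTE             # data created in 2011, per IDC

citizens = 302e6                          # assumed 2011 U.S. population
bytes_per_tweet = 140                     # assumed 1 byte per character
bytes_per_minute = citizens * 3 * bytes_per_tweet

minutes_per_year = 365 * 24 * 60
years = total_bytes / (bytes_per_minute * minutes_per_year)
print(f"{years:,.0f} years")              # lands within ~0.1% of 26,976
```

Under those assumptions the result comes out essentially on top of IDC's figure, which suggests the study used similar per-tweet and population inputs.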

Those are some of the findings in the fifth annual IDC Digital Universe study that was released today.

Interestingly, the amount of data people create by writing email messages, taking photos, and downloading music and movies is minuscule compared to the amount of data being created about them, the EMC-sponsored study found.

The IDC study predicts that overall data will grow by 50 times by 2020, driven in large part by more embedded systems such as sensors in clothing, medical devices and structures like buildings and bridges.

[Chart: IDC Digital Universe study]

The study also determined that unstructured information - such as files, email and video - will account for 90% of all data created over the next decade.

The bad news: the number of IT professionals available to manage all that data will grow to only 1.5 times today's levels, IDC said.

The number of people with the skills and experience to manage the fast-growing stores of corporate data simply isn't keeping pace with demand, IDC noted.

The study also notes that data security will continue to be a key issue for IT managers.

For example, though 75% of data today is generated by individuals, enterprises will have some liability for 80% of it at some point in its digital life. And less than one-third of all stored data today has even minimal security or protection; only about half the information that should be protected is protected at all, IDC stated.

The good news: new hardware and software technologies have driven the cost of creating, capturing, managing and storing information down to one-sixth of what it was in 2005.

For example, data deduplication and compression technologies have reduced the amount of data transmitted across networks and stored in data centers, while virtualization and thin provisioning (allocating just enough disk array capacity to store data) have increased storage system utilization rates.
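The idea behind deduplication is simple: if many blocks of data are identical, store one copy and keep references to it everywhere else. A minimal sketch, using fixed-size blocks keyed by their SHA-256 digest (real systems typically use variable, content-defined chunking):

```python
import hashlib

def dedupe(chunks):
    """Store each unique chunk once, keyed by its SHA-256 digest.

    Returns the chunk store and, for each input chunk, the digest
    that references its single stored copy.
    """
    store = {}
    refs = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)   # keep only the first copy
        refs.append(digest)
    return store, refs

# Ten 1 KB blocks, but only two distinct patterns: 80% of the raw
# bytes never need to be stored or sent across the network.
blocks = [b"A" * 1024] * 8 + [b"B" * 1024] * 2
store, refs = dedupe(blocks)
print(len(blocks), "blocks in,", len(store), "unique chunks stored")
```

The savings scale with how repetitive the data is, which is why backups and virtual machine images, full of identical blocks, benefit most.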

"As an industry, we've done a tremendous job at lowering the cost of storing data. As a result, people and companies store more data," said David Reinsel, IDC's vice president of storage and semiconductor research.

Since 2005, annual investments by enterprises in hardware, software and cloud services technologies, along with the staff to manage information, have increased 50% to $4 trillion.

New data capture, search, discovery, and analysis tools will also create data about data automatically, much like the facial recognition routines that help tag Facebook photos. Data about data, or metadata, is growing twice as fast as the digital universe as a whole. A gigabyte of stored content can generate as much as a petabyte of transient data, according to Reinsel.
