Extending the life of your data center
Bumping up against the life expectancy of your data center? Use these best practices to delay a costly expansion.
Computerworld - This year marks the 10th anniversary of the 1,200-square-foot data center at the Franklin W. Olin College of Engineering -- that means the facility has been operating three years longer than CIO and vice president of operations Joanne Kossuth had originally planned. Now, even though the school needs a facility with more capacity and better connectivity, Kossuth has been forced to back-burner the issue because of the iffy economic times.
"Demand has certainly increased over the years, pushing the data center to its limits, but the recession has tabled revamp discussions," she says.
Like many of her peers, including leaders at Citigroup and Marriott International, Kossuth has had to get creative to eke more out of servers, storage and the facility itself. To do so, she's had to re-examine the life cycle of data and applications, storage array layouts, rack architectures, server utilization, orphaned devices and more.
Rakesh Kumar, research vice president at Gartner, says he's been bombarded by large organizations looking for ways to avoid the cost of a data center upgrade, expansion or relocation. "Any data center investment costs at minimum tens of millions, if not hundreds of millions, of dollars. With a typical data center refresh rate of five to 10 years, that's a lot of money, so companies are looking for alternatives," he says.
While that outlook might seem gloomy, Kumar finds that many companies can extract an extra two to five years from their data center by employing a combination of strategies, including consolidating and rationalizing hardware and software usage; rolling out virtualization; and physically moving IT equipment around. Most companies don't optimize the components of their data center and, therefore, bump up against its limitations faster than necessary, he says.
Here are some strategies that IT leaders and other experts suggest for pushing data centers further.
Relocate noncritical data. One of the first areas that drew the attention of Olin College's Kossuth was the cost of dealing with data. As one example, alumni, admissions staff and other groups take multiple CDs' worth of high-resolution photos at every event. They use server, storage and bandwidth resources to edit, share and retain those large images over long periods of time.
To free the data center from dealing with the almost 10 terabytes of data those photos require, Kossuth opened a corporate account on Flickr and moved all processes surrounding management of those photos there. Not only did that spare her the cost of a $40,000 storage array she would otherwise have had to purchase, but it also relieved the data center of the resource-intensive activity associated with high-resolution images.
"There is little risk in moving non-core data out of the data center, and now we have storage space for mission-critical projects," Kossuth says.
Take the pressure off high-value applications and infrastructure. Early on, Olin College purchased an $80,000 Tandberg videoconferencing system and supporting storage array. Rather than exhausting that investment through overuse, Kossuth now prioritizes video capture and distribution, shifting lower-priority projects to less expensive videoconferencing tools and to YouTube for storage.
For example, most public relations videos are generated outside of the Tandberg system and are posted on the college's YouTube channel. "The data center no longer has to supply dedicated bandwidth for streaming and dedicated hardware for retention," she says. More importantly, the Tandberg system is kept pristine for high-profile conferences and mission-critical distance learning.