
A tale of two U.S. government data center projects

One project, for NOAA, came in on time; the other, for the Air Force, didn't

September 19, 2011 06:01 AM ET

Computerworld - ORLANDO -- This is a story of two federal government data center projects: One, undertaken by the National Oceanic and Atmospheric Administration (NOAA), met its schedule and budget. The other, a U.S. Air Force initiative, went over budget and was late.

How did this happen? What was different about the projects and the approaches that led to different outcomes?

The NOAA project

The NOAA project was a $27.6 million renovation of an approximately five-year-old office and laboratory building in an office park in Fairmont, W.Va. The building was newly leased by the government. The goal of the renovation was to upgrade the capabilities of the computer systems that support NOAA research and to consolidate several research data centers. The agency picked West Virginia because it was looking for locations within a 120-mile radius of an existing data center.

The entire facility is 54,000 square feet, with 16,000 square feet of raised floor space.

The data center, which opens next month, will house a 29,000-core Intel Xeon supercomputer built by SGI that's capable of reaching speeds of up to 383 teraflops.

The facility wasn't originally intended to house a data center, but NOAA estimates that building a new data center from scratch would have cost approximately double the amount budgeted for the project.

The Air Force project

The Air Force Research Laboratory's Defense Supercomputing Center needed to upgrade mechanical and electrical systems in a 40-year-old data center to accommodate a new supercomputer. Among other things, the nearly 83,000-square-foot building needed its electrical load capacity increased from 3.3 megawatts to 8MW. It also needed new water-cooled chillers capable of handling a Cray supercomputer with 30 racks drawing 45kW each. The building has 26,600 square feet of raised floor space.
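A rough back-of-the-envelope calculation (illustrative only, not from the article) shows why the electrical upgrade was necessary: the Cray's racks alone account for a substantial share of the building's original 3.3MW capacity, before any cooling or facility overhead is counted.

```python
# Illustrative arithmetic based on the figures cited above.
racks = 30
kw_per_rack = 45
rack_load_kw = racks * kw_per_rack  # total compute rack load in kW

old_capacity_kw = 3_300  # 3.3 MW, the building's original electrical capacity
new_capacity_kw = 8_000  # 8 MW, the upgraded capacity

print(rack_load_kw)                              # 1350
print(round(rack_load_kw / old_capacity_kw, 2))  # 0.41 -- ~41% of old capacity
print(round(rack_load_kw / new_capacity_kw, 2))  # 0.17 -- ~17% of new capacity
```

The 1.35MW of rack load would have consumed roughly two-fifths of the old electrical capacity by itself, leaving little headroom for chillers, existing systems and future growth.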

The project was expected to be completed in June 2010 at a cost of $5.1 million. It wasn't finished until this month, 14 months late, and it was about $1 million over budget.

What went wrong?

The Air Force project's problems, outlined by the project managers during a presentation last week at the Afcom data center conference here, were wide-ranging -- so much so that one person in the audience asked if the project was cursed.

For instance, even though the site was surveyed twice in an effort to discover all buried utility conduits, a TV cable went undetected. And during construction, a previously unknown spring was discovered, causing delays. The list goes on: Three breakers kept failing because of moisture that poor sealing couldn't keep out; the mechanical room needed more work than originally expected to support new equipment; and a worker fell off a ladder and broke both wrists.

Some of those problems could have been addressed quickly, but in many cases they weren't. That's because the Air Force used a design-bid-build contract for the project, meaning it had separate contractors for design and construction, instead of a design-build contract, where one contractor is responsible for both.

In a previous $5 million facility upgrade, the Air Force had used a design-build contract through the Army Corps of Engineers. While that project was a success, the Army Corps of Engineers charged an 8% premium to manage it. This time around, the Air Force wanted to cut costs. It saw the design-bid-build route as a less-expensive option.

One throat to choke

But what the Air Force lost by using a design-bid-build contract was the proverbial "one throat to choke." As problems arose, new requests for information were issued, and multiple teams had to negotiate fixes -- adding months of delays. The Air Force had to work with contracting officers, financial managers, engineers, project managers and others, all of whom were being pulled by other priorities.

"It can be a very tedious process," said Brian Schafer, the infrastructure management chief for the Air Force's supercomputing center. "You can introduce delays that you really didn't anticipate."

With a design-build, there's a single point of contact, someone who knows the project and "can do that change quickly," said Schafer. "I think that was our main issue with this project."

The NOAA project: What went right?

While the Air Force was dealing with an aging facility and old mechanical systems that presented unique challenges, NOAA had a newer property to work with, in an industrial park designated for high-tech tenants.

Unlike the Air Force, NOAA opted for a design-build contract, with the contractor acting as designer. NOAA had a conceptual design in advance, which it then used in its solicitation.

Darren Smith, the project manager who works in the office of NOAA's CIO, said the U.S. General Services Administration, which manages federal properties, favored the design-build option because the project had a very tight budget. "It meant that we got a fixed, firm price for the whole thing," he said.

One drawback, said Smith, is that NOAA couldn't be very specific about the type of equipment used or where it was located. But the NOAA project was completed in a year, on schedule and within its budget.

Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to his RSS feed. His email address is pthibodeau@computerworld.com.



