IT's biggest project failures -- and what we can learn from them

Think your project's off track and over budget? Learn a lesson or two from the tech sector's most infamous project flameouts.

Every year, the Improbable Research organization hands out Ig Nobel prizes to research projects that "first make people laugh, and then make them think."

For example, this year's Ig Nobel winners, announced last week, include a nutrition prize for researchers who electronically modified the sound of a potato chip to make it seem crisper and fresher than it really is, and a biology prize for researchers who determined that fleas living on a dog jump higher than fleas living on a cat. Last year, a team won for studying how sheets become wrinkled.

That got us thinking: Though the Ig Nobels haven't given many awards to information technology (see No Prize for IT for reasons why), the field's history is littered with projects that have made people laugh -- if you're the type to find humor in other people's expensive failures. But have they made us think? Maybe not so much. "IT projects have terrible track records. I just don't get why people don't learn," says Mark Kozak-Holland, author of Titanic Lessons for IT Projects (that's Titanic as in the ship, by the way).

When you look at the reasons for project failure, "it's like a top 10 list that just repeats itself over and over again," says Kozak-Holland, who is also a senior business architect and consultant with HP Services. Feature creep? Insufficient training? Overlooking essential stakeholders? They're all on the list -- time and time again.

A popular management concept these days is "failing forward" -- the idea that it's OK to fail so long as you learn from your failures. In the spirit of that motto and of the Ig Nobel awards, Computerworld presents 11 IT projects that may have "failed" -- in some cases, failed spectacularly -- but from which the people involved were able to draw useful lessons.

You'll notice that many of them are government projects. That's not necessarily because government fails more often than the private sector, but because regulation and oversight make it harder for government agencies to cover up their mistakes. Private enterprise is simply better at keeping its failures quiet.

So here, in chronological order, are Computerworld's favorite IT boondoggles, our own Ig Nobels. Feel free to laugh at them -- but try to learn something, too.

IBM's Stretch project

In 1956, a group of computer scientists at IBM set out to build the world's fastest supercomputer. Five years later, they produced the IBM 7030 -- a.k.a. Stretch -- the company's first transistorized supercomputer, and delivered the first unit to the Los Alamos National Laboratory in 1961. Capable of handling a half-million instructions per second, Stretch was the fastest computer in the world and would remain so through 1964.

Nevertheless, the 7030 was considered a failure. IBM's original bid to Los Alamos was to develop a computer 100 times faster than the system it was meant to replace, and the Stretch came in only 30 to 40 times faster. Because it failed to meet its goal, IBM had to drop Stretch's price to $7.8 million from the planned $13.5 million, which meant the system was priced below cost. The company stopped offering the 7030 for sale, and only nine were ever built.

That wasn't the end of the story, however. "A lot of what went into that effort was later helpful to the rest of the industry," said Turing Award winner and Stretch team member Fran Allen at a recent event marking the project's 50th anniversary. Stretch introduced pipelining, memory protection, memory interleaving and other technologies that have shaped the development of computers as we know them.

Lesson learned

Don't throw the baby out with the bathwater. Even if you don't meet your project's main goals, you may be able to salvage something of lasting value from the wreckage.

Knight-Ridder's Viewtron service

The Knight-Ridder media giant was right to think that the future of home information delivery would be via computer. Unfortunately, this insight came in the early 1980s, and the computer it had in mind was an expensive dedicated terminal.

Knight-Ridder launched its Viewtron version of videotex -- the in-home information-retrieval service -- in Florida in 1983 and extended it to other U.S. cities by 1985. The service offered banking, shopping, news and ads delivered over a custom terminal with color graphics capabilities beyond those of the typical PC of the time. But Viewtron never took off: It was meant to be the "McDonald's of videotex" and at the same time cater to upmarket consumers, according to a Knight-Ridder representative at the time, who apparently didn't notice the contradiction in that goal.

A Viewtron terminal cost $900 initially (the price was later dropped to $600 in an attempt to stimulate demand); by the time the company made the service available to anyone with a standard PC, videotex's moment had passed.
