45 years of creative evolution in the IT industry and beyond

As Computerworld celebrates its 45th anniversary, pundits and IT executives look back over decades of change that brought stunning technological advancements -- and put more power in users' hands.

How different is the world of computing now from when the first issue of Computerworld rolled off the presses in 1967?

Here's a glimpse: One day around that time, Edward Glaser, chairman of computer science at Case Western Reserve University, was giving some of his students a tour of the rooms that held the school's Univac 1107. As he stood in front of the computer's flashing lights, the sound of tape spinning in the background, Glaser said, "By the time you're my age, maybe 20 years from now, you'll be able to hold all this computing power in something the size of a book."

His students weren't impressed. "I remember us thinking, 'This guy is nuts,' " says Sheldon Laube, who recently retired as CIO of PricewaterhouseCoopers. Yet Glaser was off by only a few years in predicting the debut of notebook computers -- and he underestimated their processing power by several orders of magnitude.

Today, of course, the iPhone in Laube's pocket can do things that would overwhelm a Univac 1107 or any other multimillion-dollar computing behemoth of that era.

Thanks to the miniaturization of hardware, advances in storage and processing, vast improvements in software and the proliferation of high-speed networks, computing now belongs to the people.

Over the past 45 years, "the overarching trend is consumerization," says technology pundit Esther Dyson, chairwoman of EDventure Holdings, an investment firm. The IT leaders who read Computerworld "used to own all the computers, and now [their] customers do."

This brings one practical change, she notes: more technology choices for users, who have always wanted access to information via any device and any operating system, and now expect it.

For IT, it creates a new master: "Your 3-year-old kid can do things with your cellphone you can't," says Suren Gupta, executive vice president of technology and operations at Allstate. "[IT] better be on that curve. Kids and consumers are learning technology much faster, and we need to make sure we adapt our products to reflect that."

Technologies are created to improve life. Corporations use technologies to become more efficient and improve their ability to give customers what they want. Some corporations -- those with foresight and flexibility -- use them to create entirely new ways of doing things.

Without a doubt, high tech has reshaped the world in the past 45 years. The most visible example comes from the smart devices that millions of us keep within easy reach. Personal digital assistants, indeed -- cellphones and tablets extend our beings into a realm no less real for being virtual. But it wasn't always this way.

Riding Moore's Law

"My father was working on computer programming and technology back in the '50s. He would come home and say, 'This is the hardest thing I've ever done. Whatever you do, stay away from these things,' " recalls Ray Lane, a managing partner at Kleiner Perkins Caufield & Byers, a Silicon Valley venture capital firm. Lane didn't listen to his father. After graduating from college, he became a systems analyst at IBM (he also did systems work in the military during the Vietnam War). By the early 1970s, he could write code in a formal language like Fortran ("Cobol was kind of for sissies," he says), submit a deck of punch cards and 24 hours later find out what mistakes he'd made.

Thanks to the relentless pace of Moore's Law, which posits that the number of transistors that can be put on a semiconductor will double every 18 months, the kind of computing power once available only to those who worked in austere information temples is now available in the palm of one's hand, says Lane. And today, those temples -- or data centers, as they're now known -- all look more or less the same: They're made of servers with Intel chips inside, and they boast vast storage resources. We connect to them from anywhere, ultimately through the Internet's protocol, TCP/IP.
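
To put that doubling in perspective, here is a back-of-the-envelope sketch (Python is used purely for illustration; the 18-month cadence is the popular figure cited above, not a precise account of how real chips have behaved):

# Rough illustration of Moore's Law growth, assuming the popular
# 18-month doubling period quoted in this article.
DOUBLING_PERIOD_YEARS = 1.5

def growth_factor(years, doubling_period=DOUBLING_PERIOD_YEARS):
    """Multiplicative increase in transistor count after `years` years."""
    return 2 ** (years / doubling_period)

for span in (10, 20, 45):
    print(f"After {span} years: roughly {growth_factor(span):,.0f}x")
# 45 years at an 18-month cadence works out to 2**30 -- about a billionfold increase.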

Chris Perretta, CIO at State Street, remembers that he had to drop a microprocessor lab class when he was an engineering student in the late 1970s because he fried a CPU -- it was too expensive for him to get a second one. "People get mad now when [technology] breaks, and I'm amazed that it ever works!" he jokes. At this point, Perretta says, "we can build systems with basically infinite computing capacity and access to an incredible amount of data."
