The Story So Far

Keep it running. That was a primary goal of the technicians working on the Electronic Numerical Integrator and Computer (ENIAC). The first large-scale electronic computer in the U.S. used 17,480 vacuum tubes. Early estimates suggested that one tube or another would always be failing, so that ENIAC would never complete a calculation.

But J. Presper Eckert, the genius behind the 30-ton behemoth, came up with a solution. Carefully testing tubes, he determined that they usually failed early or late in their lives. With a rigorous program of regularly swapping in new, tested tubes, Eckert eventually kept ENIAC running productively for more than 12 hours at a stretch.

That was computer systems management circa 1946.

By the mid-1950s, vacuum-tube and electrostatic memory were being replaced by magnetic core memory, which didn't burn out. Programming by plugs and wires was being replaced by programs called "software." And in 1955, programmers at the General Motors Research Center wrote the first operating system - a batch-processing monitoring program for the IBM 701.

That made it easier to keep mainframes running at capacity. But to keep mainframes from wasting time doing input and output, programs were written, punched onto cards, converted to tape and only then run on mainframes. The results were printed on a separate machine. And mainframes had to be manually reconfigured when switching between different programs.

But in 1964, IBM announced its System 360, a line of compatible mainframes designed to handle their own I/O and to run different kinds of software without reconfiguration. With hardware and an operating system that simplified many systems management tasks, data processing managers could begin to focus on optimizing system performance, not just on feeding the mainframe's tape drives.

In the early 1970s, core memory was replaced by dynamic RAM chips, and in 1973 IBM developed lower-cost, higher-capacity hard disk drives. Data processing shops needed the extra memory and storage because they were now running online transaction processing systems with hundreds or thousands of concurrent users.

Storage management became important. So did having a system for tracking problems and making changes - fixes couldn't simply be made between batch jobs, because the system wasn't allowed to go down. And data processing managers shifted their focus to improving system response time.

In 1974, IBM rolled out its Systems Network Architecture, a standard networking protocol for linking peripherals and terminals. But minicomputers from Digital Equipment Corp. and Hewlett-Packard Co. were going into use as departmental computers, complicating systems management. And in 1980, AT&T Corp. began issuing resellable licenses for Unix to other vendors, launching a wave of relatively inexpensive workstations and servers.

In 1981, the IBM PC arrived - followed less than a year later by the first demonstration of a PC LAN. Novell Inc. shipped NetWare in 1983, and 3Com Corp. was already shipping Ethernet and TCP/IP networks for PCs, workstations and servers. Now managing availability, capacity and performance was an issue for networks, too.

Networks and PCs also made it practical to decentralize data processing, complicating the job of managing systems. Client/server systems required multiple computers and the network to be tuned for performance. And security and disaster recovery became integral parts of systems and network management.

The 1990s saw the recentralization of IT into larger-than-ever data centers, along with more concerns than ever about capacity, performance and security as users connected via the Internet. The Y2K threat required IT shops to inventory and upgrade all their systems. And ever-increasing bandwidth requirements for multimedia and networked applications have blurred the lines between systems and network management.

More than 50 years after ENIAC, the goal for IT is to manage thousands of computers and a maze of networks as if they were a single system. But one goal is the same: to keep it running.

And now, on with the story . . .

1946: ENIAC requires preventive maintenance just to keep running.

1955: General Motors Research Center programmers write the first operating system, a monitor program for the IBM 701.

1964: IBM's System 360 simplifies systems management.

1968: Bob Dennard of IBM invents dynamic RAM, cutting the cost and size of memory. Within a few years, it completely replaces magnetic core memory.

1973: Winchester hard disks cut the cost of storage.

1980: AT&T begins licensing Unix to computer vendors for resale on workstations.

1981: The IBM PC arrives.

1982: The first PC LAN is demonstrated by Drew Major, Kyle Powell and Dale Neibaur of Novell. Their software will eventually become NetWare.

1995: Commercial use of the Internet hugely increases the number of users of corporate systems.

1999: Y2K requires massive IT inventory and replacement of many systems, spurring new asset management approaches.

Special Report

Taking Control

Copyright © 2002 IDG Communications, Inc.
