
The Basics of Code

Building computers isn't so hard, but making them work is something else again.

By Gary Anthes
July 16, 2007 12:00 PM ET

Computerworld - "We but teach bloody instructions, which being taught, return to plague the inventor." (Macbeth)



Few distinctions in computer technology are as clear as the one between hardware and software. But it was not always so.

The term software first appeared in 1958, more than 10 years after the emergence of the first automatic, programmable digital computers. In the very early years (the 1940s), instructions to the adders, multipliers and logic gates of computers came from wires plugged into boards and physical switches set laboriously for each application. Or they were delivered by paper tape, the way Grace Hopper programmed the Harvard Mark I computer in 1944. With those programs, instructions resided outside the computer.

Then, in 1945, Alan Turing, John von Neumann and others had a better idea, one that would fundamentally change computer architectures forever. Their idea was to store the program inside the computer and to put it in the same place as the data. This stored-program architecture allowed software to be changed as easily as data, without anyone having to manipulate wires or switches or punch a new paper tape.

The earliest software was written in binary notation (long strings of zeros and ones), but it was hard for programmers to recognize and remember instructions made up of 16 binary digits. So higher-level notations, called assembly languages, were invented. They substituted short mnemonics for the long binary strings.

Assemblers worked at low levels of abstraction, mostly translating the mnemonics into machine language instructions one-for-one, so the programmer still had to worry about machine constructs such as memory locations and registers. But they allowed programmers to give memory locations meaningful English names. An assembly instruction such as "B Loop" might tell the program to branch (transfer control) to the instruction at the memory location labeled Loop, as sketched below.
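To make that concrete, here is a minimal sketch, in Python, of the two-pass translation an assembler performs. The machine itself, its opcodes and its 16-bit word layout are entirely hypothetical, invented only to show mnemonics and labels being turned into binary words one-for-one.

# A toy two-pass assembler for a made-up machine. The opcodes and the
# 16-bit word layout (4-bit opcode, 12-bit address) are hypothetical.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "B": 0b0100}

def assemble(lines):
    """Translate mnemonics one-for-one into 16-bit machine words."""
    # First pass: remember which address each label (e.g. "Loop") names.
    labels = {}
    program = []
    for line in lines:
        if line.endswith(":"):              # a label such as "Loop:"
            labels[line[:-1]] = len(program)
        else:
            program.append(line.split())
    # Second pass: emit exactly one machine word per instruction.
    words = []
    for mnemonic, operand in program:
        address = labels.get(operand, int(operand) if operand.isdigit() else 0)
        words.append((OPCODES[mnemonic] << 12) | address)  # opcode in the high bits
    return words

# "B Loop" branches back to the instruction at the address labeled Loop.
print([f"{w:016b}" for w in assemble(["Loop:", "LOAD 8", "ADD 9", "B Loop"])])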

But that was still pretty tedious. Moreover, assembly language programs could not be ported across computers with different hardware architectures. The solution to those problems came in the mid-1950s in the form of higher-level languages, most notably Fortran and Cobol, and their compilers.

Unlike assemblers, compilers generated multiple machine-language instructions for every source statement written by the programmer. Compilers (themselves programs) were smart enough to flag certain kinds of errors in source code, and they could automatically generate the code required to do various housekeeping chores, like keeping track of memory locations and the indexes used in loops.
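The sketch below illustrates that expansion: a toy "compiler" routine, written in Python, that turns a single high-level statement (summing an array) into several lower-level instructions, including the index-tracking housekeeping the programmer no longer has to write. The instruction set and register names are invented for the example.

# A toy illustration of one source statement expanding into many
# machine-level steps. Registers R1-R3 and the mnemonics are hypothetical.
def compile_sum_loop(array_name, length):
    """Emit pseudo-machine code for: total = sum of array_name[0..length-1]."""
    return [
        "LOAD  R1, 0",                   # total = 0
        "LOAD  R2, 0",                   # index = 0 (housekeeping added by the compiler)
        "Loop:",
        f"LOAD  R3, {array_name}[R2]",   # fetch the next element
        "ADD   R1, R3",                  # total += element
        "ADD   R2, 1",                   # advance the index
        f"CMP   R2, {length}",           # reached the end of the array?
        "BLT   Loop",                    # if not, branch back to Loop
        "STORE R1, total",               # write the result
    ]

# One high-level statement becomes nine machine-level steps,
# with the loop index managed automatically.
print("\n".join(compile_sum_loop("values", 10)))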

Portable Code

With these higher-level languages, programmers had to know little or nothing about the hardware their programs would run on. As a consequence, source programs became widely portable across computer types for the first time, which gave rise to openly shared libraries of reusable software.


