Back to Basics: Algorithms

Computer scientists look for simplicity, speed and reliability. Sometimes they find elegance.

The word algorithm derives from the name of Al-Khwarizmi, a 9th-century Persian mathematician and author of The Compendious Book on Calculation by Completion and Balancing. But nowadays the word most often applies to a step-by-step procedure for solving a problem with a computer.

An algorithm is like a recipe, with a discrete beginning and end and a prescribed sequence of steps leading unambiguously to some desired result.

But coming up with the right answer at the end of a program is only the minimum requirement. The best algorithms also run fast, are sparing in their use of memory and other computer resources, and are easy to understand and modify. The very best ones are invariably called "elegant," although Al-Khwarizmi may not have used that term for his formulas for solving quadratic equations.
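
To see what elegance means in practice, consider Euclid's algorithm for finding the greatest common divisor of two whole numbers, one of the oldest algorithms on record and a perennial textbook example of the quality. A minimal sketch in Python:

    def gcd(a, b):
        # Euclid's algorithm: repeatedly replace the pair (a, b)
        # with (b, a mod b). The remainder shrinks on every pass,
        # so the loop is guaranteed to end.
        while b != 0:
            a, b = b, a % b
        return a

    print(gcd(1071, 462))  # prints 21

Three lines of logic, no wasted work and an obvious argument that it stops: that combination is what computer scientists mean by elegant.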

An algorithm can be thought of as the link between the programming language and the application. It's the way we tell a Cobol compiler how to generate a payroll system, for example.

Although algorithms can end up as thousands of lines of computer code, they often start as very high-level abstractions, the kind an analyst might hand to a programmer.

For example, a lengthy routine in that payroll system might have started out with this algorithmic specification: "Look up the employee's name in the Employee Table. If it is not there, print the message, 'Invalid employee.' If all other data on the input record is valid, go to the routine that computes net pay from gross pay. Repeat these steps for each employee. Then go to the routine that prints checks." The gross-to-net and check-writing routines would have their own algorithms.
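
Translated into a modern language such as Python, that specification might look like the sketch below. The table contents, record layout and helper routines here (is_valid_record, compute_net_pay, print_check) are hypothetical stand-ins, not pieces of any real payroll system:

    employee_table = {"A. Lovelace": {"gross": 5000.00}}

    def is_valid_record(record):
        # Placeholder check: require a positive gross-pay figure.
        return record.get("gross", 0) > 0

    def compute_net_pay(gross):
        # Placeholder gross-to-net routine: flat 20% withholding.
        return round(gross * 0.80, 2)

    def print_check(name, net):
        print(f"Pay to the order of {name}: ${net:.2f}")

    def run_payroll(input_records):
        checks = []
        # Repeat these steps for each employee on the input file.
        for name, record in input_records.items():
            # Look up the employee's name in the Employee Table.
            if name not in employee_table:
                print("Invalid employee.")
                continue
            # If all other data on the input record is valid,
            # go to the routine that computes net from gross pay.
            if is_valid_record(record):
                checks.append((name, compute_net_pay(record["gross"])))
        # Then go to the routine that prints checks.
        for name, net in checks:
            print_check(name, net)

    run_payroll({"A. Lovelace": {"gross": 5000.00},
                 "Z. Unknown": {"gross": 100.00}})

Even this toy version shows why the analyst's wording matters: each sentence of the specification maps onto one clearly delimited piece of code.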

Reality Intrudes

Of course, it isn't quite that simple. If it were, the study of algorithms would not have become a major branch of computer science and the subject of countless books and doctoral theses.

But it's not hard to imagine computer engineers in the 1950s thinking they had pretty much finished the job. They had invented stored-program electronic computers and languages like Fortran and Cobol to run on them, largely banishing the agony of assembly language programming. In fact, software pioneers such as Grace Hopper saw compilers, and the algorithms that instructed them, as such an advance -- they could "understand" English -- that the first computer to run one bore the name Universal Automatic Computer, or Univac. With adjectives like "universal" and "automatic" in its name, the computer could almost be expected to program itself.
