Memory management
Until the past decade or so, RAM and storage were amazingly limited, so efficient programming required impressive but time-consuming work-arounds. For example, developers had to spend a lot of time manually tracking memory allocation and deallocation, the housekeeping chore that automatic garbage collection later took over. Otherwise, they spent a lot more time fixing memory leaks (unintentional memory consumption) that eventually crashed the computer.
Early garbage-collection routines, such as those in the initial Ada compilers, essentially froze the computer while they cleared up memory; that wasn't useful behavior for developers trying to write software for airplane cockpits. Today, just about every development environment has a decent garbage collector.
But PC memory management required even more attention to detail. Instead of using an application programming interface (API) to write to the screen, you'd put characters on the screen (along with their attribute bytes) by memory block copy operations, writing directly to the hardware. On the Apple II, remembers one programmer, you wrote "bit blitters," which combined two or more bitmaps into one, to handle the Apple's strange graphics memory mapping.
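For readers who never had the pleasure, here's a minimal sketch of the pattern in C. An ordinary array stands in for the 80x25 text-mode buffer (on a real IBM PC the destination would have been the video memory at segment 0xB800); each cell is a character byte followed by an attribute byte, and a whole row is composed off-screen and block-copied in one shot. The details are illustrative, not any particular program's code.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define COLS 80
#define ROWS 25

/* Stand-in for the PC's text-mode video memory: 80x25 cells of two bytes
 * each (character, then attribute). On real hardware this was the region
 * at segment 0xB800, and writing here put text on the screen directly. */
static uint8_t screen[ROWS * COLS * 2];

/* Compose a row off-screen, then block-copy the whole thing "to the
 * hardware" in one operation. */
static void write_row(int row, const char *text, uint8_t attr)
{
    uint8_t line[COLS * 2];
    size_t len = strlen(text);

    for (int i = 0; i < COLS; ++i) {
        line[i * 2]     = (i < (int)len) ? (uint8_t)text[i] : ' ';
        line[i * 2 + 1] = attr;   /* e.g. 0x1F = bright white on blue */
    }
    memcpy(&screen[row * COLS * 2], line, sizeof line);
}

int main(void)
{
    write_row(0, "HELLO FROM 1983", 0x1F);
    printf("first cell: character '%c', attribute 0x%02X\n",
           screen[0], screen[1]);
    return 0;
}
```

The appeal was speed: one block copy was far cheaper than pushing characters through the BIOS one at a time, which is why everyone did it.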
Punch cards and other early development environments
Today, your typing speed probably matters more than your keystroke accuracy. That wasn't so during an era (through the 1970s at least) when everything was keypunched on Hollerith cards. Hit the wrong key, and the card was ruined; you had to start over or try to fix it with Scotch tape and tiny pieces of paper.
Most developers learned to include line numbers even if the language didn't need them, so cards could be recollated after they were (inevitably) dropped. Cards were available in assorted colors, allowing color coding of different sections of a deck, such as JCL (Job Control Language), program source and data. Another trick was to write the program name in magic marker across the deck, giving you a quick visual clue if a card was out of order.
Honorable mention:
- Non-WYSIWYG editing platforms. Some of us remain comfortable with vi/emacs, command-line compile options or nroff for documentation formatting, but initially we programmers didn't have a choice.
- Eight-character limits on file names, which sure made it hard to write self-documenting code.
- The APL keyboard. APL was a great programming language for its time, but its symbols required a special keyboard and were even harder to remember than the most useless Windows icons.
- Memory dumps. If your code crashed, the mainframe spit out at least 100 pages of green-bar printout showing the entire contents of memory. You had to sift through the entire tedious listing to learn, say, that you had attempted to divide by zero, as Elkes wryly recalls. Expert programmers learned the debugging technique of filling memory with DEADBEEF (a "readable" hexadecimal value) to help them find a core-walker (the mainframe equivalent of a memory leak).
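In modern terms, the fill-pattern trick looks something like the sketch below; the buffer size and the simulated out-of-bounds write are made up for illustration. Pre-fill memory with the pattern, run the suspect code, then scan for words that no longer read 0xDEADBEEF to see what got stomped and how far the walker got.

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

#define WORDS 1024

int main(void)
{
    uint32_t arena[WORDS];
    size_t hits = 0, last_hit = 0;

    /* Step 1: pre-fill the region with the recognizable pattern. */
    for (size_t i = 0; i < WORDS; ++i)
        arena[i] = 0xDEADBEEFu;

    /* Step 2: run the suspect code. Here a stand-in "bug" scribbles over
     * the first 64 words, the way a runaway routine might. */
    memset(arena, 0, 64 * sizeof(uint32_t));

    /* Step 3: scan the dump. Anything that changed was written by somebody;
     * anything still reading 0xDEADBEEF was never touched. */
    for (size_t i = 0; i < WORDS; ++i) {
        if (arena[i] != 0xDEADBEEFu) {
            ++hits;
            last_hit = i;
        }
    }
    printf("%zu words clobbered; the walker got as far as word %zu\n",
           hits, last_hit);
    return 0;
}
```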
Pointer math and date conversions
Like sorting and hash algorithms, all math functions were left up to the developer in an era when CPUs were limited (until the '90s). You might need to emulate 16-bit math instructions on an 8-bit processor. You wrote code to determine if the user's PC had an 8087 math co-processor, which gave a serious performance improvement only if code was explicitly written to exploit it. Sometimes you had to use bit-shifting and lookup tables to "guesstimate" floating-point math and trigonometry routines.
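Here's a rough idea of what the lookup-table-and-shift style looked like, offered as a C sketch rather than anything canonical: the 256-entry table and the 2^14 scale factor are arbitrary choices, and back then the table would have been hard-coded rather than generated at startup.

```c
#include <math.h>
#include <stdint.h>
#include <stdio.h>

/* Fixed-point sine by table lookup. Angles are in 256ths of a full circle,
 * results are scaled by 2^14, and the scale comes back off with a right
 * shift instead of a floating-point divide. */
#define SCALE_SHIFT 14
static int16_t sine_table[256];

static void build_table(void)
{
    /* Generated here to keep the sketch short; a period program would
     * have shipped the table as a hard-coded array. */
    const double two_pi = 6.28318530717958647692;
    for (int i = 0; i < 256; ++i)
        sine_table[i] = (int16_t)lround(sin(i * two_pi / 256.0) * (1 << SCALE_SHIFT));
}

/* amplitude * sin(angle), using only an integer multiply and a shift. */
static int32_t scaled_sine(int32_t amplitude, uint8_t angle)
{
    return (amplitude * (int32_t)sine_table[angle]) >> SCALE_SHIFT;
}

int main(void)
{
    build_table();
    /* 45 degrees is 32/256 of a circle. */
    printf("1000 * sin(45 deg) ~= %d (exact: %.1f)\n",
           (int)scaled_sine(1000, 32),
           1000.0 * sin(6.28318530717958647692 / 8.0));
    return 0;
}
```

Accurate enough for a game or a gauge, and far faster than emulating floating point in software.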
It wasn't just straight-up math, either. Every developer had to do date math (what's "three weeks from today"?) using Julian date conversions, including figuring out when Easter falls and adjusting for leap year. (Not that every developer has figured out leap year even now).
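As a reminder of what that involved, here's a sketch of the day-count approach in C. The conversion routines are the well-known Fliegel and Van Flandern integer formulas (my choice for illustration, not anything from the story): convert the calendar date to a Julian day number, add 21, convert back, and do the leap-year test the long way.

```c
#include <stdio.h>
#include <time.h>

/* Gregorian date -> Julian day number (Fliegel & Van Flandern, 1968).
 * Every division truncates, which is exactly what the formula expects. */
static long to_jdn(int y, int m, int d)
{
    return (1461L * (y + 4800 + (m - 14) / 12)) / 4
         + (367L * (m - 2 - 12 * ((m - 14) / 12))) / 12
         - (3L * ((y + 4900 + (m - 14) / 12) / 100)) / 4
         + d - 32075L;
}

/* Julian day number -> Gregorian year, month, day (same paper). */
static void from_jdn(long jdn, int *y, int *m, int *d)
{
    long l = jdn + 68569;
    long n = (4 * l) / 146097;
    l -= (146097 * n + 3) / 4;
    long i = (4000 * (l + 1)) / 1461001;
    l = l - (1461 * i) / 4 + 31;
    long j = (80 * l) / 2447;
    *d = (int)(l - (2447 * j) / 80);
    l = j / 11;
    *m = (int)(j + 2 - 12 * l);
    *y = (int)(100 * (n - 49) + i + l);
}

/* The leap-year rule so many of us still get wrong. */
static int is_leap(int y)
{
    return (y % 4 == 0 && y % 100 != 0) || (y % 400 == 0);
}

int main(void)
{
    time_t now = time(NULL);
    struct tm t = *localtime(&now);
    int y, m, d;

    /* "Three weeks from today": date -> day number, add 21, back to a date. */
    long today = to_jdn(t.tm_year + 1900, t.tm_mon + 1, t.tm_mday);
    from_jdn(today + 21, &y, &m, &d);

    printf("Three weeks from today: %04d-%02d-%02d\n", y, m, d);
    printf("%d %s a leap year\n", y, is_leap(y) ? "is" : "is not");
    return 0;
}
```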