
Happy birthday, x86! An industry standard turns 30

Intel's x86 microprocessor architecture has dominated large swaths of computing for three decades. Here's why.

By Gary Anthes
June 5, 2008 12:00 PM ET

Computerworld - Thirty years ago, on June 8, 1978, Intel Corp. introduced its first 16-bit microprocessor, the 8086, with a splashy ad heralding "the dawn of a new era." Overblown? Sure, but also prophetic. While the 8086 was slow to take off, its underlying architecture -- later referred to as x86 -- would become one of technology's most impressive success stories.

"X86" refers to the set of machine language instructions that certain microprocessors from Intel and a few other companies execute. It essentially defines the vocabulary and usage rules for the chip. X86 processors -- from the 8086 through the 80186, 80286, 80386, 80486 and various Pentium models, right down to today's multicore chips and processors for mobile applications -- have over time incorporated a growing x86 instruction set, but each has offered backward compatibility with earlier members of the family.

In the three decades since the introduction of the 8086, the x86 family has systematically progressed from desktop PCs to servers to portable computers to supercomputers. Along the way, it has killed or held at bay a host of competing architectures and chip makers. Even markets that had seemed locked up by competitors -- such as Apple's Macintosh computers, which long ran on PowerPC chips -- have yielded to x86 in recent years.

How did Intel's architecture come to dominate so much of the computing world? Let's take a look.

In the beginning

Intel's first microprocessor was the 4-bit 4004, which was made for a Japanese calculator in 1971. That was quickly followed by the 8-bit 8008 and, in 1974, by the 8-bit 8080. The 8080 went into the Altair 8800, a personal computer sold as a mail-order kit starting in 1975. (Bill Gates and Paul Allen founded Microsoft Corp. to sell their version of BASIC for the Altair 8800.)

Intel, memory maker?

Intel made its first microprocessor in 1971 in response to a request from Busicom, a Japanese calculator maker. But Intel's founders in 1968 had semiconductor memory foremost in mind, and such a chip became the company's first product, in 1969.

For almost 20 years, Intel focused on memory products. But by 1984, according to Albert Yu, a retired Intel senior vice president, the company was getting killed by the Japanese memory makers. It was deriving 100% of its profits from microprocessors but spending 80% of its R&D budget on memory. "Our strategy and investments were completely out of line with reality," Yu recalls in his book, Creating the Digital Future. The following year, Intel reluctantly exited the memory business.

Yu recalls: "We finally overcame the emotional burden of letting go of a failing business that we had invented and focused all our energy on the business we would build our future on. It was tough. It was gut-wrenching. But it was right." The following year, Intel's sales dropped from $1.6 billion to $1.2 billion, and the company lost $250 million from restructuring.

The 16-bit 8086 made its debut in 1978. IBM's selection of the 8088, an 8086 variant, to power its PC in the early '80s gave the x86 architecture tremendous momentum and helped it become an industry standard that persists today.

Patrick Gelsinger, electrical engineer, chip designer and now executive vice president at Intel, says the critical turning point for the PC industry -- the thing that really sent the industry into overdrive -- was the introduction of the 32-bit 80386 in 1985. It was not obvious at the time that the x86 needed to be upgraded from the 16-bit address space of the earlier models, he says. "People said, 'What do you mean 32 bits? That's for minicomputers and mainframes.' They derided us at the time for being extravagant."
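
The jump Gelsinger describes is easy to put in numbers. As a rough illustration -- the 8086 and 80286 actually reached more memory through segmentation, 1MB and 16MB respectively, so this simplifies -- a flat 16-bit address can distinguish only 65,536 locations, while a 32-bit address reaches roughly 4.3 billion. The short C sketch below simply prints that arithmetic.

    /* A back-of-the-envelope sketch: how many bytes a flat 16-bit
       versus 32-bit address can reach directly. */
    #include <stdio.h>

    int main(void)
    {
        unsigned long long space16 = 1ULL << 16;  /* 65,536 bytes (64KB) */
        unsigned long long space32 = 1ULL << 32;  /* 4,294,967,296 bytes (4GB) */

        printf("16-bit address space: %llu bytes\n", space16);
        printf("32-bit address space: %llu bytes\n", space32);
        return 0;
    }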

At about the same time, Compaq Computer Corp. announced a 386-based PC, which loosened IBM's death grip on the personal computer market. The IBM PC at the time ran the 16-bit 80286, which was less than a third as fast as the 386.

According to Intel, IBM spurned the 386 because there was not yet any 32-bit software to take advantage of it. IBM was also developing a proprietary 16-bit operating system called OS/2.


"IBM owned the architecture from top to bottom. It was their applications, their operating system and their hardware design," says Gelsinger, who was a member of the 386 design team. "When they went to the next generation, they would be the only company able to offer the top-to-bottom solution, with no guarantee of compatibility from one generation to the next."

All that changed with the advent of the 386, Gelsinger says. "We moved from a vertical industry to a horizontal industry, and that really opened up the world."


