Gordon Moore looks back, and forward, 40 years
Intel's co-founder offers advice to the industry
IDG News Service - Forty years after he coined the most famous law in computing, Gordon Moore still has a few words of advice for the industry.
For software developers: Simplify! Your interfaces are getting worse. Nanotechnology? Don't believe the hype; silicon chips are here to stay. Artificial intelligence? Try again, folks! You're barking up the wrong tree.
Speaking by telephone from Hawaii, where he now lives, Moore fielded an hour of questions from reporters yesterday to mark the approaching 40th anniversary of his celebrated prediction -- that the number of transistors on integrated circuits would double roughly every two years.
Christened later as Moore's Law, his observation became something of a self-fulfilling prophecy for the industry, he said, driving computer makers to keep pace with the expected rate of advancement. But he was too humble to claim credit for a phenomenon that effectively made possible the rapid evolution of modern electronics.
"If I hadn't published this paper in '65, the trends would have been obvious a decade later anyway. I don't think a particular paper made a difference. I was just in a position where I could see the trend," he said.
Moore, now 76, was director of research and development at Fairchild Semiconductor when his paper was published in Electronics Magazine on April 19, 1965. Three years later he founded Intel Corp. with Robert Noyce, becoming its CEO in 1975 and its chairman four years after that.
His law had little effect at first, he said. The first big impact he recalls occurred when Japanese manufacturers entered the memory chip business in the 1970s. For a while, the Japanese struggled to find their footing in a business where the technology seemed to advance in an unpredictable fashion.
Moore said he reread his paper about a year ago and was pleasantly surprised to find that it also foresaw the use of computers at home, though by the time the first home computers appeared, he had forgotten making that prediction. In fact, as CEO of Intel years later, he would dismiss home computing altogether.
"An engineer came to me with an idea about a home computer," he recalled. "I said, 'Gee, that's fine, but what would you use it for?' He could only think of a housewife using it to keep recipes on."