It's no secret that computing is changing.
This is no longer a world of mainframes and desktop computers. Laptops, tablets and smartphones are ubiquitous. And soon they'll have company from wearables, smart homes, smart cars, cognitive computers and perhaps even quantum computers.
As the digital world expands at a breakneck pace, IBM is working now to create computer chips for a new digital age.
"What comes after the mobile phone and the smart phone? It's not a clear-cut answer," said Jon Erensen, a research director with Gartner Inc. "It's an array of simple to complex products. We'll need chips that are much more power efficient and much more powerful. We'll need chips with whole new architectures that work in different ways.... Chips need to keep changing with the applications they're going into."
IBM said yesterday that it will spend $3 billion over the next five years on research and development into computing and chip materials. In essence, this powerhouse in computing research is looking to rethink computer design.
As it looks to the future of the semiconductor industry, IBM sees a lot of changes. What it may not see is more silicon.
"IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems," said John Kelly, senior vice president of IBM Research, in a statement. "This new investment will ensure that we produce the necessary innovations to meet these challenges."
IBM, along with industry analysts, says researchers should be working not only on new materials but also on new architectures for future processors. Silicon, the basis for today's chips, can only go so far.
As semiconductor manufacturers move from the current 22 nanometer (nm) architecture down to 14nm and then 10nm in the next several years, it's going to be increasingly difficult to build these shrinking chips. Seven nanometers may be the lower limit.
When manufacturers shrink chips to that size, their gates, which act as electronic switches, are only a handful of atoms thick. At that thinness, electrons increasingly leak through the gates even when they are switched off. That leakage means wasted power, more heat and the need for more error-checking processes.
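To get a rough sense of why atoms become the limiting factor, a back-of-the-envelope sketch helps. The numbers below are assumptions for illustration: silicon's lattice constant is about 0.543 nm, so adjacent atomic planes sit roughly 0.27 nm apart; note also that modern node names (22nm, 14nm, and so on) are marketing labels that no longer map directly onto a physical gate dimension, so treat this as order-of-magnitude only.

```python
# Illustrative only: how many atomic planes of silicon fit across
# a feature of a given size, assuming the node name were a literal
# physical dimension (it isn't, in modern processes).
SI_LATTICE_NM = 0.543            # silicon lattice constant, ~0.543 nm
ATOM_SPACING_NM = SI_LATTICE_NM / 2  # ~0.27 nm between atomic planes

for node_nm in (22, 14, 10, 7):
    layers = node_nm / ATOM_SPACING_NM
    print(f"{node_nm:>2} nm feature -> roughly {layers:.0f} atomic layers")
```

Even at this crude level, a 7nm feature spans only a few dozen atomic layers, and the oxide layers inside a gate are thinner still, which is where quantum tunneling and leakage take over.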
In other words, scientists see the coming end to Moore's Law.
To continue building more powerful computers, smaller devices that slip into a pocket or can be worn, or cars that communicate with homes, we're going to need a better computer chip.
This is exciting news for Yehia Massoud, head of the electrical and computing engineering department at Worcester Polytechnic Institute.