Intel Corp. is setting its sights on multicore chips far more powerful than today's dual- and quad-core processors.
As expected, Intel took a big step in that direction today by unveiling a 48-core research chip that it says is 10 to 20 times more powerful than the current top-end offering in its multicore Core line of processors. Intel also noted that the experimental chip draws about as much power as two household light bulbs.
With its eye on the data center and the cloud, Intel built fully functional cores in the new chip as part of what it calls its "terascale" mission.
"With a chip like this, you could imagine a cloud data center of the future which will be an order of magnitude more energy efficient than what exists today, saving significant resources on space and power costs," said Justin Rattner, Intel CTO and head of Intel Labs. "Over time, I expect these advanced concepts to find their way into mainstream devices, just as advanced automotive technology such as electronic engine control, air bags and anti-lock braking eventually found their way into all cars."
Today's unveiling of the 48-core research chip comes about two years after Intel showed off an experimental 80-core chip. That research chip had teraflop performance capabilities but consumed less power than a quad-core processor.
The 80 cores were not fully functional, however, and the chip was used mainly to study ways to make a large number of cores communicate efficiently with one another, as well as to help Intel engineers find new architectural and core designs.
At the time, Intel officials said that the company was five to eight years away from building a fully functional, commercial-ready 80-core chip.
In an interview with Computerworld last month, Rattner said that schedule has changed and that engineers are even closer to developing such a chip.
Intel reported today that it is bringing academics and experts from other high-tech firms into the loop by distributing 100 of the experimental 48-core chips so researchers can work on programming models and on developing software that can run on such a high number of cores.
The chip maker, which is slated to unveil six- and eight-core Nehalem chips next year, also noted that it expects to integrate key features of the research chip into a new Core line of commercially available processors by early 2010.
"This is an indication that Intel can deliver on its multicore strategy," said Rob Enderle, an analyst with the Enderle Group. "It's very important in that it helps validate what Intel contends can be done and it adds credibility to their roadmap. The 80-core chip was more for bragging rights and was more of a science experiment. This one is more of a prototype -- less flashy but more functional. It is all part of the process of bringing something new to market."
Intel reported that the 48-core chip is designed with a high-speed, on-chip network for sharing information, along with newly invented power management techniques that allow it to operate at as little as 25 watts, or at 125 watts when running at maximum performance.
The company dubbed the experimental chip a "single-chip cloud computer" because its architecture resembles that of a cloud computing data center.
Dan Olds, an analyst at The Gabriel Consulting Group, said the new research chip is an important step in the process of building multicore processors, but Intel should release more details about the technology.
"This is a fairly important step in the evolution of computer processors," said Olds. "Multicore chips have become the standard with dual-core and quad-core chips used in almost every system. Core counts will definitely increase over time, but it's happened in small steps -- two to four cores, four to six cores, and with eight and 10 cores coming in the next few years. This 48-core chip from Intel is important from a proof-of-concept perspective."
But Olds said that Intel should explain what "fully functional" really means.
"We need more information from Intel in order to understand just how big a leap forward this chip really is," he added. "For example, can it handle the standard x64 instruction set? This is important in that it determines if existing software will be able to run on it without being ported. But we don't want to get ahead of ourselves. At a volume of only 100 units, this is more of a science project than an actual prototype. It's an important science project, one that might be giving important industry stakeholders a usable piece of the future, but still a science project."
Enderle added that such research is critical for the whole cloud computing movement.
"This technology is a requirement if the concept of cloud computing is ever to reach its potential for cost savings and energy efficiency," he said. "Without it, much of what we imagine with the massively flexible and utility-like concept of cloud computing simply won't be possible. This is a very important milestone for the future."