Sidebar: Future Chips: Hundreds of Threads

Chip makers are already creating designs that stretch beyond two cores.

Just last month, Raza Microelectronics Inc. in Cupertino, Calif., began shipping its XLR processor line with up to eight cores running at up to 1.5 GHz. Each core supports four threads (simultaneous instruction streams), for a total of 32 threads per chip.

The product family is aimed at converged networking and computing applications, according to the company. It will reportedly be in manufacturers' equipment late this year.

Sun's 90-nanometer UltraSparc chip for high-end servers, code-named Niagara and due in 2006, will also reportedly support up to eight cores, each handling four threads.

Meanwhile, as the industry moves to 65nm technology, more transistors will fit on the same silicon real estate. "Operating them at a slightly lower frequency and voltage allows chips with four or more cores to run without significantly increasing the power envelope," explains Jeff Austin, product marketing manager in Intel's business client group.

Intel's Tukwila processor, the first in the company's 65nm Itanium processor family for multiprocessing servers, will contain four or more cores and is due around 2007.

Austin says he also expects that between 2008 and 2010, Intel will ship single-processor implementations supporting up to eight threads, using parallel-execution cores or cores with simultaneous multithreading. On the server side, that translates into 32 parallel threads, he says.

"In the next 10 years or so, we're looking at tens to hundreds of cores within a processor, including special-purpose and asymmetrical cores, to deliver hundreds and thousands of parallel-capable threads," Austin says.

Real-World Translation

Driving the multicore activity is the desire to achieve greater performance with as much power efficiency as possible. In cell phones and mobile computers, this means "being able to do more-complex applications without running out of battery really fast," says Sven Behmer, CEO of Foster City, Calif.-based PolyCore Software Inc., which offers multicore software development and management tools.

On the desktop, says Austin, the need to run virus scans and other hefty programs in background mode while maintaining strong foreground response time is pushing performance demand, as is the need to manage streaming high-definition content.

"Games can take advantage of greater degrees of parallelism and threading for ... using artificial intelligence instead of predefined actions [to calculate what happens] when you face off with a robot or to compute how glass breaks when you jump through a window," says Austin.

On the server side, Intel also sees multicore as benefiting the enterprise trend toward data center consolidation. In a virtual computing environment, enterprises can merge several software environments onto fewer machines and distribute tasks across different virtual machines.

Behmer observes that having multiple processors on a chip may make it possible to increase the yield in the manufacturing process. "If one of four is defective, you could sell a two- or three-processor chip," he says. "So you could use a higher percentage of what comes out of the process."
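Behmer's yield argument can be made concrete with a little probability. The sketch below is purely illustrative: it assumes core defects are independent and picks a hypothetical 10% per-core defect rate, neither of which comes from the article.

```python
# Illustrative yield arithmetic for Behmer's binning point: a chip with
# some defective cores can still be sold as a lower-core-count part.
# Assumes independent per-core defects (a simplifying assumption).
from math import comb

def bin_yields(cores: int, p_defect: float) -> dict:
    """Probability that exactly k of `cores` cores are good (binomial)."""
    p_good = 1.0 - p_defect
    return {k: comb(cores, k) * p_good**k * p_defect**(cores - k)
            for k in range(cores + 1)}

yields = bin_yields(cores=4, p_defect=0.1)   # hypothetical defect rate
full = yields[4]                  # sellable as a full 4-core part
salvage = yields[3] + yields[2]   # sellable as 3- or 2-core parts
```

With these assumed numbers, only about 66% of dies have all four cores working, but binning the partially defective dies recovers roughly another 34% of the output, which is exactly the "higher percentage of what comes out of the process" Behmer describes.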

Multicore systems open new avenues for performance gains, but they also affect existing software deployments and demand new developer skills.

For example, Ken Kennedy, director of the Center for High Performance Software Research at Rice University in Houston, notes that "many applications will need to be parallelized if they want to see performance boosts on such processors."
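Kennedy's point can be sketched in a few lines: a loop whose iterations are independent can be restructured so they run concurrently. The function names and workload below are illustrative, not from the article, and threads are used for brevity; CPU-bound pure-Python code would typically need separate processes to sidestep the interpreter lock and actually occupy multiple cores.

```python
# A minimal sketch of parallelizing an application loop.
# `work` is a stand-in for any independent per-item computation.
from concurrent.futures import ThreadPoolExecutor

def work(n: int) -> int:
    # Stand-in for a CPU-bound, per-item computation.
    return sum(i * i for i in range(n))

def run_serial(inputs):
    # The original form: one iteration after another.
    return [work(n) for n in inputs]

def run_parallel(inputs, workers=4):
    # The parallelized form: independent iterations are farmed
    # out across a pool of workers (one per core, ideally).
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(work, inputs))

inputs = [1_000, 2_000, 3_000, 4_000]
assert run_serial(inputs) == run_parallel(inputs)
```

The restructuring only pays off when iterations really are independent; discovering and proving that independence is the hard part Kennedy alludes to.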

Development is not without challenges, however. Most operating systems are designed to run on a single processor. And while symmetric multiprocessing is relatively simple, in that the operating system handles load balancing, says Behmer, asymmetric multiprocessing leaves the partitioning of tasks across multiple threads to the developer.
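Behmer's distinction can be illustrated with a toy two-stage pipeline. Everything here is a hypothetical example, not from the article: under the asymmetric model, the developer explicitly decides which thread (and thus, ideally, which core) runs which stage, rather than handing a pool of identical tasks to the OS to balance.

```python
# Sketch of developer-partitioned (asymmetric) task assignment:
# one thread is dedicated to producing work, another to consuming it.
import threading
import queue

handoff = queue.Queue()   # stage 1 -> stage 2 hand-off
results = []

def producer():
    # The developer assigns the "generate" stage to this thread.
    for item in range(5):
        handoff.put(item)
    handoff.put(None)     # sentinel: no more work

def consumer():
    # The "compute" stage is explicitly assigned to a second thread.
    while (item := handoff.get()) is not None:
        results.append(item * item)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
# results now holds the squares 0, 1, 4, 9, 16
```

The contrast with symmetric multiprocessing is that here nothing rebalances the stages automatically: if the compute stage is slower than the producer, it simply falls behind, and rebalancing is the developer's problem.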

An increasing number of specialized processing elements -- hardware accelerators, digital signal processors, filters and other components -- are showing up on chips. For that reason, Behmer expects the importance of memory and of the interconnect subsystems that move data among these elements as efficiently as possible to increase. As a result, he predicts that a "network on a chip" -- such as a mesh of wires among components or multilevel buses -- will likely emerge within a decade or so.

Copyright © 2005 IDG Communications, Inc.
