Why is Intel canceling Larrabee graphics chip?

Intel announced that it was killing the Larrabee GPU program late last week. It was late and slow, it seems. But in IT Blogwatch, bloggers wonder if there's more to the story.

By Richi Jennings. December 7, 2009.


Your humble blogwatcher selected these bloggy morsels for your enjoyment. Not to mention baroque anatomy...

    Rik Myslewski is the bringer of bad news:

Intel's Larrabee many-core CPU-GPU mashup is so far behind schedule it will not be ... a standalone product. ... The chip won't be released into the wild as anything other than a software development platform.


But ... the company told us it remained "committed to delivering world-class many-core graphics products," and it said it intends to reveal other plans for discrete graphics products sometime next year. ... [It] would not commit, however, to confirming if such architectural details as Larrabee's ring bus will make it into future products.

Jon Stokes rides the elephant:

[Intel's] long-delayed Larrabee discrete graphics product has suffered yet another delay, so the company has had to "reset" its overall GPU strategy and reposition its plans and expectations. ... Larrabee v1 is so delayed that, at the time it eventually launches, it just won't be competitive.


The main issue behind the delay, it appears, was the hardware. That's not surprising, because Larrabee is a big, complex part, and it's quite a departure from anything that Intel has done. The hardware delay would have resulted in a software delay, and if Intel were to launch Larrabee with an immature software stack then it would be roadkill.

But Tony Bradley has another theory:

Intel and Nvidia originally entered into a strategic alliance in 2004, agreeing to share patents and work together. ... The honeymoon ... is over, though. Intel filed a lawsuit against Nvidia claiming that the 2004 agreement does not allow Nvidia to develop or manufacture chipsets. ... Nvidia countersued claiming that it does.


With the collapse of the arrangement between the two, Intel loses access to Nvidia's considerable intellectual property related to graphics processing. That loss has a direct impact on development of the Larrabee Project. Without Nvidia, Larrabee is dead in the water.

Charlie Demerjian's not surprised:

This change of direction was pretty well assured because the first Larrabee chip was so late. If you are a year late in the GPU business, that is an unrecoverable deficit. While some may say that this is a failure, they probably don't understand the magnitude of the task that Intel undertook. It is not simply a GPU, it is the next generation of vector compute parts.


Larrabee 1 and 2 were very similar, and Larrabee 3 was a very different part. Given that, we would expect Larrabee 2 to be dropped as well, and effort to focus on Larrabee 3.

And Anand Lal Shimpi simply says it doesn't really matter:

Thanks to AMD's aggressive rollout of the Radeon HD 5000 series, even at lower price points Larrabee wouldn't have been competitive - delayed or not. ... Chances are that it's more than 6 months away at this point.


[Intel's] 45nm fabs are old news and paid off. ... [So] financially it's not a problem, yet. If Larrabee never makes it to market, or fails to eventually be competitive, then it's a bigger problem. If heterogeneous multicore is the future of desktop and mobile CPUs, Larrabee needs to succeed, otherwise Intel's future will be in jeopardy. It's far too early to tell if that's worth worrying about.

Eric Savitz shows us the money:

The news could provide a boost to shares of both Nvidia (NVDA) and Advanced Micro Devices (AMD), the dominant players in graphics chips.

Matthew Humphries agrees:

Nvidia and AMD will be very happy about this development, though. Everyone was waiting to see what Intel could do with Larrabee. ... Nvidia and ATi have been developing and perfecting their graphics processors for many generations and managing to produce a new chip on a par with them was always going to be tough.


If Larrabee isn't going to plan, can Intel really expect to deliver something capable without another few years of development time?

So what's your take?
Get involved: leave a comment.

And finally...

Richi Jennings, your humble blogwatcher
  Richi Jennings is an independent analyst/consultant, specializing in blogging, email, and security. A cross-functional IT geek since 1985, he is also an analyst at Ferris Research. You can follow him as @richi on Twitter, or richij on FriendFeed, pretend to be richij's friend on Facebook, or just use good old email: itblogwatch@richij.com.


Copyright © 2009 IDG Communications, Inc.
