Intel CEO says Larrabee graphics chips aren't dead yet

For all of its success, one thing Intel could never master was the standalone graphics card, eventually ceding the market to Nvidia and ATI/AMD. That was supposed to change with the company's Larrabee project, which was designed to produce 3D graphics chips that could compete with the GeForces and Radeons of the world.

Somewhere along the way last year, however, Intel realized that it was nowhere near ready to start producing Larrabee hardware and scaled back expectations, saying that the project would now serve as a software-based development platform instead. But CEO Paul Otellini recently suggested that the concept of Larrabee-based hardware isn't dead after all, and said the confusion came from some people at Intel disclosing details about the project earlier than they should have. He told investors that Larrabee was taken off the firm's roadmap essentially to give it more breathing room over the next few years.

It seems clear that Intel has backed off on its previous hints that Larrabee could eventually compete with Nvidia and ATI consumer products in terms of gaming performance, and Otellini emphasized Larrabee's potential for general-purpose computing tasks. This jibes with ATI's and Nvidia's efforts to promote the GPGPU capabilities of their latest boards through the Stream and CUDA platforms, respectively.

There's always the chance a far-off Larrabee chip could wind up in a future videogame console (it was rumored to be in a next-generation Nintendo console and the eventual PlayStation 4), but it looks like the competition between ATI and Nvidia for PC graphics supremacy will remain a two-horse race for the short term.

Topics: Intel, Hardware, Processors


Talkback

8 comments
  • Intel please bring out full featured 2D/3D larrabee graphics

    with an H.264 codec (encoder too). Please integrate it with your CPU, on the same die at the same process node. Come out with UHDTV multi-monitor capability with better perf than ATI/NVidia and eliminate our need for them.
    GO INTEL!!!
    Johnny Vegas
    • RE: Intel CEO says Larrabee graphics chips aren't dead yet

      @Johnny Vegas

      You might as well be asking Toyota to make their cars capable of running at 100mph with better performance than a fighter jet.

      ATI/nVidia video cards are capable of H.264 at pretty much any framerate and resolution you want. If they can run games at 16xAA (a virtual resolution of 30720x17280 on a 1080p monitor), I'm pretty sure they can handle H.264 on UHDTV.
      CobraA1
  • Come on, already

    All of our systems at work use Intel motherboards which have dual-head video on board. These are not gaming machines, but office computers for which the on-board graphics are more than sufficient. Our problem is that the trend is moving toward more monitors. With most of our users now having two, we are seeing more requests for three or more monitors. Four-head video cards are expensive and finicky, and when we add a discrete video card, it disables the on-board graphics.

    Intel, how about introducing a discrete graphics card that will work in conjunction with the on-board graphics? That would allow us to move up to three or four monitors easily and inexpensively. Let nVidia and AMD fight it out for the gaming market, while you dominate the far larger business market.
    itpro_z
  • Problem is . . .

    Problem is, GPUs were already becoming less specialized and more general purpose when Intel decided to start working on their Larrabee.

    Both AMD and nVidia now have full support for compute shaders on their latest GPUs, and they're not afraid of turning out new designs quickly in response to competition.

    Intel has always done pretty poorly with graphics. All of their chipsets are barely capable of video playback, and horrible for gaming.

    They have a very small number of chipsets capable of hardware T&L, while every new ATI and nVidia card sold since 2001 has it, even on the low end.

    It's pathetic. It's a standard feature even on the cheapest video cards. Yet the vast majority of Intel integrated chipsets still don't have it.

    Sorry, but Intel just doesn't have any experience with graphics beyond basic video decoding. They just don't have it.
    CobraA1
  • The majority don't need heavy graphics

    If you're only a web surfer and office-suite person, or even an occasional Sims3 addict, you do not need an expensive dedicated graphics card. Even Apple knows this, yet they still seem to charge plenty for a mediocre dedicated card. I for one have had less trouble than those who bought Nvidia graphics in their laptops, only to find they overheat and die or have die defects and you're replacing the board.
    jscott418-22447200638980614791982928182376
    • RE: Intel CEO says Larrabee graphics chips aren't dead yet

      @jscott418

      Well, if you're the "occasional" Sims3 addict, then you need something at least capable of running the game.

      I do think some people underestimate the size of the gaming industry - most people I know seem to play some sort of game, be it on a PC or console. Even if they mostly play a single game, they're still gamers.

      I think it's bigger than you realize. I suppose it might not be a majority, but it's certainly large enough to allow for multi-million dollar budgets. Like it or not, they're still large enough to warrant that manufacturers pay attention to them and design components to cater to them.
      CobraA1
  • Intel FUD

    Intel has never been able to produce a competitive discrete graphics card. And from the looks of things it never will. This latest announcement sounds like pure FUD, a desperate attempt to make Intel relevant in the minds of investors and hold off people jumping to the competition. Much like the utter failure of IA64 in the market, whose only real victory was driving competitors out (DEC Alpha, HP PA-RISC, Sun, etc).

    Don't buy Intel's garbage. I'll believe it when I see it (something about h*ll freezing over...)
    chefp
  • All these comments talk about the PAST....

    Otellini just said that Intel wants to target the use of their GPU technology for *general computing tasks*, and also that discrete cards are *OFF THE ROADMAP* for a few years in the future. But you're all still talking about games, and discrete graphics -- used for the purpose of running displays. (As if you didn't READ THE ARTICLE.)

    It's currently a complete nightmare to re-architect a massively parallel application to use Stream or CUDA, and the interface between the "real" program, running on the real CPU, and the "worker," thread-like programs running on the GPUs, is extremely messy (a minimal sketch of that split appears after this comment).

    Intel makes really great compilers, and their MeeGo SDK is likely to allow for much faster and cheaper migration projects. I think that the targets include both CPU-bound, massively parallel applications (weather forecasting; oilfield modeling; aircraft design; nuclear simulations; CGI for movies, etc.) AND the SoC market (telephones, netbooks, and home appliances).

    Both market areas are huge: the "massively parallel" supercomputer market because of the number of CPUs/GPUs included in each system, and the SoC market because, once Intel gets their "Atom" SoC fully integrated and capable of high-quality graphics, the sheer number of cars, home appliances, netbooks, and smartphones in which it could compete is enormous.

    Paul said "software", and none of you other readers paid attention.
    Rick S._z
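
For readers who haven't used the GPGPU platforms discussed above, the split the last commenter describes, a "real" program on the CPU handing work to "worker" threads on the GPU, looks roughly like the following CUDA C sketch. It is a minimal, hypothetical illustration, not code from the article or from any Intel, AMD, or Nvidia product; the kernel name scale_kernel and the trivial scaling task are invented for the example. What it does show is the explicit allocate/copy/launch/copy-back boundary that makes porting an existing application to CUDA or Stream laborious.

    // Minimal, hypothetical sketch of the CPU-host / GPU-worker split.
    #include <cuda_runtime.h>
    #include <cstdio>
    #include <cstdlib>

    // GPU-side "worker": one lightweight thread per array element.
    __global__ void scale_kernel(float *data, float factor, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] *= factor;
    }

    int main() {
        const int n = 1 << 20;                  // one million floats
        const size_t bytes = n * sizeof(float);

        // CPU-side "real" program owns the data.
        float *host = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) host[i] = 1.0f;

        // The host/device boundary: explicit allocation and copies each way.
        float *dev = NULL;
        cudaMalloc((void **)&dev, bytes);
        cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

        // Launch the GPU workers (256 threads per block), then wait for them.
        scale_kernel<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
        cudaDeviceSynchronize();

        // Copy results back to the CPU side.
        cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
        printf("host[0] = %f\n", host[0]);      // expect 2.0

        cudaFree(dev);
        free(host);
        return 0;
    }

Every array that crosses that boundary has to be allocated, copied, and freed by hand, and the kernel lives in a separate compilation path; that restructuring, multiplied across a large application, is the migration cost the commenter describes.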