Intel Iris could be a real threat to AMD and Nvidia

Summary: Intel is preparing to unleash its fourth generation of Core processors, featuring GPUs that are 50 percent faster than those found in previous-generation hardware.

Chip giant Intel is getting ready to put pressure on the dedicated graphics card business with its new Iris brand of graphics processors, which are built into its next-generation Haswell chips.

Haswell is Intel's codename for its fourth generation of Core processors, slated to be unveiled in June. It is key to Intel's future plans because, unlike current-generation Ivy Bridge processors, it has been designed from the ground up to be very power efficient. In fact, according to Intel CEO Paul Otellini, Haswell's 22-nanometer processor will deliver "the single largest generation-to-generation battery life improvement in Intel history". As the PC industry is forced to transition from power-hungry desktop systems to notebooks and tablets, Intel is keen to squeeze as much runtime out of battery packs as possible, and it hopes that Haswell will be at the core of these devices.

However, it's not just extra battery life that these processors bring to the table. This hardware also packs a punch, bumping CPU performance by 10 percent and GPU performance by 50 percent, giving users both better performance and good battery life. Integrating the GPU into the processor also means that OEMs can build smaller, lighter, and thinner devices that require less cooling.

In an attempt to push integrated graphics into the mainstream, Intel has decided, wisely I think, that rebranding is needed. However, in order not to pollute the brand, Intel is reserving the Iris name for its higher-power laptop and desktop chips. Intel wants Iris to be the brand that people look out for when they want the best performance possible from integrated graphics.

(Image: Intel)

This is not going to be of much concern to hardcore gamers running expensive desktop systems, since those consumers don't care about power efficiency, but they're a small, niche audience. The target for Intel's Iris assault is buyers of budget to mid-range notebooks, ultrabooks, and desktops who want something that will offer a decent gaming experience.

(Image: Intel)

This market, unlike the hardcore gamer market, is massive, and it is currently dominated by AMD's Radeon and Nvidia's GeForce brands. Intel's HD Graphics brand, the name given to the previous generations of GPUs integrated into Core processors, failed to achieve much in the way of traction. But the rebranding, especially when combined with a focus on the high end, could make Intel graphics more desirable.

(Image: Intel)

OEMs will also appreciate having more GPU power built into processors, because it means they can drop discrete graphics chips from their builds. Systems become cheaper because there's no separate GPU to include, saving not only the cost of the component but also assembly and support. Given that consumers are increasingly eager to spend their money on tablets and smartphones, this is just what the stagnating PC industry needs.

There is one caveat to bear in mind: Intel has been promising to revolutionize integrated graphics for years, and so far it hasn't delivered on that promise.

Talkback

18 comments
  • conflating and confusing differing concepts

    "Chip giant Intel is getting ready to put pressure on the dedicated graphics card business"

    Let's see - dedicated graphics card - that's desktops. (laptops don't have a card, mobile is not quite dedicated)

    "because unlike current-generation Ivy Bridge processors, it has been designed from the ground up to be very power efficient."

    . . . which doesn't really matter for desktops, and furthermore means nothing for raw performance. Power efficiency is nice, but competing with AMD and nVidia means a large jump in performance as well.

    "However, it's not just extra battery life that these processors bring to the table. This hardware also packs a punch and bumps CPU performance by 10 percent, and GPU performance by 50 percent, giving users performance and good battery life."

    Intel needs far more than a 50% boost to compete with dedicated graphics. That's like saying that adding 50% more to a cup of water makes it the same size as a swimming pool.

    "This market, unlike the hardcore gamer market, is massive"

    The hardcore gamer market is larger than you think.

    "and one that is currently dominated by AMD's Radeon and Nvidia's GeForce brands."

    The hardcore gamer market is dominated by Radeon and GeForce. Other gamers, not so much. You can certainly get by with Intel if all you're playing is Angry Birds. But you'll need Radeon or GeForce if you plan on playing FarCry 3.

    "There is one caveat to bear in mind, and that is that Intel has been promising to revolutionize integrated graphics for years, and so far it hasn't delivered on that promise. "

    Agreed. Remember Larrabee? Lots of promise, but never appeared.

    Intel has *not* proven that they are good at graphics. Overpromising and underdelivering are the norms for Intel graphics. So yeah - this definitely falls under "I'll believe it when I see it."
    CobraA1
    • Huh?

      Of course laptops have dedicated GFX cards! (The ones worth buying, anyway.) I think the rest of your arguments were based on the fallacy that they don't have them. The energy requirement is definitely a factor. Even on desktops it's a factor if you're using multiple cards in SLI, multiple HDDs, etc. For every computer I've ever built, I've had to calculate power consumption so I knew what sort of power supply to build the thing around.
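
      The math is just a rough sum of component draw plus headroom; here's a minimal back-of-the-envelope sketch in C (all the wattage figures below are made-up placeholders, not measurements from any real build):

      /* Rough PSU sizing: sum estimated component draw, then add headroom.
         All wattages are assumed placeholders; check your own parts' specs. */
      #include <stdio.h>

      int main(void) {
          double cpu_w    = 95.0;       /* assumed CPU TDP */
          double gpu_w    = 2 * 170.0;  /* two hypothetical cards in SLI */
          double drives_w = 4 * 10.0;   /* a handful of HDDs */
          double other_w  = 60.0;       /* motherboard, RAM, fans, USB, etc. */

          double total_w  = cpu_w + gpu_w + drives_w + other_w;
          double headroom = 1.4;        /* ~40% margin for load spikes and PSU efficiency */

          printf("Estimated draw: %.0f W, suggested PSU: %.0f W or more\n",
                 total_w, total_w * headroom);
          return 0;
      }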

      I won't even get into the physics of how power affects the heat given off by all of these components, which is of course a factor in all computers.
      RedSoldat
      • Why can't the Intel HD 4000 run WoW at maximum detail?

        "Of course laptops have dedicated GFX cards!"

        It's usually on the motherboard rather than a separate card, though. A minor quibble.

        The major point being - Intel has never proven their competence when it comes to realistic 3D graphics. My father has an Intel HD 4000, I have a GeForce GTS 250.

        My dedicated card is basically considered ancient history compared to the Intel graphics, and it's still pushing higher framerates and higher detail levels. I can push World of Warcraft to the highest detail level on my system without issues, but on my father's system I can't even push the detail level halfway.

        And that's on a game eight years older than the HD 4000. How does a graphics chipset eight years newer than a game NOT run it at maximum detail? That boggles the mind.

        In the meantime, nVidia and AMD have been pushing generation after generation of graphics cards, continuing Moore's law. If the Intel HD 4000 can't hold a candle to my GTS 250, can you imagine how it compares to something more recent?

        I've never been impressed with Intel's attempts at graphics. So yeah - it falls under the category of "I'll believe it when I see it."
        CobraA1
        • Just saying

          There are plenty of people who have dedicated cards in their laptops. I'm an nVidia guy myself and have never found a game I couldn't run at acceptable GFX levels. I personally haven't used an Intel-brand dedicated card. It would be more interesting if those charts provided in the article compared their chip to competitors like AMD and nVidia instead of showing off how much they improved over themselves. Their previous models sound like they were sorry anyway, so a "75x" increase and "2x" better performance doesn't mean much.
          RedSoldat
  • Not for serious gaming but

    It is some serious progress nevertheless. Still, though, I do not see anything here that would make me want to replace my X1 Carbon (Intel HD 4000) with this new CPU. Nor any of my other old laptops with quad-core CPUs and dedicated NVidia mobile cards.

    I may be wrong but I think the gaming market is split between the “I do not care, I just want to play some simple games” and the “I do care and I am going to buy some serious dedicated graphics card”. People in the middle do not know what they are buying; they are just buying anything that is current.

    I think the one thing that is missing from the market is applications that can take advantage of all this CPU/GPU power in a meaningful way for the average user. We have lost the point of reference and each upgrade is less and less meaningful.

    And if we go back to my "favourite subject", i.e. Windows 8: it makes the desktop experience flatter, uglier, and less demanding on the graphics card. So what is the point...

    I think hardware wise we are long overdue for some serious tech innovation (e.g. 3D memory), anything that will instantly obsolete all these machines I have at home and will give me an excuse to throw them away (so my wife will let me buy something new).
    mil7
  • While it helps in the gaming.... think HPC!

    Intel is hoping for "good enough" in the gaming department for laptops. And it may well be.

    The bigger prize, I think, will be HPC, because this GPU is better at general-purpose GPU computing. Combined with Haswell's new SIMD instructions (AVX2, with fused multiply-add), the FPU also roughly doubles in throughput, with some Haswell processors churning out nearly 500 GFLOPS in single precision (32 FLOPs per cycle per core), or about half that in double precision. That's maybe half the speed of an expensive dedicated GPU, but without the problematic coding a GPU requires or its lack of access to main memory. This is pretty impressive: while it's still not as fast, the coding is much easier. Some compilers can likely detect and use the AVX instructions automatically. CUDA, DirectCompute, and other HPC APIs make it easier to code for GPUs, but not that easy. Now you can write plain x86 code and get really good floating-point performance, and if you need more parallelism, you can lean on the Iris GPU.
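
    For example, a plain loop like the sketch below is the kind of code a compiler can likely auto-vectorize to AVX2/FMA on Haswell (e.g. with gcc -O3 -march=haswell); the flags and throughput numbers in the comments are my own rough assumptions, not anything Intel publishes:

    /* Sketch of a loop a compiler can likely auto-vectorize to AVX2 + FMA.
       Rough peak math per Haswell core: 2 FMA units x 4 doubles x 2 flops
       = 16 DP flops/cycle (32 SP); a 4-core chip near 3.5 GHz lands around
       ~220 DP / ~450 SP GFLOPS in theory. Build with e.g. gcc -O3 -march=haswell. */
    #include <stddef.h>
    #include <stdio.h>

    /* y[i] = a * x[i] + y[i] -- a classic fused multiply-add pattern */
    static void daxpy(size_t n, double a, const double *x, double *y) {
        for (size_t i = 0; i < n; i++)
            y[i] = a * x[i] + y[i];
    }

    int main(void) {
        enum { N = 1 << 20 };
        static double x[N], y[N];
        for (size_t i = 0; i < N; i++) { x[i] = 1.0; y[i] = 2.0; }

        daxpy(N, 3.0, x, y);
        printf("y[0] = %f\n", y[0]);  /* expect 5.000000 */
        return 0;
    }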
    MeMyselfAndI_z
    • In that case

      AMD has a better offering with hUMA.

      http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/

      Of course I am talking about the technology, not performance of the CPU etc.
      mil7
  • .

    The reason Intel is pushing GMAs is that, like AMD, they realized miniaturization can't yield performance increases forever; there have been articles about the end of Moore's law. For a long time now, the biggest performance gains have come from smaller and smaller manufacturing processes, but there isn't much room left to go forward from 22nm. GPGPU is the only way to keep actively increasing computer performance, as you've probably all seen with the push for OpenCL, DirectCompute, and CUDA. The problem with those is that they don't offer a unified approach to using the graphics hardware, and there's also the physical distance and hardware separating the GPU from the CPU. That is what AMD is trying to take advantage of with their new APUs. A lot of people hate on them, but their price-to-performance ratio is way above Intel's, and they own a key asset in this race: ATi. By building GPUs and CPUs on the same die, they have made it possible to leverage the power of both from a unified API, making it directly programmable without additional API/HW layers. I'm no expert at programming GPGPU apps, but the research I've done has taught me that you cannot use CUDA/DC/CL universally due to their limitations, and this changes it all.
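
    To make that overhead concrete, here's a minimal sketch of OpenCL host code in C (kernel build and launch omitted, error handling minimal): today you allocate a separate device buffer and explicitly copy data across the CPU/GPU boundary in both directions, which is exactly the round trip a fused, unified-memory design is meant to remove:

    /* Sketch of the explicit copy model in OpenCL host code. The kernel
       build/launch step is omitted; the point is the round trip between
       host memory and a separate device buffer. Link with -lOpenCL. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <CL/cl.h>

    int main(void) {
        enum { N = 1024 };
        float *host = malloc(N * sizeof(float));
        for (int i = 0; i < N; i++) host[i] = (float)i;

        cl_platform_id platform;
        cl_device_id device;
        cl_int err = clGetPlatformIDs(1, &platform, NULL);
        err |= clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);

        cl_context ctx = clCreateContext(NULL, 1, &device, NULL, NULL, &err);
        cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);

        /* Separate device-side allocation... */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                    N * sizeof(float), NULL, &err);

        /* ...explicit copy host -> device... */
        clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), host,
                             0, NULL, NULL);

        /* (clBuildProgram / clSetKernelArg / clEnqueueNDRangeKernel go here) */

        /* ...and an explicit copy device -> host to get results back. */
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), host,
                            0, NULL, NULL);

        printf("round-tripped %d floats through the device buffer\n", N);

        clReleaseMemObject(buf);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        free(host);
        return 0;
    }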

    To all the disbelievers: the technology actually works pretty well, as evidenced by the recent PS4 announcement, where they basically said they will use AMD APUs. (And MS will probably use something similar in their new Xboxes.)

    Anywho, back to Intel. I personally hate GMAs, as their performance is really bad compared to anything else; all they are good for is providing a low-consumption video output for laptops and accepting framebuffers under schemes like Optimus. Worse than that, though, in any setup they are buggy: Optimus has bugs, and Intel's official drivers have bugs (some of them really annoying in practice). I do believe they will put it to use in some GPGPU applications, but unfortunately I just doubt it will be even remotely competitive.
    thebeez
    • Evidence?

      "To all the disbelievers, the technology actually works pretty good, as evidenced by the recent PS4 announcement"

      That's not evidence, considering that the PS4 isn't even out yet. I have high hopes for the platform, but I still consider it a wild card. Once it's in the hands of devs and players, we'll get more information about its true performance, but until then I don't consider it to be evidence.
      CobraA1
    • AMD on PS4s is probably more a decision about economics than performance,

      because, as we are all aware, AMD CPUs are generally cheaper (not cheap tech, just cheaper to own).

      Neither Sony nor Microsoft wants to price their newer gaming devices out of the range that most people can afford, so cheaper AMD CPUs mean more affordability for the general gamer.
      adornoe
  • Sorry but Iris is just marketing bluff, i.e. Larrabee, Itanium, etc.: massive marketing with zero punch

    The hUMA architecture from AMD is checkmate for Intel. AMD's heterogeneous uniform memory access architecture finally creates the ultimate processing chip: one that fuses the CPU and the GPU and lets them share the same address space and memory caches. People are not aware of how difficult it is to develop what AMD has now accomplished. People used to compare Intel CPUs against AMD's work-in-progress APUs. Now Intel will have to deal with the real, finalized product from AMD, and in the future with the server and workstation versions of it.

    Intel lost the high-performance GPU race when it failed to build Larrabee. Haswell chips are toys that will not compare to NVIDIA's APU-type 64-bit ARM cores or to AMD's upcoming hUMA 64-bit x86 and ARM cores with fused designs. Intel machines without an NVIDIA or AMD discrete GPU are practically browsing, word-processing, low-end gaming machines. Investors should ask their kids about GPUs; they could benefit from their experience and avoid investing mistakes.

    After seeing today's Haswell announcement from Intel, there is no doubt in my mind that Intel is going to crash economically over the next couple of years. It cannot sell overpriced CPUs anymore and is losing its PC market big time. The last straw will be the server and workstation businesses, which will start crumbling next year.

    It's the typical case of the bully that lost the baseball bat. People are tired of paying sky-high prices just because Intel bullied its own customers and its competition and forced high prices down its customers' throats. Without the money coming from overpricing schemes that nobody would pay for if they had a choice, Intel's profits will sink fast.

    If NVIDIA buys AMD, Intel disappears, since NVIDIA will not need Intel but Intel depends on NVIDIA and AMD for high-performance GPUs. NVIDIA would then have both ARM and x86 technologies.

    The only hope Intel has to avoid years of deep restructuring is to find a new working business and business model, or to buy AMD. The alliance of AMD and ARM to develop 64-bit ARM processors is a nail in the coffin for Intel, since AMD has incredible IP and an AMD ARM core will not have the bottlenecks that the x86 architecture currently has. AMD knows exactly what the x86 bottlenecks are.

    Intel is in danger of becoming irrelevant in the next couple of years. Apple, Qualcomm, NVIDIA, Texas Instruments, Samsung, AMD, etc. will all have high-performance 64-bit ARM HSA/hUMA APUs, and all of them are powerful, successful companies that have been abused by Intel's practices. AMD is the only x86 CPU maker besides Intel, and it will have both ARM and x86 cores and will profit from both.

    Even Apple will eventually stop buying x86 processors from Intel, since Apple designs its own ARM processors and is developing high-performance 64-bit ARM processors.

    Samsung will not need Intel, since it has its own ARM processors and excellent chip foundries; Sony has designed a custom APU with AMD; NVIDIA can sell its own 64-bit ARM processors and GPUs; and all the other companies are already in mobile, tablets, and laptops. Next year they will be in servers, workstations, etc.

    Intel needs to buy NVIDIA or AMD to survive without a deep crash. It never conquered the high-performance GPU, and time has run out. The money it burns trying to catch up will soon be apparent, and not being able to overprice its products is a profit killer.

    Sorry, but Iris is just a marketing bluff, i.e. Larrabee, Itanium, etc.: massive marketing with zero punch.
    aczdnet0
    • It might.

      "The hUMA architecture from AMD is a check mate to Intel."

      It might. I consider it to be new, unproven tech right now, but it might.

      We'll see what happens when it's out in the real world and the performance numbers come in.

      I wouldn't be surprised if Intel starts designing their own variant of hUMA if it proves to be a success.
      CobraA1
  • look here for full info

    http://www.thinkdigit.com/Parts-Peripherals/Intel-announces-Iris-and-Iris-Pro-graphics_14490.html
    sarai1313
  • INVENTORS - DO NOT TRUST INTEL!!!

    INVENTORS - DO NOT TRUST INTEL
    I invented a CPU cooler - 3 times better than previous best - better than water. Intel have major CPU cooling problems - "Intel's microprocessors were generating so much heat that they were melting" (iht.com) - try to talk to them - they send my communications to my competitor & will not talk to me.

    Winners of major 'Corporate Social Responsibility' awardS!!!

    Huh!!!! When did RICO get repealed?

    INVENTORS - DO NOT TRUST INTEL!!!

    BTW, I have the evidence - my competitor gave it to me.

    BBTW, I am prepared to apologize to Intel if:
    • They can show that the actions were those of a single individual in the company, acting outside corporate policy, and

    • They gain redress on my behalf.

    Inventors - help your fellow inventors - share your experiences with companies - good and bad.
    Stuart211
  • 50% faster

    That's a pretty big claim, but it would be nice. I have always used AMD processors for a few reasons: they are more reliable, fail less often, and are cheaper.
    dnationsr
    • Funny, my experience has been different.

      And so has the experience of the IT world in general. Most people I know who have AMD systems tend to replace or upgrade them every few years, and everyone I know who has had a major hardware failure involving the CPU has been using AMD. Meanwhile, I have a 10-year-old Pentium 4 system that still works as well as it did when it was new.

      The 'cheaper' aspect is a very good reason to go AMD -- especially if you're the kind of person to upgrade every few years. Their high-end processors are plenty fast for anything most people want to do, and in gaming you tend to be more limited by what your GPU can do than what your CPU can do.
      Jacob VanWagoner
  • GHz Gorilla

    There's a many GHz Gorilla in the room that Intel [purposely] overlooks: RAM. Integrated GPUs use [steal] "shared" memory, which is rarely the same amount and never the same speed as the separate VRAM used by discrete video cards.
    ChasmoeBrown
    • Hence, Iris

      Actually, Iris plus the new cache design, Crystalwell. Crystalwell puts a chunk of DRAM on the processor package that acts as an extra cache the GPU can draw on, and the CPU can fill it quickly. Iris Pro is the variant that ships with that graphics-oriented on-package DRAM.
      Jacob VanWagoner