Wedding bells for Intel and Nvidia?

Summary: Forbes.com's Mary Crane: Nvidia shares soared on Wednesday following rumors of a possible takeover by Intel.


Forbes.com's Mary Crane:

Nvidia shares soared on Wednesday following rumors of a possible takeover by Intel.

More than 22 million Nvidia (Nasdaq: NVDA) shares changed hands Wednesday, more than twice the three-month average daily volume, on news reports indicating Intel (Nasdaq: INTC) might buy the graphics chipmaker.

Since July, when Nvidia rival ATI Technologies (Nasdaq: ATYT) was acquired by Advanced Micro Devices (NYSE: AMD) for $5.4 billion, investors have speculated that Nvidia might be acquired by Intel.

Neither Nvidia nor Intel would comment on the market’s speculation.

It's going to happen. Computers as we know them have been, and will continue to be, giving way to what the geeks call embedded systems or "appliances" but what consumers call TiVo (to pick just one example). The winners in that market will be the ones who can provide innovative appliance makers with the best ingredients. What are the best ingredients? The kind that minimize the amount of system-level customization and R&D that appliance makers must do. The kind that offer world-class performance. The kind that are bottom-line friendly.

For example, take another appliance example that's very relevant to this news: the Xbox (or any of its competitors). These gaming systems just scratch the surface of the sorts of specialized compute-enabled appliances that are going to become a part of our everyday lives. Sure, maybe the competition between the major gaming players justifies the R&D that goes into a solution that's custom-built from head to toe.

But there's an entire marketplace of appliance innovators to whom eking out five more nanoseconds doesn't matter; they need platforms that make it 1,000 times easier and cheaper to bring their wares to market. Many of those platforms will need some standard componentry coming out of the gate (compute power, video, etc.). But flexibility will be important as well. Rather than soldering everything together and locking innovators in, give them some room to build stuff we haven't dreamed of. AMD's Torrenza architecture clearly paves the path toward that future, and AMD's acquisition of ATI (Nvidia's archrival) makes sense not only because ecosystems like Torrenza need to be seeded with working "clients," but because, more often than not, video will be an important ingredient to appliance innovators.

EMC's Mark Lewis is absolutely right. Forget PCTVs (where TV technology is integrated into the PC). The other way around is the direction that computing is going to take (just look at Apple's forthcoming iTV), and given that, I don't think Intel has a choice at this point. Intel's Centrino and, more recently, its Viiv brand (more like PCTV than TVPC) are perfect examples of how the company is already trying to do the lion's share of the technical heavy lifting (pre-bundled technology) so its OEMs don't have to. One critical key to Centrino's success has been Intel's ownership of the networking piece (Centrino uses Intel's Wi-Fi radios). OEMs (mostly notebook manufacturers) know it will work out of the box, so they can focus on things that will differentiate whatever it is they bring to market. So Intel already has that state of mind. Not to mention some additional revenue, since it gets to profit from the sales of its Wi-Fi radios too.

Sure, Intel could depend on the Apples of the world to figure out how to do the integration themselves. But that simply isn't a scalable business model, and if anybody understands scalable business models, it's Intel. And I highly doubt that Intel is going to sit around and watch AMD erect an ecosystem around Torrenza that gets all those appliance innovators (the ones without the resources of an Apple) 95 percent of the way home.

On the day that AMD announced its acquisition of ATI, my first question/comment was: Is Nvidia next? It's right there in the headline. I've never had my doubts.

For you investors out there who missed yesterday's surge, the question now becomes: who is next? When appliance OEMs are buying pre-bundled platforms, what beyond compute power (microprocessor/memory), video (ATI/Nvidia), and networking (Intel already has it) will they expect to be included in the bundle? Somewhere, on some confidential PowerPoint presentations at AMD and Intel, are three bullet lists: the definites, the maybes, and the no-gos. At least in AMD's case, that's where the genius of Torrenza's coprocessor-style open-socket architecture kicks in. Just because some bundle candidate may be on AMD's no-go list (today) doesn't mean another OEM can't come along and redo the bundle (perhaps to satisfy some niche appliance market that's too small for AMD to go after).

Talkback

13 comments
  • Or...

    ... onboard video (and audio) were of insufficient quality, and purchasing technology can be cheaper than developing it.

    If the consumer can obtain acceptable video without purchasing a card, that's a sales point for motherboards and chips.

    I think Occam's razor applies here. If a simple competitive explanation suffices, why speculate about a major change in an established market?
    Anton Philidor
    • Hey, I was just saying we're missing Anton

      Nice to see you back Anton.

      I do want to note that the latest on-board video chipset from Intel is good enough to run Aero Glass. It's probably not great for gaming, but it's good enough for most non-gamers.
      georgeou
      • Thank you.

        I've been in the vicinity, but don't usually post on security or license issues or running software virtually.
        (How often can I say: security problems are universal, license provisions are push-pull, and virtual copies are not necessary except when they are?)


        On your comment here: though games are a specialized concern, the functionality required for them is being used more widely. Seen some of the recent screensavers?

        Microsoft probably thought of Aero as middle-of-the-road if not minimal compared to current and future video demands.

        So the quality of on-board video can probably be expected to become a sales point.
        Anton Philidor
  • Intel Copycat?

    Is it just me, or is Intel just being a copycat? AMD buys ATI, and now Intel, trying not to be outdone, wants to buy Nvidia. It seems Intel has been doing a lot of copying off AMD lately. AMD releases x86 64-bit technology (at first Intel said that was not necessary and was pushing the Itanium), and then it becomes popular and works efficiently, so they slap their own EM64T into the Pentium 4. And now AMD buys a big graphics company because they feel it will help them make better chips, and Intel wants to do the same. It's kind of like keeping up with the Joneses with Intel.
    bobiroc
    • Actually, it is the other way around!

      AMD copied Intel. Intel bought a video company years ago, which brought about their integrated video. Everything that AMD has done on x86 uses extensions of the x86 architecture that Intel developed. Don't look now, but your AMD bias is showing!
      ShadeTree
      • Last few years

        I was just referencing the last few years.

        AMD = 1st to bring x86 64-bit to the mass public
        AMD = 1st to bring dual-core technology to the mass public
        AMD = 1st to buy a high-powered graphics company

        Yeah, AMD uses the x86 that Intel developed (actually I think it was co-developed by both companies, and AMD has the license to use it), and Intel bought a graphics card company that they have done very little with besides bring basic video (which is what most people need, though, but still).

        I guess I just found it funny that AMD has been a great innovator over the past few years while Intel follows suit and gets much of the praise and credit. You can call that bias if you wish, but I call it seeing the truth.
        bobiroc
        • We'll see what happens.

          Intel = first to bring quad core to the mass public (end of this year?).

          You can't say that's not innovating, and you can't say that only AMD is doing the innovating. Sure, they have done some great stuff recently, where Intel has slipped somewhat.

          Both AMD and Intel are having their turns at showing us what they can do, and it's interesting to watch. Out of it, we get better, cheaper products.

          AMD having bought ATI will no doubt benefit us as consumers, with cheaper integrated devices. If Intel buys Nvidia, it can only be better for us again, as price wars rage over integrated devices.

          x86 was not co-developed.
          Intel, AMD, Motorola and others competed for the technology that was to be used in the IBM PC.
          When IBM signed on Intel as the manufacturer of the chips for the IBM PC, they required that Intel license a second manufacturer. Intel chose AMD as their second manufacturer. The story gets complicated from there.
          Azriphale
          • Interesting indeed

            [i]"You can't say thats not innovating, and you can't say that only AMD is doing the innovating. ..."[/i]

            I didn't say AMD was the ONLY one doing the innovating. Yes, Intel may be first with the quad core, and only time will tell if their FSB bottlenecks inhibit the processor.

            [i]"AMD having bought ATI will no doubt benefit us as consumers, with cheaper integrated devices. If Intel buys Nvidia, it can only mean better for us again, as price wars wage over integrated devices."[/i]

            Intel buying Nvidia kind of scares me, because I am thinking Intel will pull Nvidia chipsets from AMD-based motherboards, which would suck.

            [i]"x86 was not co-developed.
            Intel, AMD, Motorola and others competed for the technology that was to be used in the IBM PC.
            When IBM signed on Intel as the manufacturer of the chips for the IBM PC, they required that Intel licence a second manufacturer. Intel chose AMD as their second manufacturer. The story gets complicated from there."[/i]

            Thanks for the correction; you are right. It is very complicated indeed. But AMD did not "copy" Intel on x86; they were licensed to manufacture chips identical to Intel's. That is why the Intel 286, 386, and 486 chips were identical to AMD's. It wasn't until Intel put a trademark on its processor with the Pentium that AMD had to change things up a bit. But using x86 is not because they were copying Intel; it is because they had to.

            But I stick to my previous statements: Intel spent so many years selling a product that was sub-par in my eyes and making consumers believe it was better just because it had a higher clock speed and the Intel logo on it. Believe me, I have seen that firsthand when customers came in and chose the Intel just because the MHz was higher. I think that was done on purpose, despite what many people say. Now Intel is finally back in the game, and I am glad, because that will force AMD to re-think their strategies and mix it up a bit. Because, as you said, we get better and cheaper products out of competition.
            bobiroc
          • Nvidia nForce

            I agree that Nvidia pulling the nForce for AMD would be a really bad thing (despite the fact that I use Intel, which you could probably tell :). On the bright side, if being bought by Intel meant that they would have to pull their AMD products, I doubt Nvidia would sign the deal. They have more sense than that. And I hope Intel does too.

            And as you say, the NetBurst (Pentium 4) architecture sucks. I'm finding that as we speak: my laptop (Pentium M, 1.86GHz) running on battery compiles stuff faster than my 2.8GHz P4. :) Maybe when AMD comes out with their quad core, I'll upgrade to the Intel or AMD chip that gets my interest at the time. Can't wait for that.
            Azriphale
  • How about these two...?

    Perhaps Ageia, with their PhysX processor technology, looks like a possibility.

    And how long can Creative Labs, on the other hand, continue to sell add-in sound cards with integrated sound getting so much better?
    tonyman262
    • Creative Integrated Sound

      [i]"How long can Creative Labs on the other hand continue to sell add in sounds cards with integrated sound getting so much better."[/i]

      Creative now has high-quality sound chipsets used in desktops and notebooks. I have seen them in Dell computers, and I am sure they are being used elsewhere. Of course, the add-in cards are still useful for enthusiasts who like to do high-quality audio editing and capture.
      bobiroc
    • Integrated sound better, but is a LONG way from EAX.

      Yes, they have 5.1 surround, and they do it mostly all through software. The Creative chips do it through hardware, so there's less load on the CPU. Also, 99% of games support EAX, not 5.1 surround. Don't underestimate the realism that EAX adds to a game. There's far more to it than just positional audio. Occlusion is huge: it has the effect of making it sound like someone coming up behind you moves behind a large stack of boxes in the middle of a floor, and the sound sort of fades out as they move past it. Simple 5.1 (or 7.1) surround just does not do that. Real 3D sound is more than just position. Sadly, Creative cards or their drivers seem to be somewhat flaky, especially the X-Fi series. (Just look on their own forums for complaints in this area.)
      A. Noid
  • Intel nVidia in the works?

    When you consider that Intel had made a large investment in the ATI chipset, to be integrated and validated in a line of low-cost motherboards, the necessity of dropping the line with the ATI chipset became real after AMD purchased ATI.

    So nVidia would be a good way to go to obtain a company that not only produces competitive GPUs but also has the hottest chipsets for integration.

    If I were Intel, I would seriously consider it. To use ATI in their motherboards would be counterproductive, and nVidia is a perfect choice.
    magpie_z