In marriage of 'CPUs and GPUs,' ATI snapped-up by AMD. Is NVidia next?

Summary: In another one of the worst-kept secrets in the technology industry, AMD has shelled out $5.4B for Canada-based video and graphics solution provider ATI Technologies.

In another one of the worst-kept secrets in the technology industry, AMD has shelled out $5.4B for Canada-based video and graphics solution provider ATI Technologies.  According to the aforelinked Reuters news story:

Talk of a tie-up between the two companies first emerged in May. Over the weekend, the rumors intensified until it was almost considered a done deal on Sunday...Many industry analysts have said it made little financial or strategic sense for AMD to buy ATI outright. But AMD, the No. 2 supplier of processors, said it will use the purchase of Canada-based ATI to expand its product mix and its market share as it battles No. 1 Intel.

This morning, in a before-the-bell press conference given by the two companies' executives, ATI president David Orton referred to the deal as a marriage of CPUs (central processing units, from AMD) and GPUs (graphics processing units, from ATI). I recorded the entire conference as a podcast. It can be downloaded, played back using the streaming player at the top of this blog, or, if you're subscribed to ZDNet's IT Matters series of podcasts, it'll be downloaded to your system or MP3 player automatically (see ZDNet's podcasts: How to tune in).

The conference was kicked off by AMD CEO Hector Ruiz, who said (excerpts):

Together, we intend to create a processing powerhouse ... [Our customers have] been asking us to do this for them.  The ATI team will play a key role in defining our future.  ...  We are confident that the companies and cultures will integrate well together.  With this transaction, we will move from being neighbors [on the motherboard] to being family. ...

During the call, AMD president Dirk Meyer and ATI president Dave Orton confirmed that the companies see a future where graphics technologies are integrated into the microprocessor silicon in much the same way that AMD already integrates memory controller technology into the same dies as its CPUs.  Some of the repeated themes heard throughout the conference were growth, innovation, choice, and something the executives routinely referred to as "customer-centric platforms," which they expect the merged company to begin delivering as early as 2007.  What are "customer-centric platforms"?  As best as can be told from the conference, the term refers to a variety of CPU/GPU-integrated solutions, each of which is optimized to serve specific applications in the enterprise and personal computing spaces, the mobile handset space, and consumer electronics.

During the Q&A part of the conference, I asked the executives whether this foray into specialized computing solutions across those segments means we'll be seeing a diminished role for the PC over the long haul, and whether this move positions the company for that reality.  They downplayed the idea that the PC's role would be diminished and instead spoke of better PCs for the applications that need them, like multimedia.  But, given the two companies' ambitions in the mobile market, where there are already nearly three times as many handsets as there are PCs, I'm not so sure.  With every passing day, the generalized PC becomes more of a relic, giving way to devices like dedicated mobile (and connected) gaming consoles as well as the Motorola Q and RIM's BlackBerries, which seem every bit as powerful as the PCs that were in the marketplace just a few years ago.

According to AMD president Dirk Meyer, as the two companies look to integrating the CPU and GPU onto the same die, that integration will not initially be necessary for all of the markets that the two companies currently serve.  Meyer said he sees opportunities to leverage the techniques used by both companies to accelerate workloads on CPUs and GPUs that service specific applications.  Referring to Intel, the executives were optimistic that this is another move that will, in their words, "break the monopoly."

How Intel will respond is, of course, a major question. For example, it could make a play for ATI competitor NVIDIA.  ATI and NVIDIA are head-to-head competitors, particularly in the hotly contested gaming market, where graphics performance is essential to a smooth, realistic experience.  Although Intel participates in other technology segments that end up on the motherboards of PCs -- networking, for example -- it has for the most part resisted any temptation to integrate technologies such as memory and graphics controllers into the same silicon as its processors.  But AMD's approaches to component design have on several occasions pressured Intel to rethink its business.

Among the many questions asked by analysts during the Q&A part of the conference were ones that called into question the future of ATI's chipset business, particularly given its widespread use with Intel's platforms.  AMD's Ruiz responded that the company is well aware of how this deal puts that business at risk, but reiterated that as long as Intel's customers demonstrate a preference for ATI chipsets, AMD would be happy to provide them.

Last week, in my podcast interview with AMD's director of commercial solutions Margaret Lewis, you can hear me asking about AMD's rumored buyout of ATI.  Said Lewis in that interview:

AMD is always looking at "Do we need to acquire this technology ourselves, or do we need to partner for that technology?" Everyplace we make these decisions, there are many thought processes that we have to put. If we acquire company X and bring that technology in house, then what does it do to the partner community out there... From an AMD standpoint, we need to ensure that the technology that we put in our processor can be exposed to those software solutions that my title loves so much (commercial software solutions). So, we always have to [ask] "Do we have to bring this technology in house as a way for us to make sure that we deliver to our customers the functionality that we want? Can we do that through partnering?" So, you'll probably look at AMD over the next couple of years and having to decide: is it in-house or is it partnering?... We'll have to make a business decision.  However, I have to admit it gives a lot of fodder to the news industry so they can speculate.

Well, the speculation is now over.  At least when it comes to ATI.  Now, perhaps it's time to speculate about NVIDIA.

Talkback

  • my vote: the goal is large-volume cost reductions

    In the consumer/business/laptop market this is going to be a yawner. Hardly anyone will care what chipset drives the monitor for home and business desktop applications. It's on the same level as the audio and ethernet jacks.

    But there is a market segment that will probably be worried about this - people who purchase high-end systems for graphics, audio, and gaming applications. Did the GPU market just drop from three suppliers to two? Will AMD continue to produce high-performance graphics cards under the ATI label? Will they poach CPU developers from the GPU side? Will they be able to stay on the cutting edge?

    I would like to see more comprehensive numbers about ATI's revenue in the consumer vs. high-end space, but my gut tells me that AMD is looking for cost reductions for on-board video - and the heck with the high-end market - and possibly even an opportunity to become a mainboard supplier like Intel.

    Hey, I don't suppose they've been sniffing around ASUS as well?
    GDF
    • I think you were right...

      ... when you wrote: "... AMD is looking for cost reductions for on-board video - and the heck with the high-end market - and possibly even an opportunity to become a mainboard supplier like Intel."

      Yes, the pc remains unchallenged at the center of computing, and AMD has to provide more to remain competitive. The usual situation for industry consolidations.

      But I wonder how many people will be satisfied with onboard video/audio?
      Anton Philidor
      • Onboard video...

        "But I wonder how many people will be satisfied with onboard video/audio?"

        If you're into laptops, there's not much choice - you're kinda stuck with whatever incorporated audio/video they throw at you. As for PC's tho, it depends on the GPU and if there's an available AGP or PCIe port on the motherboard.

        For what it's worth, I've got a 3 yr old Athlon XP box with an ATI Radeon 9500 that's built in - and it works rather well. My new Athlon64 box, however, has an Nvidia GPU (GeForce 6100) built in. Both have a GPU slot on board - just in case. Both work rather nicely for what they do. I haven't really seen a need to buy an external vid card. Yet.
        Wolfie2K3
      • Onboard Graphics and Sound?

        I've not bought an external sound card for years - I would argue that the VAST majority of PC users (even those using MediaCenter) don't want, have or need surround sound. So, AMD could either build in a surround sound chipset and be done with it, or offer a lower priced offering including simple stereo.

        I would also argue that if AMD now puts a Radeon 1800/1600 class GPU on the CPU die and then provides an insanely fast memory bus, and possibly a shared-memory buffer a la XBox360, the perf benefits will propel such AMD machines way beyond those of Intel.

        And when the Radeon 2600 class GPU evolves, I can just plug in the new card and have TWO independent GPU's.

        If they do this right, I can only see upside here.
        de-void-21165590650301806002836337787023
        • can anyone

          Can anyone say space heater? GPUs run hot. Look for Freon cooling with your new AMD box. I agree with onboard sound and video. Unless you are a gamer, they are good enough for business needs, and Intel has been producing them for a long time.
          cuberly@...
          • Yes, using AMD in business...

            ... is an important consideration. There it's unnecessary, usually, to have the best sound quality, and an open slot allows a solution when better sound is needed.

            Perhaps AMD has found that it cannot make all possible business sales without improved - and cheaper - onboard video and sound.
            Anton Philidor
  • smart?

    Let's just hope the high-end market is still favorable for computer manufacturers. I have a strong feeling that AMD stock/market share will continue to climb, but dropping high-end support would not be favorable for anyone. As a gamer myself, I have always liked Nvidia better anyways, but I can see Nvidia blowing prices up considerable amounts if they are the only ones with the technology that we need to continue playing our graphics-dependent games.
    mominky@...
    • What?

      AMD's stock has dropped since the merger was announced. I don't expect AMD to stop making the ATI cards just to adapt some of the tech to their processors. But how much can they gain? High end graphics create a lot of heat and AMD is just getting rid of their reputation of building toasters. Their biggest advantage will be to rid Intel of a backup source of chipsets. But that will be cutting off their noses to spite their faces. Intel was spending 300 to 400 million a year with ATI. Think that will continue? Good luck.
      cuberly@...
  • good idea for mobiles, might make gamers mad

    I can see where this would be of interest to those making mobile devices. One small low power (compared to two chips) package that can run the multimedia apps well would be great for getting the form factor down in size. I don't see this as being well received in the gaming market though. The only way would be if the whole chip could be swapped out and the support chipset could be flashed to use the new abilities. Maybe going to a daughtercard setup to have the CPU/GPU & the support chipset as one mini board, then there would be the bandwidth concerns to hook it to the rest of the system.

    This would give the OS writers some new tools and extra power for some tasks. Think of using the GPU part for encryption.

    I remember a new chip being designed at the end of the Amiga days that would have one high speed and one medium speed graphics chip built in so the CPU/GPU could use the memory bus at very high speeds. I think it sounded like 'capriana' or something similar. This possible merging of the CPU & GPU and giving them access to a high speed universal memory bus would have been a nice touch but Commodore execs screwed up and the company died under odd circumstances.

    This sounds like it could have very interesting developments in the future. We'll have to see what comes of it all.
    Mr_Dave
    • Gamers will have to adapt, painful as it may be....

      You've some interesting ideas and suggestions that may be more appropriately aimed at the makers themselves. Their reception, however, probably depends on each maker's long-range plans.
      yogeee
  • Monolithic is the way to go!

    Combining CPUs and GPUs et al. onto a single monolithic wafer into a single monolithic processing chip has many obvious advantages including:

    - Lower chip production and end product costs.
    - Fewer pins and therefore a smaller footprint for the "IPU" (i.e., Integrated Processing Unit).
    - Concentration of thermals so heat can be dealt with efficiently.
    - Greater performance with less power consumption.


    Such chips may really move the stick and facilitate the development of:

    - Lower cost desktops, laptops and PDAs.
    - Handheld games with realistic graphics.
    - A handheld that offers true "conversive computing" (i.e., takes accurate dictation), and eliminates the need to "thumb key" every message into the device!
    - HDTV monitors with the "media center" incorporated directly in a very consumer-friendly way.

    Ja! I think that AMD has made a move that can once again give them an interim technology edge over Intel and really exploit the multi-processor wafer concept.
    jrpesq@...
  • Once again the PC is pronounced dead

    or at least dying. The tech press is obsessed with the idea, and the AMD/ATI execs merely downplayed the suggestion but did not dismiss it outright. Tens of millions of computers, desktops and laptops, are sold every year and growth is continual, yet there are those in the industry who are ready to write off this very large segment of the business and move on to developing the NEXT BIG THING. This, so that we will be strongly urged to dump the old architecture of the PC and reinvest in their new technology, further enriching them beyond dreams of avarice.
    There will always be a very large place for personal computers, even in the form factor we have today. If companies like AMD/ATI think otherwise and abandon this market to pursue making cell-phone-sized computers that are limited in use compared to a standard PC, that will be their mistake. They aren't the only players in the market, and they wouldn't be missed.
    mustangj36@...
    • Be careful what you ask for, you might one day get it.

      Silicon, organic substrates and synthetic polymers, OH MY!
      yogeee
    • Nothing lasts forever

      Always is a very long time. In the context that the PC has only been around 25 years, what is the betting that it will still dominate in another 25 years? Famous for 15 minutes.

      Why would a cell phone sized computer be limited in use? Well, only if there was a use that it limited. Games perhaps? Buy a PlayStation. It's not too difficult to see the drift to the net as the platform, broadband thin clients and such like. Sure it will take time - some people still use typewriters - but the fact is that it's a bit surprising that none of the phone companies has built in an office suite, a VGA out, and a USB port. A phone set up as a thin client terminal would be good enough to run our business, and with a bit more bandwidth and a foldable screen I could carry my office around in my pocket.
      INGOTIAN
  • Does this make business sense?

    I wonder about the price AMD paid. $5.4 billion is a lot of money. Can AMD handle the debt and still make the needed investments in production facilities? These are more important questions than whether a combined GPU/CPU will be better. Also, what about the mainboard manufacturers that are using AMD and ATI chips, but not in that specific combination? Will these companies stay with AMD or jump to Intel? Will the non-competitive advantage AMD had over Intel in this market (AMD does not make mainboards, IIRC) be lost, and thus the market share numbers change? These are the important questions. Tech is well and good to debate, but if AMD goes under because of a bad business deal at a bad time, it's a moot point.
    hkeeter@...
    • Mobo's and SLI

      Aye, that's my question as well concerning mobo mfrs. I recently built an SLI-based gaming rig (using an MSI K8N mobo) and am now wondering about support in the future. The last I looked, ATI's Xfire doesn't use SLI, and I haven't seen an Intel-based mobo supporting SLI either. With AMD buying ATI, will NVidia get in bed with Intel and come up with a mobo for SLI, or stay with AMD's boards and chipsets, thereby leaving Intel to forge ahead on its own really w/o having a partnership with either big dog in the graphics world? It's going to be something to watch, that's for sure.
      Avatar30
  • Game Drive

    The dirty secret of the open architecture PC is the extent to which gaming has driven its adoption. Dell is now producing re-grilled Edsels with sparkly red lights for fanboys. Microsoft is tying their new iPod competitor to their only "successful" hardware offering, the XBox. Now AMD takes the red pill and purchases ATI for anything but a bargain. The gaming industry has ballooned and holds sway over a male ego that requires constant bragging rights.

    A generation of Doom fans seduced by the metaphoric and illustrated landscape are uncomfortable with the extent to which their passion is a waste of time and an addiction.

    Don't get me wrong, I wouldn't argue it should stop. It is probably keeping us from fragging for real. I would argue that it be acknowledged, tempered, and find a conspicuous place. Come out of the closet folks, cause only then can the intervention begin.

    Hardware will continue to have little consequence and chips will leapfrog one another for generations. The larger problem continues to be how a smattering of hardware differentiation can distract us from a software monoculture. Microsoft, ATI, AMD, they all know which side their bread is buttered on. Give a dude something to shoot and he shows up with bells on. We are suckers that way. I think to this extent, we've been played. Does anyone think that there is much further growth left in the spreadsheet industry? Is this what is going to drive upgrades?

    This is the nicotine in cigarettes, the caffeine in coffee. It's the sparkle of a slot machine. We should have a special place reserved for companies that go out of their way to use essential human fallibilities as a fulcrum.
    Harry Bardal
  • Well, there goes ATI

    With ATI on its own, ALL R&D was focused on better graphics systems; now it's just one more piece of AMD and will lag behind. Sigh...
    No_Ax_to_Grind
    • same feelings

      I feel the same on this subject... With ATI coming out tied to AMD chipsets, buyers of Intel might have no choice but to go for NVidia. In this respect, Intel may not even need to acquire or partner with NVidia. NVidia may become the de facto standard for Intel systems, just as ATI could be for AMD.

      I say this is very unfortunate.
      aetherjoy
    • ATI was not purchased...

      ... without the expectation of continued video card sales on Intel platforms. It would be difficult to fund the cost of ATI from gains in AMD sales alone, with Intel-related revenues disappearing.

      So there will be work on improving the graphics systems. It'll probably be slower because top staff cannot work on too many projects simultaneously, but it will continue.
      Anton Philidor