Photo Gallery: AMD claims its ATI R600 is primed, ready to beat NVidia's 8800

Summary: Among the various agenda items at today's AMD press conference was a report card on its plans for CPU/GPU integration -- a part of the company's long-term vision now that it has acquired graphics tech titan ATI. During the conference, company spokespeople were very frank about the lead in graphics performance that NVidia's 8800 graphics solutions have over anything that ATI has to offer.

TOPICS: Processors

Among the various agenda items at today's AMD press conference was a report card on its plans for CPU/GPU integration -- a part of the company's long-term vision now that it has acquired graphics tech titan ATI. During the conference, company spokespeople were very frank about the lead in graphics performance that NVidia's 8800 graphics solutions have over anything that ATI has to offer. That said, company officials also argue that the true benefits of the 8800 won't kick in until software titles supporting Microsoft's DirectX 10 graphics APIs (aka "DX10") hit the market. Currently there are none. But ATI claims that its answer to NVidia's 8800, currently codenamed the "R600," will be ready by the time DX10 titles that need the sort of power that it and NVidia's GeForce 8800 deliver start to ship.

Photo Gallery: I snapped a few pictures of AMD's R600 prototypes. See the gallery.

Benchmarks by those who have been fortunate enough to get their hands on an early prototype apparently show the R600 besting "the solo 8800 GTX in basically every benchmark."

Prior to the press conference, however, I was told that while we'd get to see the R600 technology in action (inside some black box systems that were on site), we would not be given a peek at the actual PCI Express prototypes themselves. But there I was, standing with the press conference about to get underway, when I looked down to my left and what did I see? As best as I could tell, they were the R600 prototypes, or some variant thereof.

Fortunately, I had my Nikon D70 dSLR out and I snapped a bunch of photos. Then, during the press conference, executive VP of AMD's Visual and Media Businesses Dave Orton (who came to AMD through the company's acquisition of ATI) said that he'd give us a peek at the boards as well as the insides of the demo boxes. But, for some reason, that phase of the press conference never happened (I think everyone simply forgot). So, I asked if they wouldn't mind if I posted the pictures that I snapped just prior to the beginning of the event. AMD agreed on the condition that I'd refer to the unit as a prototype. One of the pictures I took appears below and the rest are in a screen gallery that I just posted. Unfortunately, between the protective cover and the heat sink, there isn't a whole lot to see.



  • level505 is not credible

    The benchmarks you refer to that show the R600 beating the 8800 across the board have been fairly universally dismissed as not credible by the enthusiast community. They were disclosed by a site (level505) that was created just prior to posting the benchmarks and which possesses no other content whatsoever. They also failed to respond to very reasonable requests for info/pics that would have backed up their story.

    Not that the R600 won't beat the 8800 (but how long will it be out before the 8900?), but level505's benchies alone can't be taken as evidence of the R600's superiority.
  • At least they have something in the works

    I for one will enjoy seeing what ATI/AMD comes up with; with any luck it should help drive the price of DirectX 10 cards down to where everyone interested can afford them. If ATI can get one to market for less than the $650 the Nvidia cards came out at, I'd probably give it a try.
  • no DX10? that's so 2006!

    What, there are no software titles running under DirectX 10? Supreme Commander was released on February 16th!
    I guess this means AMD/ATI are even further behind....
    • dx10 games

      Yes, Supreme Commander shipped, but without DX10 support, which is to be released sometime in the future. I don't know of any DX10 game on the market at this time.
  • ATI/AMD chipsets

    Personally, I am waiting to see how their chipsets turn out. I know that ATI's southbridge chipsets were slow compared to nVidia's, but they were very stable, something I prefer over speed. I wonder if AMD will continue ATI's chipset support for Intel processors.
  • Ati VS Nvidia

    I love reading the stories on these cards. Everyone knows that the 8800s came out way before the R600 will hit the market.
    The interesting thing is that every time ATI releases a new card, it does exceed current graphics levels. We all remember the differences between the 7950 and the X1950. But at the level these cards run at, we all need to remember one thing: our eyes can't see the difference between 100 fps and 200 fps. We can only really see the difference if they fall below 30 fps. So it's six of one, half a dozen of the other.
    But Nvidia is following in ATI's footsteps and working on making SLI work across more than two PCI Express slots. ATI's CrossFire uses one, two, three, or four slots.
    But everyone who thinks the prices will drop is correct. Nvidia's cards will be older, they will lower prices, and then they will release the 8900. It never ends... Just as an FYI, I still run an X800 and have no problems with new games. So take all this with a grain of salt.
    • We can see above 30 FPS

      It's a common misconception that we can only see ~30 FPS since movies run around that speed, but computer games and movies aren't the same:

      Also, if you scroll towards the bottom of this page:

      You will see a link labeled 'FPS Compare.' It's a small program that runs a 3D scene split in half, with one side running at 30 FPS and the other at 60 FPS. I suggest hitting F2 to change it to the outdoor scene, which is much better at showing off the differences IMO.
      • Nope, not 30 FPS

        People can see above 30 FPS. They can't see above 60 FPS (or thereabouts). Film is shot at 24 frames per second, but each frame is shown twice (so you see film flashed at 48 frames per second). Some people still perceive flicker, but above 60 frames per second, no one can perceive flicker.
        • 30 fps vs. 24 fps, etc.

          A couple of things to note about these numbers...

          First, most cameras today are capable of 30 fps. The comment about film being done at 24 fps is correct. However, that choice is due to the cinematic quality of 24 fps over 30 fps. That "flicker," as you say, is what introduces that cinematic quality. So, film (movie-style) is done at 24 fps. TV video is often done at 30 fps.

          My understanding (in other words, this may not be 100 percent accurate, but I think the basic idea is right) is that the choice to go with 30 fps is more tied to our electrical grid than anything else. The socket in the wall alternates current at 60 Hz (60 alternations per second, or 30 complete +/- sets). TVs plugged into our wall sockets therefore work off the same electrical frequency.

          If I remember correctly, with each alternation of the current, you get half a video frame (for example, all the odd-numbered lines in interlaced video). Two alternations get you all the odd and even numbered lines (aka a complete frame). 60 alternations divided by 2 passes to complete each frame yields 30 fps (the North American NTSC standard).
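
          The two-fields-per-frame arithmetic described above can be sketched in a few lines of Python (a simplified illustration of the idea, not a broadcast spec; real NTSC actually runs at 59.94 fields per second, a detail ignored here):

```python
def interlaced_frame_rate(mains_hz):
    """One field (half a frame -- the odd or even lines) per AC alternation;
    two interlaced fields make one complete frame."""
    fields_per_second = mains_hz  # field rate tied to the mains frequency
    return fields_per_second / 2  # two fields per complete frame

# North American grid (NTSC): 60 Hz mains -> 30 fps
# European grid (PAL): 50 Hz mains -> 25 fps
print(interlaced_frame_rate(60))  # 30.0
print(interlaced_frame_rate(50))  # 25.0
```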

          Again, I'm not sure if I have this right, but the PAL alternative to NTSC, which is used in Europe, is equally tied to the alternating current standard in Europe, which is 50 Hz, the result of which yields 25 fps -- a frame rate at which flicker is perceivable to the human eye. I thought it was in Europe where they looked to double the frame rate to 100 fps to remove the flicker. Did that actually happen here in the US too?

          • Around 21-22 fps is where Persistence of Vision takes over...

            ...and the mind smooths it all out. Way back, the film industry standardized the frame rate, with that little extra bump to 24 f.p.s. You shouldn't see any flicker. Perhaps what you perceive as "flicker" is actually the use of a high-speed shutter, which, unlike the human eye, takes razor-sharp snippets. Certain fast-moving subjects traveling across the frame appear to flicker, but it's actually a stroboscopic effect. Maybe newer generation human eyeballs can perceive faster than obsolete, irrelevant dinosaurs such as myself.
            Feldwebel Wolfenstool
          • 24 Frames doesn't cut it without Motion Blur

            The 24 frames per second you see in film only seems smooth because of motion blur. In games, each frame is still visible because there is no blur between one frame and the next, to connect the two positions of the object for your eye. In games (unless they are implementing a motion blur as some newer games are) you ALWAYS get a razor sharp image for each frame.

            That said, 20 to 30 fps is perfectly fine for most any game. The target most people seem to shoot for is to get their game running at 60fps.
  • How many R600 Shipped?

    Does anybody know how many R600s shipped? What are the monthly shipments?
    Same question for Nvidia's GeForce 8800.
    • GeForce 8800 units shipped

      According to Jensen Huang, Nvidia's CEO, they are shipping about 100,000 GeForce 8800s per month. See here: