
NVIDIA Fermi GF100 GPUs - Too little, too late, too hot, and too expensive

Hardware enthusiasts have been eagerly awaiting NVIDIA's latest Fermi GF100 GPUs. But early benchmarks suggest that NVIDIA's newest architecture is too little, too late, too hot, and too expensive.
Written by Adrian Kingsley-Hughes, Senior Contributing Editor

NVIDIA officially unveiled the Fermi line back in September of last year, and this got hardcore hardware fanatics excited. Why? Because it's the first major architecture shift we've seen from NVIDIA since the G92 core, which debuted in the GeForce 8800 GT back in 2007. That G92 core became the cornerstone of the GeForce 9000 series, and later lived on in rebranded GTX 200 and GTX 300 series cards.

So, why is the Fermi GF100 GPU a disappointment? Well, I haven't had time to conduct a full benchmark of this latest GPU, but my early findings match those of the guys at PC Pro: yes, Fermi is the fastest GPU that you probably won't be able to buy for a few weeks, but its lead over ATI's HD 5870 and HD 5850, cards that are cooler, quieter, and significantly cheaper, is a slim one:

We ran benchmarks in a variety of current titles and, on the whole, the Fermi cards narrowly outperformed their ATI equivalents. In Crysis at 1,920 x 1,200 and Very High settings, the GTX 480 averaged 40fps to the HD 5870's 38fps; the GTX 470 scored 33fps to the HD 5850's 32fps. Higher settings saw similar margins. World in Conflict had the two Nvidia cards consistently ahead by just under 20%, and in Stalker: Call of Pripyat that margin was around 5%. Other games had ATI's cards ahead by a whisker, and if we average all the results, Nvidia's edge looks to be between 5% and 10%.
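For the curious, here's the back-of-the-envelope arithmetic behind those Crysis margins, a minimal Python sketch using only the fps figures quoted above (the labels are mine):

# Percentage lead of each Fermi card over its ATI rival, from PC Pro's Crysis numbers
benchmarks = {
    "GTX 480 vs HD 5870": (40, 38),
    "GTX 470 vs HD 5850": (33, 32),
}

for matchup, (nvidia_fps, ati_fps) in benchmarks.items():
    lead = (nvidia_fps - ati_fps) / ati_fps * 100
    print(f"{matchup}: NVIDIA ahead by {lead:.1f}%")

# Prints:
# GTX 480 vs HD 5870: NVIDIA ahead by 5.3%
# GTX 470 vs HD 5850: NVIDIA ahead by 3.1%

In other words, even in the games where Fermi wins, the single-digit lead is consistent with PC Pro's overall 5% to 10% estimate.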

And that's not all. First, these Fermi cards suck at the teat of your PSU ferociously, with a GTX 480-based test rig drawing upwards of 400W when stressed, compared to around 270W for ATI's fastest single-GPU card. All this power causes a secondary problem: heat. There are reports that the GF100 GPU can hit 98°C/208°F. I have serious concerns about how long a GPU pushed to this sort of level can last. And while the GPU is working this hard, you have to put up with the annoying racket of the fan going flat out.
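If you want to sanity-check those power and heat figures, here's the same sort of quick arithmetic (a Python sketch; the wattages are the approximate whole-system numbers mentioned above):

# Rough comparison of the load figures quoted above (whole-system, approximate)
gtx480_system_w = 400   # GTX 480 test rig under stress
hd5870_system_w = 270   # same class of rig with ATI's fastest single-GPU card
gf100_temp_c = 98       # reported GF100 peak temperature

extra_w = gtx480_system_w - hd5870_system_w
temp_f = gf100_temp_c * 9 / 5 + 32

print(f"Extra draw under load: {extra_w}W ({extra_w / hd5870_system_w:.0%} more)")
print(f"GF100 peak temperature: {gf100_temp_c}°C = {temp_f:.0f}°F")

# Prints:
# Extra draw under load: 130W (48% more)
# GF100 peak temperature: 98°C = 208°F

That's roughly 130W of extra draw under load for a 5% to 10% performance advantage.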

While being the proud owner of the "world's fastest GPU" is always a short-lived thrill, I feel that it'll be a shorter-than-usual thrill for Fermi owners. That 5% to 10% edge the GPU has over ATI isn't going to last long.

My advice ... keep hold of your money and wait and see.
