Will the GPU become the new CPU?

Will the GPU ever become the metric we use to measure PCs, replacing GHz-centric and core-centric CPU specs?
Written by Adrian Kingsley-Hughes, Senior Contributing Editor

Why might this happen? Well, because software developers are increasingly looking to the GPU to take the load off the CPU. And with good reason, as the GPU absolutely excels at certain tasks, leaving even the most cutting-edge CPUs in the dust. For example, just look at what NVIDIA is doing with CUDA.
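For a sense of what that looks like in practice, here's a minimal vector-addition sketch in CUDA. The vecAdd kernel, the array sizes, and the launch dimensions are illustrative choices, not anything lifted from NVIDIA's own samples:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Each GPU thread adds a single pair of elements; thousands of
// threads run at once, which is where the GPU's advantage lives.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;               // one million elements
    const size_t bytes = n * sizeof(float);

    // Fill two input arrays on the host (CPU) side.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Copy the inputs across to the graphics card's memory.
    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vecAdd<<<blocks, threads>>>(da, db, dc, n);

    // Pull the result back to the host and spot-check it.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f (expect 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}
```

The point isn't the arithmetic, it's the shape of the program: one tiny function executed by a million threads. Any workload that decomposes like this is a candidate for being offloaded to the GPU.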

It's interesting to see the GPU gain relevance all of a sudden. Over the past couple of years we've seen the importance of high-end GPUs diminish greatly, as games (which are increasingly developed for consoles first) no longer push the GPU anywhere near as hard as they once did. Where once you could spend $500 on a graphics card and still feel tempted to buy another for a CrossFire/SLI setup, now $100 is really all that 95% of gamers need. Problem is, GPU makers don't make much of a profit off $100 graphics cards.

Using the GPU as a secondary CPU (also known as GPGPU, or general-purpose computing on graphics processing units) is seen by GPU vendors as a way of making the GPU relevant once again, and given the amount of computing power locked away in that tiny bit of silicon, it has a lot of potential. One thing it does need, however, is developer support, and that seems to be coming.

There are already a decent number of applications out there that can leverage the GPU. Mostly they are video related, but the breadth and range of GPU-accelerated applications is increasing weekly. The other day we heard that Microsoft will leverage the GPU in Internet Explorer 9, and yesterday that Mozilla is working on the same thing for Firefox (both using Direct2D).

CPUs now seem to be at a point where, for the majority of applications, adding more GHz or more cores doesn't scale all that well. Sure, there's an overall uptick in performance from extra cores or extra clock speed, but it's nowhere near linear. High-end processors are also very, very expensive. The performance-to-price ratio offered by today's GPUs makes leveraging them for tasks unrelated to graphics a no-brainer. Intel recognizes this and plans to capitalize on it with "Larrabee," a GPU built out of many simple x86 cores, a move that should make the interaction between the two kinds of brain far more efficient. It makes a lot of sense, and it paves the way for dual-GPU systems to become far more commonplace than they are today.
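One standard back-of-the-envelope way to see why core counts scale poorly is Amdahl's law: if a fraction p of a program's work can be parallelized, then n cores give a best-case speedup of

```latex
S(n) = \frac{1}{(1 - p) + \frac{p}{n}}
```

Even with 90% of a program parallelized (p = 0.9), 16 cores deliver only about a 6.4x speedup, and no number of cores gets past 10x. Data-parallel graphics-style workloads, on the other hand, have p very close to 1, which is exactly why piling on lots of cheap GPU cores pays off.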

Over the next few years I fully expect to see GPU power break free from being a gamer metric and become as important as CPU power for all sorts of devices, from desktops to ultra-portables. The GPU will become the new CPU.

Thoughts?
