Nvidia claims Tegra 4 GPU will outperform the iPad 4's A6X

Nvidia's new Tegra 4 processor will give the company a significant advantage over Apple and its A6X processor... but for how long?

While few people seem to care about processor speeds on the desktop, it's a hot topic when it comes to mobile processors, and Nvidia has just raised the stakes with its new Tegra 4 processor.

The Tegra 4 has at its heart 72 custom Nvidia GeForce GPU cores, giving it six times the graphics processing power of the current Tegra 3. Backing up the GPU is a new quad-core variant of ARM's Cortex-A15 CPU, a chip that Nvidia claims "delivers 2.6x faster Web browsing and breakthrough performance for apps."

According to AnandTech, the Tegra 4 will have six times as many arithmetic logic units (ALUs) as the Tegra 3. If the Tegra 4's GPU cores are assumed to run at 520 MHz -- the fastest the Tegra 3 could go -- the GPU would be capable of 74.8 GFLOPS (billion FLoating-point Operations Per Second), compared with the 71.6 GFLOPS delivered by the PowerVR SGX 554MP4 inside Apple's A6X.
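That peak figure can be reproduced with simple arithmetic. The sketch below assumes each ALU performs one multiply-add (two floating-point operations) per clock cycle, the convention usually behind peak GFLOPS figures like these; the function name and structure are illustrative, not from the article.

```python
def peak_gflops(num_alus, clock_mhz, flops_per_clock=2):
    """Peak throughput in GFLOPS: ALUs x clock (MHz) x FLOPs per clock.

    Assumes each ALU retires one multiply-add (2 FLOPs) per cycle.
    MHz x millions of ops gives MFLOPS; divide by 1000 for GFLOPS.
    """
    return num_alus * clock_mhz * flops_per_clock / 1000

# Tegra 4: 72 ALUs at an assumed 520 MHz clock
print(peak_gflops(72, 520))  # 74.88, which the article rounds to 74.8
```

Note the result depends entirely on the assumed 520 MHz clock, which is extrapolated from the Tegra 3 rather than confirmed by Nvidia.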


At CES 2013, Nvidia asserted that the Tegra 4 will be faster than the A6X both in 3D games and in GLBenchmark, but didn't provide further details.

This week, GLBenchmark results claiming to show Tegra 4 performance were leaked to the Web, but it appears that these are either fake or relate to prototype versions of the Tegra 4 running at much lower clock speeds.

While this seems to give Nvidia an advantage over Apple, the glory could be short-lived. Imagination Technologies, maker of the PowerVR GPU found in many smartphones and tablets -- including the iPhone and iPad -- has announced that its sixth-generation graphics core can deliver "20x or more" the performance of current-generation hardware while being five times more efficient.

Things are really starting to heat up in the mobile sector.

Image source: Nvidia.