
NVIDIA: Quad core can actually use less power than dual core

NVIDIA introduced the first dual core ARM chips and is now prepping quad core, while still preserving respectable battery life. Multiple cores are part of the answer to better battery life. See why.
Written by Jason Hiner, Editor in Chief

Since early 2011, NVIDIA has been on a tear in the mobile market, jumping ahead of Qualcomm and Texas Instruments. NVIDIA's Tegra 2 dual core processor was everywhere at CES 2011 in January, powering nearly all of the most highly anticipated Android devices, from the Motorola Xoom and ASUS Eee Pad to the Motorola Atrix and T-Mobile G-Slate.

But NVIDIA didn't just deliver the first dual core ARM processor; its Tegra chips also provide pretty good battery life for the smartphones and tablets that use them. I spoke with NVIDIA at CTIA Wireless 2011 about how it is pulling off strong performance and long battery life at the same time, and the company attributed it to two factors:

  1. The paradoxical fact that, for many activities, a chip with more cores can draw less power
  2. Tighter integration between CPU and GPU than its rivals offer

The NVIDIA Tegra, which is a system-on-a-chip (SoC) that contains both CPU and GPU, is smaller than a dime. Photo credit: Jason Hiner

More cores, less power drain

Last month, as TI and Qualcomm were announcing dual core chips that will arrive later this year to match the Tegra 2, NVIDIA unveiled its quad core chips under the code name Project Kal-El. I talked to NVIDIA Director of Technical Marketing Nick Stam at CTIA, and he said NVIDIA's quad core chips are on track to arrive in tablets in August and in smartphones before the end of the year.

However, when pressed about what quad core will mean for battery life -- a major concern for the average user -- Stam remarked that quad core can actually be better for battery life than dual core. And, he said the current dual core ARM chips can be better on battery life than their single core counterparts.

The reason, according to Stam, is that when an application is properly multi-threaded and its workload is spread across two cores, both cores can run at lower voltage and frequency. The same workload on a single core forces that core to its maximum voltage and frequency, and a maxed-out core runs hotter and draws considerably more power. Take a look at the two charts below for a visual on this concept.

Naturally, if you have a heavy workload that maxes out both cores of a dual core CPU -- or all four cores on a quad core -- then it will use more power and drain the battery faster than a chip with fewer cores would. But tasks like average Web browsing, email, and even basic video playback can be spread across cores so that each core is less taxed and the system draws less power.
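
Stam's point follows from how dynamic power scales in CMOS chips: it rises roughly with voltage squared times clock frequency, so two cores running at a lower voltage and half the clock can do the same work as one core pushed to its limit while drawing less total power. The Python sketch below is purely illustrative; the capacitance, voltage, and frequency figures are made-up round numbers, not NVIDIA's measurements.

```python
# Illustrative sketch only: rough dynamic-power comparison using the standard
# CMOS relation P ~ C * V^2 * f. The numbers below are made-up round values,
# not NVIDIA's figures.

def dynamic_power(capacitance, voltage, frequency):
    """Approximate dynamic power draw of one core (arbitrary units)."""
    return capacitance * voltage ** 2 * frequency

C = 1.0  # effective switched capacitance per core (arbitrary units)

# One core maxed out to handle the whole workload: the high clock demands a high voltage.
single_core = dynamic_power(C, voltage=1.2, frequency=1.0)

# The same workload split across two cores, each at half the clock and the
# lower voltage that the reduced clock allows.
dual_core = 2 * dynamic_power(C, voltage=0.9, frequency=0.5)

print(f"one maxed-out core: {single_core:.2f}")  # ~1.44
print(f"two slower cores:   {dual_core:.2f}")    # ~0.81
```

Because voltage enters as a square, even a modest voltage reduction on each core outweighs the cost of keeping a second core active for the same total workload.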

Aside from my meeting with NVIDIA, I also spoke separately at CTIA with LG, which uses chips from TI, Qualcomm, and NVIDIA in its various smartphones. LG pointed out that its devices running the Tegra 2 are seeing the best battery life of the bunch.

For more background on this issue (from NVIDIA's perspective), take a look at the NVIDIA white paper The Benefits of Multiple CPU Cores in Mobile Devices.

Integration of CPU and GPU

The other factor that NVIDIA cited as a source of the power efficiency of its chips is its tight integration between CPU and GPU. In general, this is a primary benefit of the system-on-a-chip design, which both Qualcomm and TI use as well. The advantage for NVIDIA is that it has been designing its own graphics chips for a couple decades so it has a lot of expertise and intellectual property that it can use to squeeze out performance and efficiency (see diagram of chip design below).

Credit: NVIDIA

Stam said the fact that NVIDIA designs its own graphics processor is an important part of the advantage, since graphics and display are among the biggest factors that affect battery life in mobile devices. NVIDIA's CPUs tell their graphics chips to power down or power off to conserve energy throughout an average day of mobile use, and over time those savings add up to battery life gains.

Conversely, Qualcomm doesn't have the same expertise in graphics chips. As a result, Qualcomm-powered devices like the HTC ThunderBolt -- which is an absolute speed demon -- suffer from poor battery life. While we can't pin all of the ThunderBolt's battery issues (or those of other HTC devices like the EVO and the Incredible) on their Qualcomm chips, the CPU and graphics are a critical factor in the equation. Also, don't look for HTC to start using NVIDIA chips any time soon. The company has a long-term partnership with Qualcomm, and Qualcomm is an investor in HTC.

A couple of other interesting facts about NVIDIA's forthcoming quad core chip: it will include a GeForce GPU with 12 cores, and it can power displays up to 2560x1600.

I will be following up with Qualcomm and Texas Instruments to get their responses to these issues and their strategies for countering NVIDIA.

This article was originally published by TechRepublic.
