Since early 2011, NVIDIA has been on a tear in the mobile market, jumping ahead of Qualcomm and Texas Instruments. NVIDIA's Tegra 2 dual core processor was everywhere at CES 2011 in January, powering nearly all of the most highly-anticipated Android devices from the Motorola Xoom to the ASUS Eee Pad to the Motorola Atrix to the T-Mobile G-Slate.
NVIDIA not only gave us the first dual core ARM processor; its Tegra chips also deliver solid battery life for the smartphones and tablets that use them. I spoke with NVIDIA at CTIA Wireless 2011 about how it is pulling off strong performance and long battery life at the same time, and the company attributed it to two factors: smart multi-core power management and its in-house graphics expertise.
The NVIDIA Tegra, which is a system-on-a-chip (SoC) that contains both CPU and GPU, is smaller than a dime. Photo credit: Jason Hiner
However, when pressed about what quad core will mean for battery life -- a major concern for the average user -- Stam remarked that quad core can actually be better for battery life than dual core. He added that the current dual core ARM chips can likewise beat their single core counterparts on battery life.
The reason, according to Stam, is that when an application is properly multi-threaded and its workload is spread across two cores, both cores can run at lower voltage and frequency. The same workload on a single CPU would max out that core's voltage and frequency; a maxed-out core runs hotter, and that's when it draws a lot more power. Take a look at the two charts below for a visual on this concept (you can click the images for a larger view).
Naturally, if you have a heavy workload that maxes out both cores of a dual core CPU -- or all four cores of a quad core -- that will use more power and drain the battery faster than a chip with fewer cores. But for activities like everyday Web browsing, email, and even basic video playback, the work can be spread across cores so that it taxes the system less and draws less power.
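The argument above rests on how dynamic CPU power scales. A common first-order model (not NVIDIA's actual power data; the operating points below are assumed numbers for illustration) is that switching power goes roughly as capacitance times voltage squared times frequency, so a voltage reduction pays off quadratically:

```python
# Illustrative sketch of the dual-core battery-life argument.
# Assumption: dynamic power ~ C * V^2 * f (a standard first-order CMOS model);
# the voltage/frequency operating points are hypothetical, not Tegra specs.
def dynamic_power(capacitance, voltage, freq_ghz):
    """Approximate dynamic switching power of one core (arbitrary units)."""
    return capacitance * voltage ** 2 * freq_ghz

# One core maxed out at full voltage and frequency.
single_core = dynamic_power(capacitance=1.0, voltage=1.2, freq_ghz=1.0)

# Two cores splitting the same workload, each at half the frequency
# and a correspondingly lower voltage step.
dual_core = 2 * dynamic_power(capacitance=1.0, voltage=0.9, freq_ghz=0.5)

print(f"one core maxed:   {single_core:.2f}")
print(f"two cores shared: {dual_core:.2f}")
# Because power falls with the square of voltage, the two slower cores
# finish the same work while drawing less total power.
```

Running the numbers, the single maxed-out core draws roughly 1.8 times the total power of the two lightly loaded cores, which is the effect the charts illustrate.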
Aside from my meeting with NVIDIA, at CTIA I spoke separately with LG, which uses chips from TI, Qualcomm, and NVIDIA in its various smartphones. LG pointed out that its devices running the Tegra 2 are seeing the best battery life of the bunch.
For more background on this issue (from NVIDIA's perspective), take a look at the NVIDIA white paper The Benefits of Multiple CPU Cores in Mobile Devices.
Credit: NVIDIA
Stam said that the fact that NVIDIA designs its own graphics processor is an important part of the advantage, since graphics and display are among the biggest drains on battery life in mobile devices. NVIDIA's CPUs intelligently tell the graphics chip to power down or power off during idle moments in an average day of mobile use, and over time those savings add up to real battery life gains.
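The power-down behavior Stam describes is a duty-cycle effect: the GPU is only busy in bursts, so what the idle state costs dominates the daily average. Here is a minimal sketch of that arithmetic, with made-up milliwatt figures and a hypothetical 10% busy fraction (none of these numbers come from NVIDIA):

```python
# Rough sketch of why gating an idle GPU adds up over a day of use.
# Assumptions: hypothetical power figures and duty cycle, not measurements.
def average_power(active_mw, idle_mw, active_fraction):
    """Time-weighted average draw when the GPU idles between bursts of work."""
    return active_mw * active_fraction + idle_mw * (1 - active_fraction)

# Suppose the GPU is busy 10% of the time at 500 mW.
always_on = average_power(active_mw=500, idle_mw=100, active_fraction=0.10)
power_gated = average_power(active_mw=500, idle_mw=5, active_fraction=0.10)

print(f"idle GPU left on:  {always_on:.1f} mW average")
print(f"idle GPU gated:    {power_gated:.1f} mW average")
# Cutting idle draw from 100 mW to 5 mW more than halves the average,
# even though peak (active) power is unchanged.
```

The point is that shaving idle power does far more for all-day battery life than shaving peak power, since the chip spends most of its time idle.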
By contrast, Qualcomm doesn't have the same depth of expertise in graphics chips. As a result, Qualcomm-powered devices like the HTC ThunderBolt -- an absolute speed demon -- suffer from poor battery life. While we can't pin all of the ThunderBolt's battery issues (or those of other HTC devices like the EVO and the Incredible) on their Qualcomm chips, the CPU and graphics are a critical factor in the equation. Don't look for HTC to start using NVIDIA chips any time soon, though; the company has a long-term partnership with Qualcomm, and Qualcomm is an investor in HTC.
A couple of other interesting facts about NVIDIA's forthcoming quad core chip: it will include a 12-core GeForce GPU, and it can power displays up to 2560x1600.
I will be following up with Qualcomm and Texas Instruments to get their responses to these issues and their strategies for countering NVIDIA.
This article was originally published by TechRepublic.