CES 2022: AMD, Intel, and Nvidia make CPUs and GPUs buddy up

Smarter collaboration between a PC's two main computational engines is boosting performance and power efficiency.
Written by Ross Rubin, Contributor

Late last year, I wrote about Apple's first M1 series-powered MacBook Pros and how the company spared no opportunity to bring out the big benchmark guns against its previous efforts as well as rivals. At CES, the empires (at least those that rule PC chips) struck back, with AMD, Intel, and Nvidia all announcing new versions of flagships that address the need to deliver more performance more efficiently. Among the techniques on display, all three boosted performance and efficiency by tapping into the versatility of the Windows ecosystem to find new ways for CPUs and GPUs to work together. The moves are in part a counter against Apple, which has achieved impressive performance with its integrated GPU but has not developed its own discrete GPU (yet).

With AMD having the longest history offering both CPUs and discrete GPUs, it's been no surprise to see the company embrace more intelligent power shifts between the two. The company upped its SmartShift technology for routing computational load between CPU and GPU to SmartShift Max. The company claimed the tech can now accelerate a broader array of games and workflows, citing examples of some games going from no acceleration to eking out as much as a 13% improvement.

Discrete GPU newcomer Intel showed off a wider range of integration benefits between its CPUs and Arc discrete graphics processors under the banner of Deep Link. While the company briefly discussed a range of power-routing tricks that can improve gaming performance, it also showed how its discrete GPUs can work in conjunction with its Core architecture's integrated GPU. Using this approach, for example, the video editing app DaVinci Resolve could farm out the encoding of alternating frames of a video between the two graphics processors, which Intel claims could produce a 40 percent improvement in rendering times.

Finally, Nvidia's absence from the PC CPU market hasn't stopped it from developing its own approach to a partnership between its discrete GPUs and leading CPUs. As part of the fourth generation of its AI-fueled Max-Q approach to system design, the company detailed CPU Optimizer, a low-level framework that enables the company's GPUs to regulate the performance, temperature, and power of next-gen CPUs. This results in more efficient CPU performance, which allows more power to be transferred to the GPU. The company also discussed the second generation of its Battery Boost technology, which considers factors such as CPU and GPU power usage, battery discharge, image quality, and frame rates in real time to produce up to a 70% improvement in battery life when playing games.

The three major PC silicon providers presented their approaches to tighter CPU-GPU collaboration, along with advances in AI-powered upsampling such as Nvidia's Deep Learning Super Sampling, in the context of improving game performance. Such improvements go hand-in-hand with improvements for many creative software apps. Over time, the boosts could filter down to more mainstream applications, particularly as AR and VR applications evolve.
