LAS VEGAS -- Graphics chip giant Nvidia on Sunday unveiled the next generation of its autonomous driving stack powered by Xavier, the company's $2 billion R&D bet on automotive AI.
Touted by Nvidia as the "world's most powerful SoC (system on a chip)," Xavier was built to process Level 5 autonomous driving data from a vehicle's radar, cameras, lidar and ultrasonic systems with more energy efficiency and a smaller form factor than anything else on the market.
Xavier has more than 9 billion transistors with a custom 8-core CPU, a 512-core Volta GPU, an 8K HDR video processor, a deep-learning accelerator, new computer-vision accelerators, and the ability to perform 30 trillion operations per second on 30 watts of power.
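Those headline figures imply an efficiency of one trillion operations per second per watt. A quick sanity check on the quoted numbers (an illustrative calculation only, not an Nvidia benchmark):

```python
# Illustrative efficiency math using the figures Nvidia quoted for
# Xavier: 30 trillion operations per second (TOPS) at 30 watts.
# The helper function is just for demonstration.

def tops_per_watt(tops: float, watts: float) -> float:
    """Return throughput per watt in TOPS/W."""
    return tops / watts

xavier_efficiency = tops_per_watt(30, 30)
print(f"Xavier: {xavier_efficiency:.1f} TOPS per watt")  # 1.0 TOPS per watt
```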
Xavier was first teased a year ago, but Nvidia says it now forms the basis of two software platforms -- Drive IX and the just-announced Drive AR -- both of which are part of Nvidia's Pegasus AI computing platform. Nvidia chief executive Jensen Huang announced the new platforms during his Sunday night keynote address at CES.
Drive IX is the artificial intelligence layer that Nvidia says will enable automakers to incorporate automotive data into the user experience and operational controls of the vehicle. The system analyzes sensor data from inside and outside of the car to provide both drivers and passengers with AI assistant technology that recognizes their faces, voices and gestures.
For Nvidia, a leadership position in autonomous driving means more use cases for its GPUs, which already power high-performance computing and cloud workloads.
Nvidia is pushing Drive IX as the ultimate SDK for vehicle safety and convenience. For instance, with facial recognition and gaze tracking, the system can alert a sleepy driver before they nod off behind the wheel, or recognize their face when they approach the vehicle with a cart full of groceries and automatically open the lift gate.
Drive AR, meanwhile, is geared toward infotainment applications and will essentially fuse computer vision, computer graphics and AI to power augmented reality interfaces inside a vehicle. Nvidia also introduced AutoSim, a simulated, virtual driving environment that lets users configure the cameras on a vehicle before production.
"Augmented reality is going to define user interfaces," Huang said.
All of this is tied to the Drive Pegasus AI computing platform, which Nvidia says "delivers the performance of a trunk full of PCs in an auto-grade form factor the size of a license plate" with 320 trillion operations per second of processing power. For Nvidia customers, the Pegasus motherboard is pitched as the path to production for fully autonomous vehicles.
"We developed the entire stack of autonomous vehicle software," Huang said. "Today we have over 300 customers developing on Nvidia Drive."
In more customer news, Nvidia revealed that German automaker Volkswagen plans to build its next generation of intelligent VW vehicles on the Drive IX platform. Volkswagen CEO Herbert Diess joined Huang on stage to highlight the new VW I.D. Buzz, a prototype electric vehicle that Volkswagen hopes will cement its place as a market leader for AI computing in cars.
Uber is also stepping out as an Nvidia customer, with news that the ride-share company has been using Nvidia GPUs for its AI computing system in its fleet of self-driving cars and trucks.