Nvidia's robot simulator now includes human characters too

CES 2023: Nvidia unveils improvements to its robotics simulation toolkit -- including the ability to test how robots react to humans.
Written by Danny Palmer, Senior Writer

Nvidia has announced updates to its robotics simulation tools, which will enable organizations to build and test virtual robots in a variety of realistic environments and operating conditions -- all from safely inside a cloud environment.


Revealed at CES 2023, Nvidia's robotics simulation toolkit can simulate human behavior in industrial environments like warehouses and manufacturing facilities. The goal is to help collaborative robots (cobots) or autonomous mobile robots (AMRs) understand and identify common behaviors and potential obstacles in the real world. 

The use of industrial and commercial robots is growing rapidly, and according to Nvidia, the improvements to the robotics platform will accelerate the development and deployment of autonomous robots. Developing the underlying artificial intelligence (AI) in simulation helps ensure that robots operate successfully and safely in a variety of environments.

"Simulation is the critical technology that will allow the development of the complex software systems that will power the coming wave of smarter, more autonomous robots. In simulation, the virtual robots have a proving ground for their complex software stacks and multitude of AI models," said Gerard Andrews, senior product manager at Nvidia.


By adding simulations of human behavior and interaction within these environments -- picking up and moving items, pushing carts, walking to new locations -- it's possible to test how introducing robots would play out, without endangering people.

Both normal events, like people moving around a warehouse or interacting with equipment, and abnormal events, such as unexpected emergencies, can be simulated to help build robots that react appropriately to situations in busy environments.
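The idea of mixing routine behavior with rare edge cases can be sketched in a few lines. This is a minimal illustration only -- the event names, rates, and the trivial stop/proceed policy below are hypothetical placeholders, not Isaac Sim's actual API or Nvidia's method:

```python
import random

# Hypothetical event names for illustration; not Isaac Sim's API.
NORMAL_EVENTS = ["pick_item", "push_cart", "walk_to_station"]
ABNORMAL_EVENTS = ["emergency_alarm", "person_in_robot_lane"]

def sample_event(rng, abnormal_rate=0.05):
    """Draw one simulated human event; abnormal events are rare but critical."""
    pool = ABNORMAL_EVENTS if rng.random() < abnormal_rate else NORMAL_EVENTS
    return rng.choice(pool)

def robot_reaction(event):
    """Placeholder policy: halt for abnormal events, otherwise keep working."""
    return "stop" if event in ABNORMAL_EVENTS else "proceed"

def run_episode(steps=1000, seed=0):
    """Count how often the robot had to stop over one simulated shift."""
    rng = random.Random(seed)
    return sum(1 for _ in range(steps)
               if robot_reaction(sample_event(rng)) == "stop")
```

The point of the sketch is that a simulator can expose a robot's control policy to thousands of rare, dangerous scenarios per virtual shift -- something no one would stage with real people on a warehouse floor.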


To aid this, Isaac Sim uses Nvidia RTX technology to improve sensor support, rendering physically accurate sensor data in real time. This includes ray tracing, which produces more accurate sensor readings under varied lighting conditions and in response to reflective materials.

This allows simulated worlds to be built on physically accurate sensor models, minimizing the gap between simulation and the real environment so that robots are trained as accurately as possible.
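At its core, a simulated range sensor works by casting rays into the virtual scene and measuring where they hit. The toy 2D "lidar" below illustrates that principle only -- it is a self-contained sketch with made-up function names, not Isaac Sim's RTX sensor pipeline, which handles full 3D scenes, materials, and lighting:

```python
import math

def ray_circle_distance(ox, oy, dx, dy, cx, cy, r):
    """Distance along a ray (origin o, unit direction d) to a circle
    centered at c with radius r, or None if the ray misses it."""
    fx, fy = ox - cx, oy - cy
    b = 2 * (fx * dx + fy * dy)
    c = fx * fx + fy * fy - r * r
    disc = b * b - 4 * c          # quadratic discriminant (a == 1, d is unit)
    if disc < 0:
        return None               # ray misses the circle entirely
    t = (-b - math.sqrt(disc)) / 2
    return t if t >= 0 else None  # nearest hit must be in front of the sensor

def scan(origin, obstacles, n_rays=8, max_range=10.0):
    """One simulated sweep: nearest-hit distance per ray, clipped to max_range."""
    ox, oy = origin
    readings = []
    for i in range(n_rays):
        angle = 2 * math.pi * i / n_rays
        dx, dy = math.cos(angle), math.sin(angle)
        hits = [ray_circle_distance(ox, oy, dx, dy, cx, cy, r)
                for cx, cy, r in obstacles]
        hits = [h for h in hits if h is not None and h <= max_range]
        readings.append(min(hits) if hits else max_range)
    return readings
```

A sensor at the origin scanning a single obstacle of radius 1 centered at (5, 0) would read a distance of 4 along the +x ray and max range everywhere the obstacle is out of view. Physically accurate simulation extends this same ray-casting idea with real material and lighting models, which is what narrows the sim-to-real gap the article describes.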

The new version of Isaac Sim also provides numerous new simulation-ready 3D assets -- including warehouse parts and popular robots -- so developers and users can quickly start building.

Built on Nvidia Omniverse, the company's platform for creating and operating metaverse applications, Isaac Sim is accessible via the cloud, giving robotics teams the accessibility, agility and scalability to collaborate on testing and training virtual robots.

"With cloud access and its expansive set of photoreal and physically accurate simulation capabilities, Isaac Sim is set to establish new methodologies for the development of intelligent robots," said Andrews. 

