Nvidia has announced two new simulation engines on Omniverse Cloud, the virtual factory simulation engine and the autonomous vehicle (AV) simulation engine, to help unify digitalisation across the automotive design, engineering and manufacturing industry.
Omniverse Cloud, a platform-as-a-service for developing and deploying applications for industrial digitalisation, is hosted on Microsoft Azure. Nvidia hopes the platform will allow enterprises to achieve faster production and more efficient operations, improving time to market and supporting sustainability initiatives.
The virtual factory simulation engine is a collection of customisable developer applications and services that enable factory planning teams to connect large-scale industrial datasets while collaborating on, navigating and reviewing them in real time.
Design teams using 3D data can assemble virtual factories and share their work with planners who can view, annotate and update the full-fidelity factory dataset from lightweight devices.
By simulating virtual factories on Omniverse Cloud, Nvidia says automakers can increase throughput and production quality while saving years of effort and millions of dollars in costs that would otherwise result from making changes once construction is underway.
Teams can create interoperability between existing software applications such as Autodesk Factory Planning, which supports the entire lifecycle for building, mechanical, electrical and plumbing, and factory lines, as well as Siemens’ NX, Process Simulate and Teamcenter Visualization software and the JT file format. They can share knowledge and data in real time in live, virtual factory reviews across 2D devices or in extended reality.
The AV simulation engine is a service that can deliver physically based sensor simulation, enabling AV and robotics developers to run autonomous systems in a closed-loop virtual environment.
Nvidia expects the next generation of AV architectures to be built on large, unified AI models that combine layers of the vehicle stack, including perception, planning and control. Such architectures call for an integrated approach to development.
With previous architectures, developers could train and test these layers independently, as they were governed by different models. For example, simulation could be used to develop a vehicle’s planning and control system, which only needs basic information about objects in a scene — such as the speed and distance of surrounding vehicles — while perception networks could be trained and tested on recorded sensor data.
However, using simulation to develop an advanced unified AV architecture requires sensor data as the input. For a simulator to be effective, it must be able to simulate vehicle sensors, such as cameras, radars and lidars, with high fidelity.
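The closed-loop requirement described above can be sketched in miniature. The sketch below is illustrative only and uses no Nvidia APIs: all names are invented, a single scalar pair stands in for rendered camera/radar/lidar data, and a clamped proportional-derivative law stands in for a unified perception-planning-control model. The point is the loop structure: the simulator produces sensor input, the unified model maps it directly to a control command, and the command feeds back into the simulator's next step.

```python
from dataclasses import dataclass

@dataclass
class SensorFrame:
    """Stand-in for one tick of rendered sensor data (a real pipeline
    would produce physically based camera, radar and lidar outputs)."""
    gap_m: float          # distance to the lead vehicle
    closing_mps: float    # relative speed (lead minus ego)

@dataclass
class Control:
    accel_mps2: float

class Simulator:
    """Toy world: an ego car following a lead vehicle at constant speed."""
    def __init__(self) -> None:
        self.ego_pos, self.ego_speed = 0.0, 20.0
        self.lead_pos, self.lead_speed = 50.0, 18.0

    def sense(self) -> SensorFrame:
        return SensorFrame(self.lead_pos - self.ego_pos,
                           self.lead_speed - self.ego_speed)

    def step(self, control: Control, dt: float = 0.1) -> None:
        self.ego_speed = max(0.0, self.ego_speed + control.accel_mps2 * dt)
        self.ego_pos += self.ego_speed * dt
        self.lead_pos += self.lead_speed * dt

class UnifiedModel:
    """Stand-in for a unified model mapping sensor input straight to a
    control command, with no separate perception/planning hand-off."""
    TARGET_GAP_M = 30.0

    def act(self, frame: SensorFrame) -> Control:
        # Proportional-derivative law on the following gap, clamped to
        # plausible acceleration limits.
        error = frame.gap_m - self.TARGET_GAP_M
        accel = 0.2 * error + 0.8 * frame.closing_mps
        return Control(max(-3.0, min(2.0, accel)))

def run_closed_loop(steps: int = 500) -> float:
    """Run the loop: sense -> act -> step, then return the final gap."""
    sim, model = Simulator(), UnifiedModel()
    for _ in range(steps):
        sim.step(model.act(sim.sense()))
    return sim.lead_pos - sim.ego_pos
```

In a real unified architecture the `SensorFrame` would be high-fidelity synthetic sensor data, which is exactly why the simulator's sensor models must be accurate: errors in the rendered input propagate through the whole loop.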
To address the challenge, Nvidia is bringing state-of-the-art sensor simulation pipelines used in DRIVE Sim and Isaac Sim to Omniverse Cloud on Microsoft Azure.
High-fidelity, physically based simulation for cameras, radars, lidars and other sensor types can be connected to existing simulation applications, whether developed in-house or provided by a third party, via Omniverse Cloud application programming interfaces (APIs), allowing integration into AV and robotics development workflows.
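The integration pattern this implies can be sketched as an adapter: an existing simulator delegates sensor rendering to an external service behind a narrow interface, so the service can be swapped without touching the simulation loop. Everything below is hypothetical, including `SensorService` and `FakeCloudSensorService`; these are not real Omniverse Cloud API calls, just a local illustration of the shape of such an integration.

```python
from typing import Protocol

class SensorService(Protocol):
    """Hypothetical interface an external sensor simulator would expose."""
    def render_lidar(self, scene_state: dict) -> list[float]: ...

class FakeCloudSensorService:
    """Local stand-in for a remote, physically based sensor simulator;
    a real integration would call the provider's API here instead."""
    def render_lidar(self, scene_state: dict) -> list[float]:
        # One range reading per obstacle (placeholder physics).
        ego = scene_state["ego_position"]
        return [abs(obstacle - ego) for obstacle in scene_state["obstacles"]]

class InHouseSimulator:
    """Existing simulation application, now delegating sensing to the
    injected service rather than computing it internally."""
    def __init__(self, sensors: SensorService) -> None:
        self.sensors = sensors
        self.state = {"ego_position": 0.0, "obstacles": [12.0, -7.5]}

    def tick(self) -> list[float]:
        return self.sensors.render_lidar(self.state)

sim = InHouseSimulator(FakeCloudSensorService())
ranges = sim.tick()
```

Because the simulator depends only on the interface, the same loop could run against an in-house sensor model during development and a higher-fidelity external service later.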
The factory simulation engine is now available to customers via an Omniverse Cloud enterprise private offer through the Azure Marketplace, which provides access to Nvidia OVX systems and fully managed Omniverse software, reference applications and workflows. The sensor simulation engine is coming soon.