3D Asset Creation for Simulation and AI Model Training

This blog post provides a high-level overview of the art and science of 3D asset creation, highlighting the techniques, tools, and innovations that empower creators to construct compelling virtual realities and drive the production of accurate, dynamic digital twins. Advancing these methods is the central mission of the DIDYMOS-XR project.   
 
Advancements in Rendering Technologies 

High-quality assets are essential for achieving excellent rendering results, transforming intricate models and textures into breathtaking, lifelike images and animations. From NVIDIA RTX DLSS (Deep Learning Super Sampling) upscaling, to PBR (Physically Based Rendering) techniques delivering realistic materials and textures, to multi-GPU setups boosting rendering performance, these core technology advancements ensure visually stunning and immersive simulations.
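
To make the PBR side of this concrete, the sketch below authors a simple physically based material as a UsdPreviewSurface using OpenUSD's Python API (the pxr module); the material name and parameter values are illustrative placeholders rather than settings from any specific DIDYMOS-XR asset.

```python
# Minimal sketch: authoring a PBR material in OpenUSD (requires the pxr module).
# The material name and parameter values are illustrative placeholders.
from pxr import Usd, UsdShade, Sdf

stage = Usd.Stage.CreateNew("pbr_material_example.usda")

# A Material prim with a UsdPreviewSurface shader driving the PBR parameters.
material = UsdShade.Material.Define(stage, "/World/Looks/BrushedMetal")
shader = UsdShade.Shader.Define(stage, "/World/Looks/BrushedMetal/PreviewSurface")
shader.CreateIdAttr("UsdPreviewSurface")
shader.CreateInput("diffuseColor", Sdf.ValueTypeNames.Color3f).Set((0.60, 0.61, 0.65))
shader.CreateInput("metallic", Sdf.ValueTypeNames.Float).Set(1.0)    # fully metallic
shader.CreateInput("roughness", Sdf.ValueTypeNames.Float).Set(0.35)  # brushed, not mirror-like

# Expose the shader's surface output on the material so meshes can bind to it.
material.CreateSurfaceOutput().ConnectToSource(shader.ConnectableAPI(), "surface")
stage.Save()
```

Authoring material parameters on the asset itself, rather than baking them into renders, is part of what lets the same asset look consistent across different renderers and simulation tools.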
 
Streamlining Asset Creation with Connectors 

While rendering technologies enrich the visual fidelity and performance of simulations, connectors streamline the asset creation process by seamlessly integrating various tools and technologies. From Digital Content Creation (DCC) tools like Blender and Autodesk Maya to texturing tools, these components collectively ensure the efficient creation of visually captivating assets. Connecting best-in-class tools centrally enables multiple designers and 3D artists to work in lockstep, accelerating the asset creation process. Furthermore, syncing data across multiple tools opens the possibility for artists to work in their preferred tool, collating work further downstream in the creative process. An efficient and centralized asset creation workflow unlocks exciting, domain-specific use cases such as factory or logistics environments.
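
As a rough illustration of how work from different tools can be collated downstream, the sketch below assembles per-tool USD layers into a single stage with OpenUSD's Python API; the layer file names are hypothetical stand-ins for exports from Blender, Maya, or a texturing tool.

```python
# Minimal sketch: collating per-artist, per-tool USD layers into one shared stage.
# All file names are illustrative placeholders.
from pxr import Usd

stage = Usd.Stage.CreateNew("factory_scene.usda")
root = stage.GetRootLayer()

# Each sublayer could have been exported from a different DCC tool.
root.subLayerPaths.append("layout_from_blender.usd")
root.subLayerPaths.append("props_from_maya.usd")
root.subLayerPaths.append("materials_from_texturing_tool.usd")

stage.Save()
```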
 
Crafting Lifelike Environments: The Backbone of Simulation Realism 

Essential for creating immersive simulations, 3D assets serve as the backbone of realistic simulation engineering by providing environments, objects, and actors that interact within the simulation.

World creation, for example, encompasses the entire environment where simulation scenarios occur. Ego devices are the primary entities being simulated, such as vehicles or robots, along with their interactions with the environment and other objects. Domain-specific content includes assets tailored to specific contexts like logistics or manufacturing, and each of these elements contributes to crafting realistic and believable virtual environments.
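
A minimal sketch of this kind of composition, using OpenUSD's Python API, might reference an environment, an ego robot, and logistics props into one stage; all asset file names below are illustrative placeholders.

```python
# Compose a simulation world from reusable assets: an environment, an ego device,
# and domain-specific props. All file names are illustrative placeholders.
from pxr import Usd

stage = Usd.Stage.CreateNew("warehouse_scenario.usda")
world = stage.DefinePrim("/World", "Xform")
stage.SetDefaultPrim(world)

# World creation: the environment the scenario takes place in.
env = stage.DefinePrim("/World/Environment")
env.GetReferences().AddReference("assets/warehouse_shell.usd")

# Ego device: the primary entity being simulated, e.g. a mobile robot.
robot = stage.DefinePrim("/World/EgoRobot")
robot.GetReferences().AddReference("assets/amr_robot.usd")

# Domain-specific content: logistics props such as pallets.
pallet = stage.DefinePrim("/World/Props/Pallet_01")
pallet.GetReferences().AddReference("assets/euro_pallet.usd")

stage.Save()
```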
 
Achieving Fidelity in Simulation Assets
 
Whether through converting existing 3D data – transforming models from various formats into a unified format like OpenUSD for compatibility and ease of use within simulation workflows – or through manual modeling, achieving fidelity in 3D assets is crucial for producing accurate and reliable results. Model accuracy matters for multiple reasons across the entire tool chain. A high level of fidelity is required not only to ensure an appropriate visual representation for quality user experiences, but also for functional reasons; for example, to ensure accurate model target identification for AR model tracking. Here, the geometry of an asset is referenced to align models to real-world objects, and a close match between the 3D data and real-world artifacts enables a robust and reliable tracking experience.
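
As one hedged example of such a fidelity check, the sketch below compares a converted asset's world-space bounding box against measured real-world dimensions using OpenUSD's Python API; the asset path, prim path, reference dimensions, and tolerance are all assumptions for illustration.

```python
# Sanity-check a converted asset's dimensions against real-world measurements (metres).
# Asset path, prim path, reference dimensions, and tolerance are illustrative placeholders.
from pxr import Usd, UsdGeom, Gf

REAL_DIMENSIONS = Gf.Vec3d(1.200, 0.800, 0.144)  # e.g. a measured Euro pallet
TOLERANCE = 0.005                                 # 5 mm

stage = Usd.Stage.Open("assets/euro_pallet.usd")
prim = stage.GetPrimAtPath("/Pallet")

bbox_cache = UsdGeom.BBoxCache(Usd.TimeCode.Default(), [UsdGeom.Tokens.default_])
size = bbox_cache.ComputeWorldBound(prim).ComputeAlignedRange().GetSize()

for axis, (measured, modeled) in enumerate(zip(REAL_DIMENSIONS, size)):
    if abs(measured - modeled) > TOLERANCE:
        print(f"Axis {axis}: modeled {modeled:.3f} m deviates from measured {measured:.3f} m")
```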
 
Advancements in Defining, Managing, and Integrating 3D Assets
 
Effective asset creation extends beyond basic modeling to include defining 3D object properties, such as physical interactions with forces like gravity and collisions, specifying material characteristics like textures and reflectivity for realistic appearances, and adding structural details like hinges and folding mechanisms for authentic behavior.
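
A minimal sketch of how such properties can be expressed, assuming the UsdPhysics schemas that ship with OpenUSD, is shown below; the container lid, hinge axis, and joint limits are illustrative placeholders.

```python
# Minimal sketch: giving an asset physical behaviour with the UsdPhysics schemas.
# Prim paths, hinge axis, and joint limits are illustrative placeholders.
from pxr import Usd, UsdGeom, UsdPhysics

stage = Usd.Stage.CreateNew("physics_props_example.usda")

# A rigid body that responds to gravity and collides with other objects.
lid = UsdGeom.Cube.Define(stage, "/World/Container/Lid")
UsdPhysics.RigidBodyAPI.Apply(lid.GetPrim())
UsdPhysics.CollisionAPI.Apply(lid.GetPrim())

base = UsdGeom.Cube.Define(stage, "/World/Container/Base")
UsdPhysics.CollisionAPI.Apply(base.GetPrim())

# A hinge (revolute joint) so the lid folds open and closed realistically.
hinge = UsdPhysics.RevoluteJoint.Define(stage, "/World/Container/LidHinge")
hinge.CreateBody0Rel().SetTargets(["/World/Container/Base"])
hinge.CreateBody1Rel().SetTargets(["/World/Container/Lid"])
hinge.CreateAxisAttr("X")
hinge.CreateLowerLimitAttr(0.0)    # degrees
hinge.CreateUpperLimitAttr(110.0)

stage.Save()
```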
 
Managing diverse datasets and creating variations are crucial for robust simulations, facilitated by tools like Adobe Substance 3D Sampler for converting physical samples into high-quality 3D materials and applying realistic effects in real time. Connectors play a vital role in integrating various tools and data sources into a seamless workflow. This is exemplified by the Omniverse Connector for Adobe Substance 3D Painter, which promotes non-destructive workflows and collaboration across different stages of asset creation. 
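
One generic way to manage such variations on the OpenUSD side, independent of any particular texturing tool, is a variant set; the sketch below is an assumed example in which each variant references a hypothetical material layer exported from a texturing tool.

```python
# Minimal sketch: capturing material variations of one asset with a USD variant set,
# so a simulation can swap surface looks per run. All names are illustrative placeholders.
from pxr import Usd

stage = Usd.Stage.CreateNew("pallet_variants.usda")
pallet = stage.DefinePrim("/Pallet", "Xform")

variants = pallet.GetVariantSets().AddVariantSet("surfaceLook")
for look in ["new_wood", "worn_wood", "painted"]:
    variants.AddVariant(look)
    variants.SetVariantSelection(look)
    with variants.GetVariantEditContext():
        # Edits made here are stored under the selected variant,
        # e.g. referencing a look exported from a texturing tool.
        looks = stage.DefinePrim(f"/Pallet/Looks_{look}")
        looks.GetReferences().AddReference(f"materials/{look}.usd")

stage.Save()
```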
 
Synthetic Data for AI-Driven Simulations
 
In robotics development, the demand for synthetic data to efficiently train and validate AI models is paramount. To tailor an ecosystem that fulfills these demands, researchers develop custom synthetic data generation pipelines. NVIDIA Omniverse offers a powerful ecosystem with various extensions built on top of Omniverse Replicator to quickly and easily generate synthetic data and further enhance the efficiency and accuracy of AI-driven applications. This empowers us to build large datasets where real data is limited or simply not available for training and validation.
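
For illustration, a bare-bones Replicator script along these lines might look like the sketch below; it only runs inside an Omniverse application with the Replicator extension enabled, and the asset URL, pose ranges, frame count, and output directory are illustrative assumptions.

```python
# Minimal sketch of a synthetic data pipeline with Omniverse Replicator.
# Runs only inside an Omniverse app with the Replicator extension enabled;
# the asset URL, pose ranges, frame count, and output directory are placeholders.
import omni.replicator.core as rep

with rep.new_layer():
    camera = rep.create.camera(position=(0, 0, 5))
    render_product = rep.create.render_product(camera, (1024, 1024))

    # Bring in an asset and randomize its pose on every generated frame.
    pallets = rep.create.from_usd("omniverse://localhost/assets/euro_pallet.usd")

    with rep.trigger.on_frame(num_frames=100):
        with pallets:
            rep.modify.pose(
                position=rep.distribution.uniform((-2, -2, 0), (2, 2, 0)),
                rotation=rep.distribution.uniform((0, 0, 0), (0, 0, 360)),
            )

    # Write RGB images and 2D bounding boxes for model training.
    writer = rep.WriterRegistry.get("BasicWriter")
    writer.initialize(output_dir="/data/synthetic/pallets", rgb=True,
                      bounding_box_2d_tight=True)
    writer.attach([render_product])

rep.orchestrator.run()  # kick off generation when running as a script
```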
 
Synthetic Data and Generative AI in Simulation Workflows
 
Looking into enhancing simulation workflows, the integration of synthetic data and generative AI promises to revolutionize asset creation. By streamlining workflows and making asset creation more accessible, this transformative leap empowers creators to push the boundaries of virtual simulations further than ever before.

 

By Dylan Sheppard, idealworks