Meet the Omnivore: Industrial Designer Blends Art and OpenUSD to Create 3D Assets for AI Training

Editor’s note: This post is a part of our Meet the Omnivore series, which features individual creators and developers who use NVIDIA Omniverse and OpenUSD to accelerate their 3D workflows and create virtual worlds.

As a student at the Queensland University of Technology (QUT) in Australia, Emily Boehmer was torn between pursuing the creative arts and the sciences.

And then she discovered industrial design, which allowed her to dive into research and coding while exploring visualization workflows like sketching, animation and 3D modeling.

Now, Boehmer is putting her skills to practice as a design intern at BMW Group’s Technology Office in Munich. The team uses NVIDIA Omniverse, a platform for developing and connecting 3D tools and applications, and Universal Scene Description — aka OpenUSD — to enhance its synthetic data generation pipelines.

Boehmer creates realistic 3D assets that can be used with SORDI, short for Synthetic Object Recognition Dataset for Industries. Published by BMW Group, Microsoft and NVIDIA, SORDI helps developers and researchers streamline and accelerate the training of AI for production. To automate image generation, the team developed an extension based on Omniverse Replicator, a software development kit for creating custom synthetic data generation tools.

As part of the team, Boehmer uses Blender and Adobe Substance Painter to design 3D assets with high levels of physical accuracy and photorealism, helping ensure that synthetic data can be used to efficiently train AI models.

All the assets Boehmer creates are used to test and simulate autonomous robots on the NVIDIA Isaac Sim platform, which provides developers with a suite of synthetic data generation capabilities for photorealistic, physically accurate virtual environments.

Creating Realistic 3D Assets for Training AI 

As a design intern, Boehmer’s main tasks are animation and 3D modeling. The process starts with taking photos of target objects. Then, she uses the 2D photos as references by lining them up with the 3D models in Blender.

3D objects can consist of thousands of polygons, so Boehmer creates two versions of each asset: one with a low polygon count and one with a higher count. The details of the high-poly version can then be baked onto the low-poly model, preserving realistic surface detail while keeping the asset lightweight.

Once the 3D assets are created, Boehmer uses the models to start assembling scenes. Her favorite aspect of the Omniverse platform is the flexibility of USD, because it allows her to easily make changes to 3D models.

USD workflows have enabled the BMW Group’s design teams to create many different scenes using the same components, as they can easily access all the USD files stored on Omniverse Nucleus. When creating portions of a scene, Boehmer pulls dozens of USD models from SORDI and adds them to scenes that other designers will use to assemble larger factory scenes.

Boehmer only has to update the USD file of the original asset to automatically apply changes to all reference files containing it.
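This reference-based workflow can be sketched in USD’s human-readable text format (.usda). The file paths and prim names below are hypothetical; the point is that both shelves reference one source file, so editing that file updates every scene containing it:

```usda
#usda 1.0
(
    defaultPrim = "FactoryCorner"
)

def Xform "FactoryCorner"
{
    # Each shelf references the original asset file; editing
    # shelf_unit.usda automatically updates every referencing scene.
    def Xform "Shelf_01" (
        prepend references = @./assets/shelf_unit.usda@
    )
    {
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Xform "Shelf_02" (
        prepend references = @./assets/shelf_unit.usda@
    )
    {
        double3 xformOp:translate = (2.5, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because references are resolved at composition time, the scene file stays small and the asset remains a single source of truth.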

“It’s great to see USD support for both Blender and Substance Painter,” she said. “When I create 3D assets using USD, I can be confident that they’ll look and behave as expected in the scenes they’ll be placed in.”

Building Factory Scenes With Synthetic Data

The Isaac Sim platform is a key part of the team’s workflow. It’s used to develop pipelines that use generative AI and procedural algorithms for 3D scene generation. The team also developed an extension based on Omniverse Replicator that automates randomization within a scene when generating synthetic images.

“The role of design interns like me is to realistically model and texture the assets used for scenes built in Isaac Sim,” Boehmer said. “The more realistic the assets are, the more realistic the synthetic images can be and the more effective they are for training AI models for real scenarios.”

Data annotation — the process of labeling data like images, text, audio or video with relevant tags — makes it easier for AI to understand the data, but the manual process can be incredibly time-consuming, especially for large quantities of content. SORDI addresses these challenges by using synthetic data to train AI.

When importing assets into Omniverse and creating USD versions of the files, Boehmer tags them with the appropriate data label. Once these assets have been put together in a scene, she can use Omniverse Replicator to generate images that are automatically annotated using the original labels.
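The idea behind this automatic annotation can be illustrated in plain Python (this is a hypothetical sketch, not the actual Omniverse Replicator API): each asset carries a semantic label assigned at import time, so every rendered image can be annotated from those labels without manual work.

```python
# Illustrative sketch of label-driven auto-annotation.
# Assets are tagged once; every generated image inherits those tags.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    label: str  # semantic tag set when the USD version is created

@dataclass
class Annotation:
    label: str
    bbox: tuple  # (x_min, y_min, x_max, y_max) in pixels

@dataclass
class AnnotatedImage:
    image_path: str
    annotations: list = field(default_factory=list)

def annotate_image(image_path, visible_assets, detected_boxes):
    """Pair each visible asset's stored label with its rendered bounding box."""
    record = AnnotatedImage(image_path)
    for asset, bbox in zip(visible_assets, detected_boxes):
        record.annotations.append(Annotation(asset.label, bbox))
    return record

# Hypothetical scene: two tagged assets and their boxes in one rendered frame.
scene_assets = [Asset("shelf_01", "shelf"), Asset("klt_box_07", "klt_box")]
boxes = [(120, 40, 480, 600), (510, 300, 700, 450)]
frame = annotate_image("renders/frame_0001.png", scene_assets, boxes)
```

In the real pipeline, the renderer knows each object’s screen-space extent, so the bounding boxes come for free with every generated image.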

And using SORDI, designers can set up scenes and generate thousands of annotated images with just one click.

Boehmer will be a guest on an Omniverse livestream on Wednesday, Sept. 20, where she’ll demonstrate how she uses Blender and Substance Painter in Omniverse for synthetic image generation pipelines.

Join In on the Creation

Anyone can build their own Omniverse extension or Connector to enhance their 3D workflows and tools. Creators and developers can download Omniverse for free, and enterprise teams can use the platform for their 3D projects.

Check out artwork from other “Omnivores” and submit projects in the gallery. See how creators are using OpenUSD to accelerate a variety of 3D workflows in the latest OpenUSD All Stars. And connect workflows to Omniverse with software from Adobe, Autodesk, Blender, Epic Games, Reallusion and more.

Get started with NVIDIA Omniverse by downloading the standard license free, or learn how Omniverse Enterprise can connect your team. Developers can get started with Omniverse resources and learn about OpenUSD. Explore the growing ecosystem of 3D tools connected to Omniverse.

Stay up to date on the platform by subscribing to the newsletter, and follow NVIDIA Omniverse on Instagram, Medium and Twitter. For more, join the Omniverse community and check out the Omniverse forums, Discord server, Twitch and YouTube channels.