Building AI Avatars with NVIDIA Omniverse ACE

The Omniverse Avatar Cloud Engine is now available for early access, allowing for the customization and deployment of interactive avatars.

Developers and teams building avatars and virtual assistants can now register for the NVIDIA Omniverse Avatar Cloud Engine (ACE) early access program. ACE is a suite of cloud-native AI microservices that makes it easier to build and deploy intelligent virtual assistants and digital humans at scale.

Omniverse ACE simplifies avatar development by providing the AI building blocks needed to add intelligence and animation to any avatar, built on nearly any engine and deployed in any cloud. These AI assistants serve a range of industries, helping organizations enhance existing workflows and unlock new business opportunities.

ACE is one of several generative AI applications that help creators accelerate the development of 3D worlds and the metaverse. Members of the program will gain access to pre-release versions of NVIDIA AI microservices, along with the tools and documentation needed to develop cloud-native AI workflows for interactive avatar applications.

Create Lifelike Interactive AI Avatars with Omniverse ACE

Avatar development typically requires expertise, specialized equipment, and labor-intensive workflows. To simplify avatar creation, Omniverse ACE allows NVIDIA’s AI technologies (including pre-built models, toolsets, and domain-specific reference applications) to be seamlessly integrated into avatar applications built on most engines and deployed in public or private clouds.

Since its launch in September, Omniverse ACE has been shared with select partners for early feedback. NVIDIA is now seeking more partners to provide feedback on the microservices, collaborate on product improvements, and innovate in the realm of realistic interactive digital humans.

The early access program includes access to pre-release versions of ACE animation AI and conversational AI microservices, including:

  • 3D Animation AI Microservice: Uses Omniverse Audio2Face generative AI to create realistic facial animation for third-party avatars from an audio file alone, bringing characters to life in Unreal Engine and other rendering tools.

  • 2D Animation AI Microservice: Called Live Portrait, it animates 2D portrait or stylized face images from a real-time video source.

  • Text-to-Speech Microservice: Uses NVIDIA Riva TTS to synthesize natural-sounding speech from raw text alone, with no additional inputs such as voice patterns or rhythms.

Members of the program will also gain access to tools, sample reference applications, and support resources that will assist them in getting started.
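To make the data flow concrete, the sketch below assembles request payloads for the two microservices described above. The endpoint shapes, field names, and function names here are illustrative assumptions only; the actual ACE APIs are available under early access and may differ.

```python
import json

# Hypothetical request builders for two ACE microservices. All field
# names and defaults below are assumptions for illustration, not the
# real early-access API.

def build_tts_request(text, voice="English-US-Female-1", sample_rate_hz=44100):
    """Assemble a JSON payload asking the text-to-speech service
    (backed by NVIDIA Riva TTS) to synthesize speech from raw text,
    with no extra voice-pattern or rhythm inputs."""
    return {
        "text": text,
        "voice_name": voice,
        "sample_rate_hz": sample_rate_hz,
        "encoding": "LINEAR_PCM",
    }

def build_animation_request(audio_bytes, avatar_id):
    """Assemble a payload asking the 3D animation service (backed by
    Omniverse Audio2Face) to animate a character; audio is the only
    driving input the service needs."""
    return {
        "avatar_id": avatar_id,
        "audio": audio_bytes.hex(),  # hex-encoded for a JSON body
        "output": "blendshape_weights",
    }

# A client would POST each payload to the corresponding service, e.g.
# requests.post(f"{ACE_HOST}/v1/tts", json=tts_req), then feed the
# returned audio into build_animation_request().
tts_req = build_tts_request("Welcome to the kiosk. How can I help?")
print(json.dumps(tts_req, indent=2))
```

The point of the sketch is the pipeline order: text goes to TTS, and the resulting audio alone drives the facial animation service.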

Avatars Are Being Applied Across Industries

Omniverse ACE enables teams to build interactive digital humans, enhancing experiences across various industries by providing:

  • Simple Character Animation: Allowing users to create lifelike character animations with minimal expertise.

  • Cloud Deployment Capability: Allowing avatars to run virtually anywhere, from fast-food kiosks to tablets and VR headsets.

  • Plug-and-Play Suite Built on NVIDIA Unified Computing Framework (UCF): Ensuring interoperability between NVIDIA AI and other solutions, so cutting-edge AI can be applied across use cases.

Partners like Ready Player Me and Epic Games have experienced how Omniverse ACE enhances the workflow for AI avatars.

The Omniverse ACE animation AI microservice supports 3D characters from Ready Player Me, a platform for building cross-game avatars.

Timmu Tõke, CEO and co-founder of Ready Player Me, stated: “Digital avatars are becoming an essential part of our daily lives. People use avatars in games, virtual events, and social applications, and even as a way to enter the metaverse. We spent seven years building the perfect avatar system, one that developers can easily integrate into their applications and games so users can create an avatar and explore different worlds. Now, with NVIDIA Omniverse ACE, teams can more easily bring these characters to life.”

Epic Games’ advanced MetaHuman technology is revolutionizing the creation of realistic, high-fidelity digital humans. Combining Omniverse ACE with the MetaHuman framework will enable users to design and deploy engaging 3D avatars more easily.

Digital humans can not only hold conversations but also sing, like the AI avatar Toy Jensen (TJ). NVIDIA’s creative team quickly produced an animated video of TJ singing “Jingle Bells,” using Omniverse ACE to extract the singer’s voice and convert it to TJ’s voice, letting the avatar sing with the same pitch and rhythm as the original artist.

Many creators are trying out VTubing, a new style of streaming in which users appear as 2D avatars and interact with their audience. With Omniverse ACE, creators can move from 2D animated avatars to 3D ones, including photorealistic and stylized heads, and can render avatars from the cloud to animate characters anywhere.

Additionally, the NVIDIA Tokkio reference application is expanding, with early partners building cloud-native customer service avatars for industries like telecommunications and banking.

Join the Early Access Program

Developers and teams building avatars and virtual assistants can experience Omniverse ACE early.

Watch NVIDIA’s special presentation at CES to learn more about NVIDIA Omniverse ACE, and register for the early access program.
