Nvidia releases plugins to improve digital human realism on Unreal Engine 5

Nvidia has released its latest technology for creating AI-powered characters that look and behave like real people.

At Unreal Fest Seattle 2024, Nvidia released its new Unreal Engine 5 on-device plugins for Nvidia Ace, making it easier to create and deploy AI-powered MetaHuman characters on Windows PCs. Ace is a suite of digital human technologies that deliver speech, intelligence and animation using generative AI.

Developers can now access a new Audio2Face-3D plugin for AI-powered facial animation (lips and faces that move in sync with spoken audio) in Autodesk Maya. The plugin provides a simple, streamlined user interface that speeds up and simplifies avatar development in Maya, and it ships with source code so developers can dive in and build a plugin for the digital content creation (DCC) tool of their choice.

Finally, Nvidia built an Unreal Engine 5 renderer microservice that leverages Epic's Unreal Pixel Streaming technology. This microservice now supports the Nvidia Ace Animation Graph microservice and the Linux operating system in early access. The Animation Graph microservice enables realistic and responsive character movement, and with Unreal Pixel Streaming support, developers can stream their MetaHuman creations to any device.


Nvidia makes it easier to create MetaHumans with ACE.

The Nvidia Ace Unreal Engine 5 sample project serves as a guide for developers who want to integrate digital humans into their games and applications. The sample project expands the set of on-device Ace plugins:

  • Audio2Face-3D for lip sync and facial animation
  • Nemotron Mini 4B Instruct for response generation
  • Retrieval-augmented generation (RAG) for contextual information
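As a rough illustration of what audio-driven facial animation does conceptually, the sketch below maps per-frame loudness of an audio signal to a jaw-open blendshape weight. This is a deliberately simple stand-in, not the Audio2Face-3D algorithm, and all names in it are hypothetical:

```python
# Generic sketch of audio-driven lip animation: map per-frame loudness (RMS)
# to a jaw-open blendshape weight in [0, 1]. Illustrative only; this is not
# how Nvidia's Audio2Face-3D model actually works.
import math

def jaw_open_weights(samples: list[float], frame_size: int, gain: float = 4.0) -> list[float]:
    """Return one blendshape weight per audio frame: louder frame, wider jaw."""
    weights = []
    for start in range(0, len(samples), frame_size):
        frame = samples[start:start + frame_size]
        rms = math.sqrt(sum(s * s for s in frame) / len(frame))
        weights.append(min(1.0, rms * gain))
    return weights

# A silent frame followed by a loud frame: the jaw stays shut, then opens.
audio = [0.0] * 4 + [0.5, -0.5, 0.5, -0.5]
weights = jaw_open_weights(audio, frame_size=4)
```

A real model like Audio2Face-3D predicts full facial poses (not just jaw opening) from audio features, but the input-to-blendshape mapping is the same basic shape of problem.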

According to Nvidia, developers can create a database full of contextual information about their intellectual property, generate relevant responses with low latency, and let those responses seamlessly drive corresponding MetaHuman facial animations in Unreal Engine 5. Each of these microservices is optimized to run on Windows PCs with low latency and minimal memory requirements.
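The retrieve-then-generate flow described above can be sketched in plain Python. Everything here is a hypothetical illustration (toy keyword retrieval and a stub generator), not the Ace RAG or Nemotron APIs: index a small lore database, pull the entries most relevant to a player's question, and ground the character's reply in them.

```python
# Minimal sketch of a retrieval-augmented response pipeline.
# All function and variable names are illustrative, not Nvidia Ace APIs.

def build_index(lore: list[str]) -> list[set[str]]:
    """Index each lore entry as a bag of lowercase words."""
    return [set(entry.lower().split()) for entry in lore]

def retrieve(query: str, lore: list[str], index: list[set[str]], k: int = 1) -> list[str]:
    """Return the k lore entries sharing the most words with the query."""
    words = set(query.lower().split())
    ranked = sorted(range(len(lore)), key=lambda i: len(words & index[i]), reverse=True)
    return [lore[i] for i in ranked[:k]]

def generate_response(query: str, context: list[str]) -> str:
    """Stand-in for the response model: echo the retrieved context."""
    return f"Based on what I know: {' '.join(context)}"

lore = [
    "The blacksmith Mira forges enchanted blades in the northern quarter.",
    "The harbor gate closes at sundown by order of the magistrate.",
]
index = build_index(lore)
context = retrieve("Where can I buy an enchanted blade?", lore, index)
reply = generate_response("Where can I buy an enchanted blade?", context)
```

In the pipeline Nvidia describes, the generated text would then be spoken aloud and fed to Audio2Face-3D to drive the MetaHuman's lips and face.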

Nvidia has introduced a series of tutorials on setting up and using the Unreal Engine 5 plugins. To get started, developers need to download the relevant Nvidia Ace plugin and the Unreal Engine sample project, along with a MetaHuman character.

Autodesk Maya provides powerful animation capabilities for game developers and technical artists creating high-quality 3D graphics. With the Audio2Face-3D plugin, developers can now more easily create high-quality, audio-driven facial animations for any character. The user interface has been streamlined, animations transfer seamlessly to the Unreal Engine 5 environment, and the source code and scripts are highly customizable and can be adapted for use in other digital content creation tools.

To get started with Maya, developers can obtain an API key or download the Audio2Face-3D NIM. Nvidia NIM is a set of easy-to-use AI inference microservices that accelerate the deployment of foundation models across any cloud or data center. Developers also need Autodesk Maya 2023, 2024 or 2025, and can then access the Maya ACE GitHub repository, which contains the Maya plugin, gRPC client libraries, testing resources and a sample scene: everything needed to explore, learn, and innovate with Audio2Face-3D.

Developers deploying digital humans from the cloud want to reach as many customers as possible at once, but streaming characters at high fidelity requires significant computing resources. The latest Unreal Engine 5 renderer microservice in Nvidia Ace adds support for the Nvidia Animation Graph microservice and the Linux operating system in early access.

Animation Graph is a microservice that facilitates the creation of animation state machines and blend trees. It provides developers with a flexible, node-based system for animation mixing, playback, and control.
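To make the state machine and blending idea concrete, the snippet below crossfades the weights of two animation states during a transition. It is a generic sketch of the concept under simple assumptions (linear crossfade, two states), not the Animation Graph microservice API:

```python
# Generic sketch of an animation state machine with linear crossfade blending.
# Illustrates the concept only; this is not the Animation Graph microservice.

class AnimStateMachine:
    def __init__(self, initial: str):
        self.current = initial      # state being blended out of
        self.target = initial       # state being blended into
        self.blend = 1.0            # 0.0 = fully current, 1.0 = fully target
        self.duration = 0.0

    def transition_to(self, state: str, duration: float) -> None:
        """Begin crossfading from the present pose to a new state."""
        self.current, self.target = self.target, state
        self.blend, self.duration = 0.0, duration

    def update(self, dt: float) -> dict[str, float]:
        """Advance the crossfade and return per-state blend weights."""
        if self.blend < 1.0 and self.duration > 0:
            self.blend = min(1.0, self.blend + dt / self.duration)
        if self.current == self.target:
            return {self.target: 1.0}
        return {self.current: 1.0 - self.blend, self.target: self.blend}

sm = AnimStateMachine("idle")
sm.transition_to("talk", duration=0.5)
weights = sm.update(0.25)   # halfway through a 0.5 s crossfade
```

A production animation graph generalizes this to arbitrary node networks (blend trees), where many such weighted states are mixed at once, but each transition reduces to weight schedules like the one above.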

The new Unreal Engine 5 renderer microservice with Pixel Streaming consumes data from the Animation Graph microservice, allowing developers to run their MetaHuman character on a cloud server and stream its rendered frames and audio to any browser or edge device via Web Real-Time Communication (WebRTC).

Developers can apply for early access to download the Unreal Engine 5 renderer microservice, with support for the Animation Graph microservice and the Linux operating system, today. The Maya ACE plugin is available for download on GitHub. Learn more about Nvidia Ace and download NIM microservices to start developing game characters powered by generative AI.