Nvidia has released the latest technology to create AI-powered characters that look and behave like real humans.
At Unreal Fest Seattle 2024, Nvidia released a new Unreal Engine 5 on-device plugin for Nvidia Ace, making it easy to build and deploy AI-powered MetaHuman characters on Windows PCs. Ace is a suite of digital human technologies that delivers voice, intelligence, and animation powered by generative AI.
Developers also now have access to the new Audio2Face-3D plugin for Autodesk Maya, which provides AI-powered facial animation (lip and face movement in sync with audio). The plugin gives developers a simple, streamlined interface that speeds up and eases avatar development in Maya, and it ships with source code so developers can build plugins for their favorite digital content creation (DCC) tools.
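Conceptually, audio-driven facial animation maps features of the incoming audio signal to facial pose parameters such as blendshape weights. The toy sketch below is not Nvidia's model; the function names and the single jaw-open blendshape are illustrative assumptions. It conveys the general idea by driving a jaw-open value from windowed audio amplitude:

```python
# Toy illustration of audio-driven facial animation (NOT the Audio2Face-3D
# model). It maps the RMS amplitude of each audio window to a "jaw_open"
# blendshape weight in [0, 1]; real systems infer dozens of blendshapes
# with a neural network.
import math

def rms(window):
    """Root-mean-square amplitude of one audio window."""
    return math.sqrt(sum(s * s for s in window) / len(window))

def audio_to_jaw_open(samples, window_size=4, gain=2.0):
    """Return one jaw-open weight per window, clamped to [0, 1]."""
    weights = []
    for i in range(0, len(samples) - window_size + 1, window_size):
        weights.append(min(1.0, gain * rms(samples[i:i + window_size])))
    return weights

# Silence produces a closed jaw; louder audio opens it further.
quiet = [0.0, 0.0, 0.0, 0.0]
loud = [0.4, -0.4, 0.4, -0.4]
print(audio_to_jaw_open(quiet + loud))  # → [0.0, 0.8]
```

A production pipeline replaces the amplitude heuristic with a learned model that also captures phoneme shapes and emotion, but the input/output contract — audio in, per-frame facial parameters out — is the same.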
Finally, we built an Unreal Engine 5 renderer microservice that leverages Epic’s Unreal Pixel Streaming technology. This microservice now supports the Nvidia Ace Animation Graph microservice and Linux operating systems in early access. Animation graph microservices enable realistic, responsive character movement, and support for Unreal Pixel Streaming lets developers stream MetaHuman creations to any device.
The Nvidia Ace Unreal Engine 5 sample project serves as a guide for developers looking to integrate digital humans into their games and applications. The sample project expands the set of on-device Ace plugins, which now includes:
Audio2Face-3D (for lip sync and facial animation), Nemotron-Mini 4B Instruct (for response generation), and RAG (for contextual information).
Together, these let developers build a database of the contextual lore of their intellectual property, generate relevant responses with low latency, and have those responses seamlessly drive the corresponding MetaHuman facial animations in Unreal Engine 5. Each of these microservices is optimized to run on Windows PCs with low latency and a minimal memory footprint.
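To illustrate the retrieval step, here is a minimal retrieval-augmented generation (RAG) sketch — not Nvidia's implementation; the lore snippets, the word-overlap scoring, and the response template are all illustrative assumptions. It fetches the lore entry that best matches a player's question and grounds the reply in it:

```python
# Minimal RAG-style retrieval sketch (illustrative only, not Nvidia's
# pipeline). A real system would use vector embeddings and an LLM such as
# Nemotron-Mini 4B Instruct; here we score lore snippets by word overlap.
LORE = [  # hypothetical in-game lore database
    "The city of Aldren was founded by the mage guild three centuries ago.",
    "Captain Mara commands the night watch and distrusts outsiders.",
    "The Sundering shattered the old kingdom into five rival provinces.",
]

STOPWORDS = {"the", "a", "an", "of", "was", "is", "what", "who", "and"}

def tokenize(text):
    """Lowercase content words, punctuation stripped, stopwords removed."""
    words = (w.strip(".,!?").lower() for w in text.split())
    return {w for w in words if w and w not in STOPWORDS}

def retrieve(question, lore=LORE):
    """Return the lore snippet sharing the most content words with the question."""
    q = tokenize(question)
    return max(lore, key=lambda doc: len(q & tokenize(doc)))

def answer(question):
    """Stand-in for LLM generation: ground the reply in retrieved context."""
    return f"Based on what I know: {retrieve(question)}"

print(answer("What was the Sundering?"))
# → Based on what I know: The Sundering shattered the old kingdom into five rival provinces.
```

The retrieved snippet would be passed to the language model as context, so the character answers from the game's own lore instead of hallucinating.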
Nvidia has published a series of tutorials on setting up and using the Unreal Engine 5 plugins, with more to come as new plugins are released. To get started today, developers should download the appropriate Nvidia Ace plugins and the Unreal Engine sample project, along with a MetaHuman character.
Autodesk Maya provides high-performance animation capabilities for game developers and technical artists to create high-quality 3D graphics. Developers can now easily generate high-quality audio-driven facial animations for any character using the Audio2Face-3D plugin. The user interface has been streamlined for seamless migration to the Unreal Engine 5 environment. The source code and scripts are highly customizable and can be modified for use with other digital content creation tools.
To get started in Maya, developers need an API key or a download of the Audio2Face-3D NIM. Nvidia NIM is a set of easy-to-use AI inference microservices that accelerate the deployment of foundation models in any environment, whether cloud or data center. Next, make sure you have Autodesk Maya 2023, 2024, or 2025, then access the Maya ACE GitHub repository. It includes everything you need to explore, learn, and innovate with Audio2Face-3D, including the Maya plugin, gRPC client libraries, test assets, and sample scenes.
Developers deploying digital humans through the cloud are trying to reach as many customers as possible simultaneously, but streaming high-fidelity characters requires significant computing resources. Today, Nvidia Ace’s latest Unreal Engine 5 renderer microservice adds support for the Nvidia Animation Graph microservice and the Linux operating system in early access.
Animation Graph is a microservice that facilitates the creation of animated state machines and blend trees. This gives developers a flexible node-based system for blending, playing, and controlling animations.
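As a rough illustration of what a blend node in such a system does — this is a generic sketch, not the Animation Graph API; the pose format, joint names, and state weights are assumptions — blending linearly interpolates joint values between two animation poses according to a weight:

```python
# Generic animation blend-tree sketch (not the Nvidia Animation Graph API).
# A pose maps joint names to rotation angles (degrees, for simplicity);
# a blend node mixes two child poses by a weight in [0, 1].
def blend(pose_a, pose_b, weight):
    """Linearly interpolate every joint between two poses."""
    return {j: (1.0 - weight) * pose_a[j] + weight * pose_b[j] for j in pose_a}

idle = {"elbow": 10.0, "knee": 5.0}
wave = {"elbow": 90.0, "knee": 5.0}

# A simple state machine picks the blend weight from the character's state;
# real transitions ramp the weight over time to avoid visual popping.
STATE_WEIGHTS = {"idle": 0.0, "waving": 1.0, "transitioning": 0.5}

print(blend(idle, wave, STATE_WEIGHTS["transitioning"]))
# → {'elbow': 50.0, 'knee': 5.0}
```

A full blend tree nests such nodes, so a character can, for example, layer a wave on top of a walk cycle while a state machine decides which branches are active.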
The new Unreal Engine 5 renderer microservice with Pixel Streaming consumes data from the Animation Graph microservice, allowing developers to run MetaHuman characters on servers in the cloud and stream the rendered frames and audio to any browser or edge device via Web Real-Time Communication (WebRTC).
Developers can apply for early access today to download the Animation Graph microservice and the Unreal Engine 5 renderer microservice with Linux support, and the Maya Ace plugin can be downloaded from GitHub. Learn more about Nvidia Ace, download NIM microservices, and start building game characters with generative AI.