NVIDIA

NVIDIA streamlines the implementation of AI-powered MetaHumans

Unreal Fest 2024 is happening now in Seattle, and NVIDIA has launched new on-device plugins for NVIDIA ACE in Unreal Engine 5, making it easier to create and implement AI-powered MetaHuman characters on PC. Below is a handy list of all the new features announced.
New ACE Plugins for Unreal Engine 5:

- Audio2Face-3D Plugin: Enables lip-syncing and AI-driven facial animation directly within Unreal Engine 5.
- Nemotron-Mini 4B Instruct Plugin: Generates responses for interactive character dialogue.
- Retrieval-Augmented Generation (RAG) Plugin: Supplies contextual information to enhance character interactions.
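In broad strokes, a RAG pipeline retrieves relevant background text for a query and prepends it to the prompt the dialogue model receives. The toy sketch below illustrates only that retrieval-and-assembly step with simple word-overlap scoring; the lore snippets and function names are hypothetical, and NVIDIA's plugin would use proper embedding-based retrieval rather than this heuristic.

```python
import re

# Illustrative RAG retrieval step: rank lore snippets by word overlap
# with the player's question, then build an augmented prompt.
# This is a sketch of the general technique, not NVIDIA's implementation.

def tokenize(text: str) -> set[str]:
    """Lowercase word set used for simple overlap scoring."""
    return set(re.findall(r"[a-z0-9']+", text.lower()))

def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by shared words with the query; keep the top_k."""
    q = tokenize(query)
    ranked = sorted(documents, key=lambda d: len(q & tokenize(d)), reverse=True)
    return ranked[:top_k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Assemble the context-augmented prompt the response model would see."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nPlayer: {query}\nCharacter:"

# Hypothetical game lore used as the retrieval corpus.
lore = [
    "The blacksmith forges swords in the northern district.",
    "The tavern keeper knows rumors about the missing caravan.",
    "Ferry tickets to the capital cost two silver coins.",
]

print(build_prompt("Where does the blacksmith forge swords?", lore))
```

A production setup would replace the overlap score with vector similarity over embedded documents, but the prompt-assembly shape stays the same.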
Audio2Face-3D Plugin for Autodesk Maya:

- Facilitates audio-driven facial animation in Maya.
- Includes source code for customization and integration with other digital content creation tools.
- Simplifies the workflow between Maya and Unreal Engine 5.
Unreal Engine 5 Renderer Microservice with Pixel Streaming:

- Utilizes Epic's Unreal Pixel Streaming technology.
- Now supports the NVIDIA ACE Animation Graph microservice and, in early access, Linux.
- Allows developers to stream high-fidelity MetaHuman characters to any device through Web Real-Time Communication (WebRTC).
Benefits for Developers:

- Simplifies the integration of AI-powered digital humans into games and applications.
- Optimizes microservices for low latency and minimal memory usage on Windows PCs.
- Improves scalability for cloud deployment of digital humans.
- Provides customizable tools and source code for tailored development needs.
Developers can request early access to the Unreal Engine 5 renderer microservice with Animation Graph microservice and Linux support. The ACE Maya plugin is available for download on GitHub.
