NVIDIA, Developers Pioneer Avatar Cloud For Lifelike Characters

News Desk -


NVIDIA has unveiled production microservices for the NVIDIA Avatar Cloud Engine (ACE), giving game, tool, and middleware developers the ability to integrate state-of-the-art generative AI models into the digital avatars in their games and applications.

With the introduction of ACE microservices, developers can build interactive avatars using advanced AI models such as NVIDIA Audio2Face (A2F), which generates expressive facial animation from audio input, and NVIDIA Riva Automatic Speech Recognition (ASR), which supports customizable multilingual speech recognition and translation applications powered by generative AI.

Notable developers adopting ACE include Charisma.AI, Convai, Inworld, miHoYo, NetEase Games, Ourpalm, Tencent, Ubisoft, and UneeQ.

Keita Iida, Vice President of Developer Relations at NVIDIA, said, “Generative AI technologies are transforming virtually everything we do, and that also includes game creation and gameplay. NVIDIA ACE opens up new possibilities for game developers by populating their worlds with lifelike digital characters while removing the need for pre-scripted dialogue, delivering greater in-game immersion.”

Leading game and interactive avatar developers are at the forefront of exploring how ACE and generative AI technologies can revolutionize interactions between players and non-playable characters (NPCs) in games and applications.

Tencent Games commented, “This is a milestone moment for AI in games. NVIDIA ACE and Tencent Games will help lay the foundation that will bring digital avatars with individual, lifelike personalities and interactions to video games.”

NVIDIA ACE marks a significant shift in how game characters are brought to life: NPCs no longer need to rely on predetermined responses and facial animations. This departure from scripted interactions enhances player engagement, creating more dynamic and immersive gaming experiences.

Purnendu Mukherjee, Founder and CEO at Convai, emphasized the transformative potential of generative AI-powered characters in virtual worlds, stating, “Convai is leveraging Riva ASR and A2F to enable lifelike NPCs with low-latency response times and high-fidelity natural animation.”

To illustrate the impact of ACE on NPC interactions, NVIDIA collaborated with Convai to enhance the NVIDIA Kairos demo. The latest version of Kairos makes extensive use of Riva ASR and A2F to deepen NPC interactivity: NPCs can converse with players, show awareness of nearby objects, pick up and deliver items, guide players to objectives, and traverse their virtual worlds.

The Audio2Face and Riva Automatic Speech Recognition microservices are available now, allowing interactive avatar developers to integrate the models into their development pipelines.
