Nvidia ACE: more realism in video games thanks to AI

Artificial intelligence is used in many domains, and gaming has already embraced it; the technology is now evolving toward greater realism. Nvidia, the world’s leading GPU manufacturer, is back at it with ACE, promising even more immersive interactions with NPCs (non-player characters), which become capable of responding to and chatting with the player in real time.

The Covert Protocol demo, developed by Inworld AI, showcases Nvidia ACE: AI-driven NPCs that enhance immersion in video games.

Artificial intelligence has already arrived in video games

AI-based technologies are numerous, and video games are already benefiting from them. Nvidia’s DLSS (Deep Learning Super Sampling) and AMD’s FSR (FidelityFX Super Resolution) stem from research into artificial neural networks. These technologies deliver more frames per second, without sacrificing graphics quality.

Ubisoft’s latest work makes it possible to animate characters realistically without recording an actor’s performance. The ZeroEGGS technology (Zero-shot Example-based Gesture Generation from Speech) can produce character gestures from speech, using short example motion clips as style templates. This simplifies developers’ work and makes games more dynamic.


What are Nvidia ACE and Nvidia NeMo?

Presented at CES 2024, Nvidia ACE (Avatar Cloud Engine) animates NPCs with realistic movements and facial expressions, computed in real time, for more convincing characters in video games. The GPU manufacturer has also unveiled tools to help developers implement the technology.

ACE is built on Nvidia NeMo, a generative AI framework offering ChatGPT-like language models capable of producing content from simple prompts. In practice, developers simply tell the AI what universe the game is set in, along with the character’s history and traits, and the character automatically comes to life. Nvidia provides an application programming interface (API) that can be used in any application, including video games.
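To make the idea concrete, here is a minimal sketch of how a developer might describe an NPC before handing it to an ACE-style language model. The class, field names, and prompt format are illustrative assumptions, not Nvidia’s actual API.

```python
# Hypothetical NPC description, flattened into a system prompt for an
# LLM backend. Names and prompt wording are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class NPCProfile:
    name: str
    universe: str                    # the setting the game takes place in
    backstory: str                   # the character's history
    traits: list[str] = field(default_factory=list)

    def to_system_prompt(self) -> str:
        """Flatten the profile into a prompt the language model can use."""
        traits = ", ".join(self.traits) or "none specified"
        return (
            f"You are {self.name}, a character in {self.universe}. "
            f"Backstory: {self.backstory}. "
            f"Personality traits: {traits}. "
            "Stay in character and answer the player in real time."
        )


profile = NPCProfile(
    name="Diego",
    universe="a cyberpunk metropolis",
    backstory="a former courier who now runs a noodle stand",
    traits=["wary", "dry-humored"],
)
prompt = profile.to_system_prompt()
```

The appeal of this approach is that the whole character fits in a few lines of plain-text description, which the model then expands into live dialogue.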

At the 2024 Game Developers Conference, Nvidia provided a little more information about this turnkey solution. The possibilities of the technology were demonstrated through Covert Protocol, a demo developed by Inworld AI. Its NPCs are aware of the world around them, are capable of learning, and adapt their speech according to their interactions with the player.

Nvidia ACE opens the door to dynamic narratives: depending on the exchanges with the player, the story adapts for greater immersion.

How does the Covert Protocol demo work?

This technical demo runs on Unreal Engine 5 and uses 3D characters created with Epic Games’ MetaHuman. Inworld AI’s SDK bridges the gap between these 3D models and Nvidia ACE, which incorporates Nvidia Riva automatic speech recognition (ASR) and Nvidia Audio2Face (A2F) for realistic facial animation.

Inworld AI’s development tools act as a conductor for all these technologies, generating dialogues based on interactions and producing realistic animations. They are also capable of generating voices with precise intentions, depending on the topic of conversation or emotion. In practice, this could result in dynamic narratives and dialogue, dictated by the player’s choices.
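The flow described above can be sketched as a simple chain: the player’s speech is transcribed, a reply is generated with an emotional intent, and the reply drives facial animation. The function names below mirror the components mentioned (Riva ASR, the NeMo-style LLM, Audio2Face) but are toy stand-ins, not the real SDK calls.

```python
# Toy orchestration of one NPC conversational turn. Each function is a
# placeholder for the real component named in its docstring.
def speech_to_text(audio: bytes) -> str:
    """Stand-in for Riva ASR: decode player speech to text."""
    return audio.decode("utf-8")  # placeholder for actual recognition


def generate_reply(player_text: str, emotion: str = "neutral") -> str:
    """Stand-in for the NeMo-style LLM: produce a reply with an intent tag."""
    return f"[{emotion}] Reply to: {player_text}"


def animate_face(reply_text: str) -> dict:
    """Stand-in for Audio2Face: map speech to facial animation cues."""
    return {"text": reply_text, "visemes": len(reply_text.split())}


def npc_turn(audio: bytes, emotion: str = "neutral") -> dict:
    """One turn of the pipeline: ASR -> LLM -> facial animation."""
    text = speech_to_text(audio)
    reply = generate_reply(text, emotion)
    return animate_face(reply)


frame = npc_turn(b"Where is the safehouse?", emotion="tense")
```

The point of the sketch is the shape of the pipeline: each stage consumes the previous stage’s output, which is what lets an orchestration layer like Inworld AI’s tools coordinate the whole exchange in real time.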

For Kylan Gibbs, CEO of Inworld AI, this demo “only scratches the surface of what’s possible”. This is why Covert Protocol‘s source code is being made available to developers, so that they can take it one step further. In this way, they can more easily adopt these technologies and build on an existing model.


Neo NPC: Inworld AI technology for Ubisoft’s NPCs

The fruit of Ubisoft’s research and development, Neo NPC follows the same trend, using artificial intelligence to create characters that interact in real time with players, their environment, and other NPCs. This solution opens up new prospects for dynamic, emergent storytelling.

Ubisoft has unveiled a prototype in which an NPC guides the player through the mission: you can ask it questions about mission status, and it can even spot enemies.

The French publisher uses Inworld AI’s technology and its large language model (LLM) to bring its characters to life. NPCs can hold conversations, reacting to context and environment while producing animations in real time. They can also help the player make strategic choices and advance the mission.

In this demo, set in the Watch Dogs universe, the NPC gives real-time directions to the player. It can detect enemies on screen and reveal the mission’s main objective. The character is fully integrated into the game’s narrative and, naturally, uses hacking and infiltration vocabulary.
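This kind of context awareness amounts to conditioning the model’s reply on live game state. Below is a hedged sketch of how such a prompt might be assembled from the player’s question, the enemies currently on screen, and the mission objective; the structure and names are illustrative assumptions, not Ubisoft’s or Inworld AI’s actual code.

```python
# Illustrative prompt assembly for a context-aware guide NPC. The game
# state passed in (enemies, objective) would come from the engine.
def build_context_prompt(player_question: str,
                         enemies_visible: list[str],
                         objective: str) -> str:
    """Combine live game state and the player's question into one prompt."""
    enemy_line = (
        "Enemies in sight: " + ", ".join(enemies_visible)
        if enemies_visible
        else "No enemies in sight."
    )
    return (
        f"Current objective: {objective}\n"
        f"{enemy_line}\n"
        f"Player asks: {player_question}\n"
        "Answer as the guide NPC, using hacking and infiltration slang."
    )


prompt = build_context_prompt(
    "Which way to the server room?",
    enemies_visible=["guard near the elevator"],
    objective="exfiltrate the data core",
)
```

Because the game state is re-injected on every turn, the NPC’s answers stay consistent with what is actually happening on screen rather than with a fixed script.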


The future of video games lies in realistic AI

For the time being, Nvidia ACE and Inworld AI’s tools are only at the demo stage, but these technologies are likely to become part of future video games. That future is uncertain for the voice actors and performers who regularly embody video game characters, even as AI promises greater realism and immersion.

These developer-focused solutions will make such technologies much easier to integrate into games. AI is shaking up many sectors, and gaming is set to undergo its own revolution: evolving stories, adaptive difficulty, and richer, more realistic worlds are just some of the promises made by Nvidia ACE.
