Games are Becoming Even More Realistic: How NVIDIA ACE Brings Virtual Characters to Life with AI

  • A new experience for game players - real-time conversations with in-game characters. 
  • Just hold down a button, ask, and get the answer! 

Nvidia CEO Jensen Huang recently provided a tantalizing glimpse of the future at Computex 2023 in Taipei. Combining gaming and artificial intelligence, Nvidia showcased a visually stunning rendering of a cyberpunk ramen shop where players can engage in real-time conversations with the virtual proprietor.


Unlike traditional dialogue options in games, Nvidia envisions a more immersive experience. Players can simply hold down a button, speak with their own voice, and receive responses from in-game characters. This innovative approach is being hailed as a "peek at the future of games." The way the AI understands and responds to natural language is simply amazing.

"Not only will AI contribute to the rendering and the synthesis of the environment, AI will also animate the characters," Huang said. "AI will be a very big part of the future of video games."

Nvidia, in collaboration with partner Convai, developed this demo to showcase and promote the tools used to create it. Particularly noteworthy is the suite of middleware called Nvidia ACE (Avatar Cloud Engine) for Games, capable of running locally and in the cloud. This comprehensive ACE suite includes NeMo tools for deploying large language models (LLMs), as well as Riva speech-to-text and text-to-speech capabilities, among others. The demo itself is built on Unreal Engine 5 with extensive ray tracing, and thanks to this approach, the visual effects look simply stunning!
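Conceptually, the pipeline described above chains three stages: speech recognition, LLM response generation, and speech synthesis. The sketch below illustrates that flow with hypothetical stand-in functions; it does not use the real NeMo or Riva APIs, and all names in it are placeholders for illustration only.

```python
# Hypothetical sketch of an ACE-style NPC dialogue loop.
# All functions here are illustrative stand-ins, NOT the real NeMo/Riva APIs.

def speech_to_text(audio: bytes) -> str:
    """Stand-in for a Riva-style ASR call: player audio in, transcript out."""
    # For demonstration we pretend the "audio" already encodes its transcript.
    return audio.decode("utf-8")

def generate_reply(transcript: str, persona: str) -> str:
    """Stand-in for a NeMo-style LLM call, conditioned on an NPC persona."""
    return f"[{persona}] You asked: '{transcript}'. One bowl of ramen, coming up!"

def text_to_speech(reply: str) -> bytes:
    """Stand-in for a Riva-style TTS call: reply text in, NPC audio out."""
    return reply.encode("utf-8")

def npc_turn(player_audio: bytes, persona: str = "Jin, ramen shop owner") -> bytes:
    """One conversational turn: speech -> transcript -> LLM reply -> speech."""
    transcript = speech_to_text(player_audio)
    reply = generate_reply(transcript, persona)
    return text_to_speech(reply)
```

In a real integration, each stage would be a network or local-inference call (and the resulting speech would additionally drive facial animation, which is the role Audio2Face plays in the demo), but the hand-off between stages follows this shape.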

During a Computex pre-briefing, Jason Paul, Nvidia's vice president of the GeForce platform, shared that the technology can scale to more than one character at a time and could theoretically even allow NPCs to talk to each other. However, he noted that testing has not yet been performed.

It remains to be seen whether developers will adopt the entire ACE toolkit showcased in the demo. Nevertheless, titles like S.T.A.L.K.E.R. 2: Heart of Chornobyl and Fort Solis are set to incorporate a component known as "Omniverse Audio2Face." This feature aims to synchronize the facial animations of 3D characters with the speech of their voice actors, enhancing the overall immersion in these games.

Read on to learn how Nvidia's advancements are shaping the future of gaming.

As a reminder, back in February Nvidia presented a "supercomputer with artificial intelligence."

Nataliia Huivan
Professional author in IT Industry

Author of articles and news for Atlasiko Inc. I do my best to create qualified and useful content to help our website visitors understand more about software development and modern IT tendencies and practices. Constant innovation in the IT field and communication with top specialists inspire me to seek knowledge and share it with others.

Share your thoughts in the comments below!

Have any ideas or suggestions about the article or website? Feel free to write to us.
