NVIDIA’s ACE will revolutionize how you play games on Windows

Gaming is about to get a lot more real

by Alexandru Poloboc, News Editor
  • Gaming on Windows is about to become more fun and interactive.
  • NVIDIA presented ACE during its recent Computex keynote.
  • Using this new technology, NPCs can talk to player characters in real time.

You might not know it yet, but during yesterday’s keynote address at Computex in Taiwan, NVIDIA’s CEO, Jensen Huang, showed off a new technology for game developers.

And yes, this new tech involves AI and cloud servers, which have become the company’s big new focus.

Without spoiling anything just yet, know that this new tech will make use of generative AI along the lines of OpenAI’s ChatGPT and Google’s Bard.

NPC interactions will become increasingly real

This new technology is called NVIDIA ACE (Avatar Cloud Engine) for Games. It’s not the shortest of names, but the acronym will do.

In practice, it will allow developers to add NPCs that talk to player characters in real time, with unscripted dialogue powered by AI chatbots similar to ChatGPT, Bing Chat, and Bard.

Furthermore, it will also allow NPC facial animations to match that unscripted dialogue. How’s that for some 2023 tech?

With it, you can bring NPCs to life through NeMo model alignment techniques. First, employ behavior cloning to teach the base language model to perform role-playing tasks according to instructions.
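NVIDIA hasn’t published ACE’s training code, so take this as a rough illustration of what behavior cloning on role-play demonstrations generally looks like. Here is a minimal Python sketch assuming a Hugging-Face-style causal language model; `base_model`, `tokenizer`, and the `demos` list of (instruction, response) pairs are all placeholders, not anything from NVIDIA:

```python
import torch.nn as nn

# Illustrative only: `base_model`, `tokenizer`, and `demos` stand in for
# whatever NeMo model and role-play demonstration data a team actually uses.
def behavior_cloning_step(base_model, tokenizer, demos, optimizer):
    """One supervised pass: teach the model to imitate scripted role-play."""
    loss_fn = nn.CrossEntropyLoss()
    for instruction, response in demos:
        # Concatenate the designer's instruction with the target in-character reply.
        tokens = tokenizer(instruction + response, return_tensors="pt")
        input_ids = tokens["input_ids"]
        # Standard next-token prediction: predict position t+1 from positions <= t.
        logits = base_model(input_ids).logits
        loss = loss_fn(
            logits[:, :-1, :].reshape(-1, logits.size(-1)),
            input_ids[:, 1:].reshape(-1),
        )
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```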

To further align the NPC’s behavior with expectations, you can then apply reinforcement learning from human feedback (RLHF) to fold in real-time feedback from designers during development.
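NVIDIA hasn’t detailed how that designer feedback would be applied, but RLHF typically starts by training a reward model on pairs of replies that humans have ranked. A minimal sketch of that first step, where `reward_model` and the ranked token sequences are our own assumptions:

```python
import torch.nn.functional as F

# Illustrative only: `reward_model` scores a tokenized dialogue turn, and a
# designer has marked `preferred_ids` as better than `rejected_ids`.
def preference_step(reward_model, preferred_ids, rejected_ids, optimizer):
    """Bradley-Terry loss: push the preferred reply's score above the rejected one's."""
    r_good = reward_model(preferred_ids)  # scalar score per sequence
    r_bad = reward_model(rejected_ids)
    loss = -F.logsigmoid(r_good - r_bad).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The NPC model is then fine-tuned to maximize that learned reward, which is what lets designer feedback steer behavior without rewriting dialogue by hand.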

The company also stated that these AI NPCs are controlled by NeMo Guardrails, which will hopefully keep them from saying weird or even offensive things to gamers.
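NeMo Guardrails is already available as an open-source Python package, so we can show roughly what wrapping an NPC’s chat endpoint with it might look like. The config folder and its contents below are our own assumptions, not anything shown in the demo:

```python
from nemoguardrails import LLMRails, RailsConfig

# Load a guardrails configuration (Colang flow files plus a model config)
# from a local folder; "./npc_guardrails" is a hypothetical path.
config = RailsConfig.from_path("./npc_guardrails")
rails = LLMRails(config)

# Each player line is routed through the rails, which can block or redirect
# off-limits topics before the NPC's reply ever reaches the game.
reply = rails.generate(messages=[
    {"role": "user", "content": "Say something offensive about my party."}
])
print(reply["content"])
```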

NVIDIA showed off a brief demo of ACE, which was posted on YouTube. It was created in Unreal Engine 5 with ray tracing enabled and MetaHuman tech for the NPC character models.

NVIDIA also used technology from Convai, a startup that’s building AI characters for games.

Keep in mind that Convai used NVIDIA Riva for speech-to-text and text-to-speech, NVIDIA NeMo for the large language model that drives the conversation, and Audio2Face for AI-powered facial animation driven by voice input.
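Put together, that’s a four-stage loop: player speech in, NPC speech (and a matching face) out. Here’s a hedged sketch of the glue code; the function names and signatures are ours, not NVIDIA’s:

```python
from typing import Callable

# Illustrative stand-ins for the real services: Riva ASR/TTS, the NeMo-backed
# dialogue model, and Audio2Face. None of these names come from NVIDIA's SDKs.
def make_npc_pipeline(
    transcribe: Callable[[bytes], str],     # Riva speech-to-text
    generate_reply: Callable[[str], str],   # NeMo LLM drives the conversation
    synthesize: Callable[[str], bytes],     # Riva text-to-speech
    animate_face: Callable[[bytes], None],  # Audio2Face lip-sync and expressions
) -> Callable[[bytes], bytes]:
    """Wire the four stages into one voice-in, voice-out NPC turn."""
    def handle_turn(player_audio: bytes) -> bytes:
        text = transcribe(player_audio)   # what the player just said
        reply = generate_reply(text)      # unscripted, in-character response
        speech = synthesize(reply)        # NPC voice audio
        animate_face(speech)              # drive the MetaHuman face from the audio
        return speech
    return handle_turn
```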

The AI NPC shown in the demo is definitely not perfect. His speech pattern was stilted and, dare we say, artificial.

That being said, it’s more than likely that speech patterns will be improved and become more natural in the months and years ahead.

However, NVIDIA did not say when ACE for Games will be available to game developers, so let’s not get too far ahead of ourselves.

Audio2Face technology, which matches the facial animation to a game character’s speech, is being used in two upcoming games: the third-person sci-fi game Fort Solis, and the long-awaited post-apocalyptic FPS sequel S.T.A.L.K.E.R. 2: Heart of Chornobyl.

What are your thoughts on this recent news from NVIDIA? Share your opinions with us in the comments section below.
