The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Fri, 14 Mar, 8:06 AM UTC
2 Sources
[1]
I played with Nvidia's AI NPC prototypes -- now they're real, and I fear I'll never finish a game again
We've been covering Nvidia's ACE AI NPCs for a while now, and the company has announced the tech will officially be coming to a few games later this month. The first titles out of the gate are life sim inZOI and hack'n'slasher Naraka: Bladepoint, with shooters Black Vultures: Prey of Greed and Fate Trigger coming soon after. Alongside this, we've seen demos of it running as an ally in PUBG, and even giving an enemy boss a voice in MIR5.

Now, a big part of me is excited to finally see this become a reality after a couple of years as a prototype. But now that it's real, I'm a little nervous about whether I'm going to be playing story-driven games or hopping into AI therapy sessions.

Nvidia's Avatar Cloud Engine (or ACE for short) has been in its prototyping phase for a while now. In fact, the first time I got to try it was all the way back at Computex 2023 for Laptop Mag. Put simply, it's the answer to all those constraining and repetitive non-player character (NPC) dialogue trees: a bunch of pre-packaged AI models (called NIMs) that can be run on the GPU to give your standard NPC more life than the pre-built options they have today.

It starts with developers giving the character a backstory, which can be written in pretty plain text. That backstory is then used to train a small language model (SLM) that handles the conversation side of things locally, on-device. Think of it as an LLM, but... smaller, so it can run without dragging down your GPU's performance too much. After that, other NIMs help with the rest: text-to-speech gives them a voice, and Audio2Face makes their lip movements and facial expressions match what is being said.

It's a full generative AI solution for NPCs, and the results have been impressive over the past couple of years. Tony Polanco got to see a demo involving talking to the owner of a cyberpunk café, but (not to show off) I played something way cooler.
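To make the flow concrete, here is a very loose sketch of the pipeline described above: plain-text backstory in, then SLM, text-to-speech, and Audio2Face in sequence. None of these class or function names come from Nvidia's actual SDK; they are purely illustrative stand-ins.

```python
from dataclasses import dataclass

# Hypothetical sketch of an ACE-style NPC pipeline. All names here are
# illustrative assumptions, not Nvidia's real API.

@dataclass
class NPCBackstory:
    name: str
    biography: str      # plain-text backstory written by the developer
    personality: str    # e.g. "chatty bellboy who loves cooking"

class SmallLanguageModel:
    """Stand-in for the on-device SLM that handles conversation."""
    def __init__(self, backstory: NPCBackstory):
        self.system_prompt = (
            f"You are {backstory.name}. {backstory.biography} "
            f"Personality: {backstory.personality}"
        )

    def reply(self, player_line: str) -> str:
        # A real SLM would run GPU inference here; we just echo.
        return f"[{self.system_prompt[:20]}...] responds to: {player_line}"

def npc_turn(slm: SmallLanguageModel, player_line: str) -> dict:
    """One conversational turn: text reply, then the speech and face NIMs."""
    text = slm.reply(player_line)
    audio = f"<tts-audio for: {text!r}>"           # text-to-speech NIM
    face = "<lip-sync/expressions matched to audio>"  # Audio2Face NIM
    return {"text": text, "audio": audio, "animation": face}
```

The point of the shape, not the contents: the conversational model is small and local, and the voice and facial animation are downstream consumers of its text output.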
In partnership with Inworld AI, I got to try to uncover the mystery of Covert Protocol: four characters to talk to in order to get into a private hotel suite. The experience was, frankly, insane. I just walked up to someone and could say whatever I wanted to say. Sometimes they were confused, but most of the time they would answer whatever random or inane questions I had for them. It got to the point where I spent about 20 minutes talking to the bellboy about his favorite recipes.

And I knew there was a whole other thing I needed to do: understand what the main target, Diego, cares about through well-worded questions to get myself into that hotel room. But I didn't move on for ages, much to the annoyance of everyone else waiting for a turn. And therein lies a problem.

This reminds me of my time as a Dungeon Master in Dungeons & Dragons... bit of a weird comparison, but let me explain. You see, I have to describe the location to my players and tell them what is currently surrounding their characters. To help them on their journey, I try to place emphasis on the things they should interact with, from a certain book on a bookshelf to a half-written note on a desk. But instead, for reasons I cannot explain, there's a particular Half-Elf Bard in my campaign who seems dedicated to investigating shoes and nothing else, to the point that I've had to stay up for hours writing a whole side campaign to keep him busy.

And that is where Nvidia ACE could be problematic. You saw it above: I was too interested in finding out the hotel clerk's entire life story to even realize I needed to talk to the main suspect sitting by the fire. Curiosity is undeniable, and with this expanded level of conversation, it's hard not to explore how far you can take a chat with someone. The potential to get sidetracked could be huge, and I know that some of my favorite moments in games are indeed when things go slightly off the rails.
But can it get in the way of a story's structure if all the NPCs are loaded with lore? I think so. Given that we're taking baby steps into the world of AI NPCs, it's going to take a bit of time to reach the point of an inconsequential character taking over your entire game. Not only that, but developers can wire in guardrails on how far a conversation can go. As time goes on, though, and the amount of AI compute power in the GPUs doing the calculations increases, we could see ACE's capabilities continue to grow.

Seeing it come to a life sim like inZOI gets me pretty damn excited; I'm keen to see what chaos I can bring to this virtual world. And watching it utilized for an agentic ally in Naraka, and a companion in the upcoming Black Vultures, is certainly an interesting way to keep these games moving along. All I'm saying is that when traditional dialogue trees are torn down in favor of essentially an LLM for each NPC, you may find some players getting a little lost on their way to finding what they need to move on.
[2]
Nvidia's AI characters are about to transform how we see video games
AI characters are coming to video games perhaps sooner than many people imagined. Just after it emerged that Sony was testing an AI-generated Aloy in Horizon Forbidden West, Nvidia has announced that its ACE AI game characters will make their debuts in real games this month.

With thoughts and behaviour governed by an on-device Nvidia ACE Small Language Model, autonomous game characters will launch in Krafton's life simulation game inZOI (a kind of AI Sims 4 for 2025) and in the mobile version of 24 Entertainment's free-to-play wuxia action battle royale game Naraka: Bladepoint. And Nvidia says that more developers will quickly follow suit.

inZOI players will be able to transform the simulated city's NPCs, called Zois, into ACE autonomous game characters with higher levels of artificial intelligence by activating a 'Smart Zoi' experimental setting. We're told that enhanced Zois will adapt and respond more realistically and intelligently to their environment based on their personalities. For instance, a Smart Zoi with a considerate personality might decide to help a lost character with directions. Inevitably, some are already wondering if Smart Zois will be able to commit crime. After all, we're still waiting for GTA 6.

Nvidia says players will also be able to observe their Smart Zoi's inner thoughts in real time as they go about their day, and influence them directly with free-form text input to modify their actions and life goals. The innovation, whether you're a fan or not, has the power to change how we conceive of video games, making them potentially richer and more surprising as AI NPCs, teammates or enemies act freely. Applying this to an open-world game could open endless possibilities.

In Naraka: Bladepoint Mobile, ACE-powered teammates will help players battle enemies and hunt for loot.
The AI Teammate can join your party in Free Training, battling alongside you, finding specific items you need, swapping gear, offering suggestions on skills to unlock, and making plays that'll help you achieve victory.

ACE characters will be added to Naraka: Bladepoint Mobile from 27 March, while inZOI will launch in early access on 28 March at 00:00 UTC. Only gamers with GeForce RTX GPUs will be able to use the Smart Zoi feature.

Nvidia ACE is a suite of RTX-accelerated digital human technologies designed to bring game characters to life with generative AI. The tech continues to advance, now incorporating new vision and audio language models, enhancing facial animation with new model architectures, and accelerating game development with updated ACE plugins for Maya and Unreal Engine for offline authoring.

Two more upcoming games using the tech are Black Vultures: Prey of Greed and Fate Trigger. In the former, an FPS being developed by Wemade's THISMEANSWAR Studio, ACE will power an in-game AI companion. Saroasis Studios' Fate Trigger, a 3D anime-style, hero-based tactical shooter made on Unreal Engine 5, is using ACE's Audio2Face-3D to generate real-time facial animations and lip-sync for all dialogue.
Nvidia's Avatar Cloud Engine (ACE) AI NPCs are set to launch in actual games, promising more interactive and dynamic gaming experiences while raising questions about potential impacts on gameplay and storytelling.
Nvidia's Avatar Cloud Engine (ACE) AI-powered non-player characters (NPCs) are set to make their official debut in real games this month, marking a significant milestone in the evolution of interactive gaming experiences [1][2].
The first games to incorporate ACE technology include:
- inZOI, Krafton's life simulation game, launching in early access on 28 March
- Naraka: Bladepoint Mobile, 24 Entertainment's wuxia action battle royale, from 27 March
- Black Vultures: Prey of Greed and Fate Trigger, coming soon after
Nvidia's ACE technology combines several AI models to create more lifelike and responsive NPCs:
- A small language model (SLM), trained on a plain-text backstory, that handles conversation on-device
- Text-to-speech, which gives the character a voice
- Audio2Face, which matches lip movements and facial expressions to the generated speech
These models run locally on the GPU, ensuring minimal performance impact while providing enhanced NPC interactions [1].
The introduction of ACE AI NPCs could significantly transform the gaming landscape:
- Free-form conversation replaces constrained, repetitive dialogue trees
- AI teammates can battle alongside players, find items, swap gear, and suggest skills to unlock
- Players can observe NPCs' inner thoughts in real time and steer their goals with free-form text input
While the technology shows promise, some potential issues have been raised:
- Players may get sidetracked by open-ended conversations and lose track of a game's main objectives
- Lore-rich NPCs could undermine a story's intended structure, so developers may need to wire in guardrails on how far conversations can go
As AI compute power increases, the capabilities of ACE technology are expected to grow, potentially revolutionizing how players interact with game worlds and characters [1]. This could lead to more dynamic and unpredictable gaming experiences, blurring the lines between scripted content and AI-generated interactions.
Game developers are already exploring ways to integrate ACE technology into various genres, from life simulations to shooters [1][2]. As the technology matures, it's likely that more studios will adopt AI-powered NPCs to create richer, more interactive game worlds.
Nvidia unveils AI-powered NPCs at CES 2025, sparking debate about the future of gaming and the role of human creativity in game development.
3 Sources
Game developers are exploring the use of AI to create more interactive and lifelike non-player characters (NPCs) in video games. This technological advancement promises to enhance player immersion and create more dynamic gaming experiences.
7 Sources
Krafton and Nvidia collaborate to introduce AI-driven companions called Co-Playable Characters (CPCs) for PUBG and inZOI, promising enhanced gameplay experiences through natural language interaction and adaptive behavior.
7 Sources
NVIDIA's Avatar Cloud Engine (ACE) technology is set to debut in an actual game, promising more realistic and engaging NPC interactions. The upcoming mecha combat game 'Mecha Break' will showcase this groundbreaking AI-powered feature.
2 Sources
As generative AI makes its way into video game development, industry leaders and developers share their perspectives on its potential impact, benefits, and challenges for the future of gaming.
3 Sources