Curated by THEOUTPOST
On Thu, 16 Jan, 12:05 AM UTC
3 Sources
[1]
Why Nvidia's big bet on AI NPCs has me unconvinced
The future of gaming is rife with lifelike non-player characters powered by generative AI models -- or that's what Nvidia thinks, at least. At CES 2025, Nvidia released a series of demos for new AI tools and features for gaming powered by its RTX 50-series GPUs. They range from AI audio editing for streamers to unpredictable "AI bosses," but the one that stands out the most to me is AI NPCs meant to supplant human-written dialogue with a chatbot. Nvidia is clearly betting big on AI becoming omnipresent in gaming, especially generative AI, but whether gamers actually follow suit on that vision is another question entirely. I'm personally betting the answer is no, and here are a few major reasons why.

Rumors and hype around "AI NPCs" in games have been bubbling for years now, but they have yet to truly materialize. Nvidia is trying to change that with an AI model called Nvidia ACE. In a demo for ACE, Tao Zhang, CEO of TiGames, plays a tech demo for a "future project" called ZooPunk, where he interacts with an NPC running on Nvidia ACE and uses it to customize his ship. Zhang is able to talk to the NPC using his voice, rather than choosing scripted dialogue options, and it replies with AI-generated lines.

At first glance, that sounds cool, but unfortunately, the way it actually sounds in the demo is less impressive. The AI NPC's dialogue is stiff, robotic, and hollow. It reminds me more of an automated voicemail message than an actual human. While it's a neat concept, it's a far cry from what human voice actors are capable of.

An upcoming indie title, Dead Meat, is another great example. The game has players interrogate suspects in a murder mystery. Players can ask the NPCs any question they want, and the NPCs respond with AI-generated dialogue. On paper, this seems like it could be a clever way to use AI for NPCs. However, the generated dialogue shown in a demo video is firmly in the uncanny valley.
It still definitely sounds like an AI, and it leaves me wondering how the game maintains the consistency needed for something as complicated as a murder mystery. Of course, it's certainly possible this technology will improve over the next few years. Maybe the lines will sound more natural, or the AI-generated voices will sound less cold and robotic. Even if that does happen, it still might not be enough to make AI NPCs the new norm.

On a more practical level, I'm doubtful about how useful unscripted NPCs are in many types of games, especially RPGs and story-driven titles. NPCs are often how players get quests, learn important lore, or experience key plot points. The specific dialogue lines written for an NPC can be crucial to the story or even the basic functions of a game. Sometimes an NPC's dialogue is so well written that it has a real emotional impact on the player.

Star Wars Jedi: Fallen Order is a prime example for me. It's one of my favorite games, partly because of the meaningful interactions Cal Kestis has with its cast of NPCs. Another great example is Tales of Kenzera: ZAU, in which emotional bonds and arcs with NPCs are a core pillar of the game's phenomenal, meaningful ending. NPC interactions, and their emotional weight, are the result of intentional writing decisions, not something that can be randomly generated by an AI (or effectively delivered by one). Is it really worth losing that to have slightly less repetitive conversations with shopkeeper NPCs? I don't think so.

One of my favorite games is The Outer Worlds (which is set to get a sequel this year). Whenever I recommend it to a friend, I always mention how much I love one of the companion NPCs, Parvati. It's not just me, either. Parvati is widely beloved by fans of the game thanks to her lovable personality and surprisingly deep, unique character arc. Notably, Parvati is not AI-generated.
NPCs like her are woven into the fabric of their games and play a major role in the communities that rally around them. Whenever I shower praise on Parvati to a friend, it's partly because I'm excited for them to experience her story, too. That kind of shared experience could disappear in a world where NPCs run on AI-generated dialogue. Two players could encounter drastically different NPCs that share the same name and face but little else, not to mention completely different stories. That leaves less ground for a shared love and appreciation of the game or its story. Plus, you lose the intentional artistic decisions that go into crafting an NPC's character arc and thoughtfully performing their lines.

Video games are a creative art form. AI can be a tool in the artistic toolbox, but it's not a replacement for the artist. Using AI to streamline game development or optimize hardware makes sense and has a net positive impact on the gaming experience. However, replacing nuanced, engaging NPCs with hollow AI chatbots, in my opinion, has a net negative impact and could strip away a vital part of any great game.
[2]
NVIDIA's AI NPCs are a nightmare
The rise of AI NPCs has felt like a looming threat for years, as if developers couldn't wait to dump human writers and offload NPC conversations to generative AI models. At CES 2025, NVIDIA made it plainly clear the technology is right around the corner. PUBG developer Krafton, for instance, plans to use NVIDIA's ACE (Avatar Cloud Engine) to power AI companions, which will assist and banter with you during matches. Krafton isn't stopping there -- it's also using ACE in its life simulation title inZOI to make characters smarter and generate objects.

The use of generative AI in games seems almost inevitable, as the medium has always toyed with new methods for making enemies and NPCs seem smarter and more realistic. Even so, seeing several NVIDIA ACE demos back-to-back made me genuinely sick to my stomach. This wasn't just slightly smarter enemy AI -- ACE can craft entire conversations out of thin air, simulate voices and try to give NPCs a sense of personality. It does all of that work locally on your PC, powered by NVIDIA's RTX GPUs. But while all of that might sound cool on paper, I hated almost every second of the AI NPCs in action.

TiGames' ZooPunk is a prime example: It relies on NVIDIA ACE to generate dialog, a virtual voice and lip syncing for an NPC named Buck. But as you can see in the video above, Buck sounds like a stilted robot with a slight country accent. If he's supposed to have some sort of relationship with the main character, you couldn't tell from the performance.

I think my visceral aversion to NVIDIA's ACE-powered AI comes down to this: There's simply nothing compelling about it. No joy, no warmth, no humanity. Every ACE AI character feels like a developer cutting corners in the worst way possible, as if you're seeing their contempt for the audience manifested as a boring NPC. I'd much rather scroll through some on-screen text; at least then I wouldn't have to have conversations with uncanny robot voices.
During NVIDIA's Editor's Day at CES, a gathering for media to learn more about the new RTX 50-series GPUs and their related technology, I was also underwhelmed by a demo of PUBG's AI Ally. Its responses were akin to what you'd hear from a pre-recorded phone tree. The Ally also failed to find a gun when the player asked, which could have been a deadly mistake on a crowded map. At one point, the PUBG companion spent around 15 seconds attacking enemies while the demo player was shouting for it to get into a car. What good is an AI helper if it plays like a noob?

Poke around NVIDIA's YouTube channel and you'll find other disappointing ACE examples, like the basic speaking animations in the MMO World of Jade Dynasty (above) and Alien: Rogue Incursion. I'm sure many devs would love to skip the chore of developing decent lip-syncing technology, or adopting someone else's, but for these games leaning on AI just looks awful.

To be clear, I don't think NVIDIA's AI efforts are all pointless. I've loved seeing DLSS get steadily better over the years, and I'm intrigued to see how DLSS 4's multi-frame generation could improve 4K and ray-tracing performance in demanding games. The company's neural shader technology also seems compelling, in particular its ability to apply a realistic sheen to materials like silk, or evoke the slight transparency you'd see from skin. These aren't enormous visual leaps, to be clear, but they could help deliver a better sense of immersion.

Now I'm sure some AI boosters will say the technology will only get better from here, and that at some undefinable point in the future it could approach the quality of human ingenuity. Maybe. But I'm personally tired of being sold AI fantasies when we know the key to great writing and performances is giving human talent the time and resources to refine their craft.
And on a certain level, I think I'll always feel like the director Hayao Miyazaki, who described an early example of an AI-generated CG creature as "an affront to life itself." AI, like any new technology, is a tool that can be deployed in many ways. For things like graphics and gameplay (like the intelligent enemies in F.E.A.R. and The Last of Us), it makes sense. But when it comes to communicating with NPCs, writing their dialog and crafting their performances, I've grown to appreciate human effort more than anything else. Replacing that with lifeless AI doesn't seem like a step forward in any way.
[3]
These Nvidia AI NPCs Are Just as Obnoxious as Real Players
Nvidia and Krafton's PUBG Ally is a bad friend and a worse co-op buddy, but I prefer it to AI that doesn't know the difference between orange and purple.

The RTX 50-series GPUs may have been the stars of the show this CES, but AI is Nvidia's real bread and butter. In Nvidia's perfect world, PCs and games would all be subsumed by AI that runs directly on your PC without cloud processing. If there were anything I wish AI could do, it would be to stop talking so much.

Instead, what I saw walking through a closed suite of Nvidia demos was a rough first draft of AI concepts. Some of its planned models could make sense with more fine-tuning. The company is working on a text-to-body-motion framework for developers, though we have yet to see how professional animators react to the tool. Additional AI lip-syncing and "autonomous enemies" are supposed to transform the typical boss fight, where you learn and then exploit a predictable pattern, into something more spontaneous for both enemies and players.

The generative AI made to renovate your classic NPC or enemy AI in games doesn't quite stick the landing. Take, for instance, the AI "Ally" Nvidia helped create for the battle royale game PUBG: Battlegrounds. It was a silly trailer for what was essentially your usual co-op AI, but one you can issue orders to with your voice. I asked it to talk like a pirate. The AI responded, "Ah, the pirate life. Let's hope we don't end up walking the plank. Stick with me, and we'll find a buggy soon." It's somehow worse to ascribe personality to an AI that talks back to you than to deal with the unblinking NPC buddy you can yell at for getting in your way every few seconds.

I found the AI was pretty slow at helping me acquire a weapon. When I inevitably got shot and crawled toward him, pleading with him to pick me up, the bot shot absently at the enemy, then ran past into a neighboring house. I only survived because the developers offered invincibility if you got shot during the demo.
The PUBG Ally is the same asinine experience as playing co-op with a bot, except this one always talks back to you in complete sentences. Krafton also uses Nvidia's tech in inZOI, a Sims-style life simulator that puts chatbots in characters' heads. Despite the AI supposedly planning and making life choices for the in-game characters, it still looked like a dull version of The Sims, lacking most of the charm of those games. I see the vision; it's just not there yet.

We've previously tried Nvidia's ACE, its framework for autonomous game NPCs, at last year's CES and again later in 2024. Based on those demos, the generated voice lines are incredibly stiff and sound like they're being read from a parody book of cyberpunk and detective genre cliches. Nvidia didn't have an update for its AI NPCs this time; instead, the company showed off a demo of an upcoming game called ZooPunk. It's supposedly about a rebel bunny with two laser swords on his back. How can anybody make that uncool? By adding scratchy AI voice lines to its characters.

Our demo saw Nvidia ask a character in-game to change the color of his spaceship from beige to purple. "No problem. Let's get started," the AI replied. Instead of purple, it made the ship orange. On the second try, the AI managed to find the right end of the color spectrum. Then, Nvidia asked it to change the decal on the ship to a narwhal fighting a unicorn. Instead, the AI generated an image that seemed to show a narwhal doing a Lady and the Tramp spaghetti scene with a ball-shaped horned gelding.

The real showcase of Nvidia's AI was in its desktop companion apps. First was G-Assist, a chatbot you can use to automatically manipulate settings in Nvidia's apps and calculate the best in-game settings for your PC's specs. The chatbot can even give you a graph of your GPU and CPU performance levels over time, and it should run directly on your PC rather than in the cloud.
I would prefer it if it could adjust your settings for you rather than making you turn the knobs yourself. Still, at the very least, it is an intriguing use for AI. But AI can't just be intriguing. It needs to be wild. Otherwise, who cares?

There was a demo of an AI head made for streamers that was surprisingly rude to users. However, the real star was a talking mannequin head sitting like a gargoyle on your home desktop screen. It's using the company's neural faces, lip-syncing, text-to-voice, and skin models to try to make something "realistic." In my estimation, it still lands on the wrong side of the uncanny valley.

The idea is you could talk to it like you would Copilot on Windows, but you can also drop files onto its forehead for the AI to read and regurgitate information from. Nvidia demoed this with an old PDF of the booklet that came with Doom 3 (as if to remind us of when playing games on PC was as simple as buying and installing a disc). Based on the booklet, it could offer a rundown of the game's story.

You can add a face to AI, but it will still just be a text-generation machine that doesn't comprehend what it says. Nvidia mentioned it could add G-Assist to its animated assistant, but that wouldn't make it any faster or more reliable. It would still be a big black hole of AI taking up the bottom of your desktop, staring at you with lifeless eyes and an empty smile.
Nvidia unveils AI-powered NPCs at CES 2025, sparking debate about the future of gaming and the role of human creativity in game development.
At CES 2025, Nvidia showcased its latest innovation in gaming technology: AI-powered non-player characters (NPCs) driven by its Avatar Cloud Engine (ACE) [1]. This technology, designed to run on Nvidia's RTX 50-series GPUs, aims to revolutionize gaming by creating more dynamic and interactive NPCs. However, the announcement has sparked a heated debate within the gaming community about the potential benefits and drawbacks of this AI integration.
Nvidia's ACE technology promises to bring unprecedented levels of interactivity to gaming NPCs. In a demo for a future project called ZooPunk, players could engage in voice conversations with AI-generated characters, moving beyond traditional scripted dialogue options [1]. The technology also powers AI companions in games like PUBG, where the AI can assist and banter with players during matches [2].
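The kind of dialogue loop these demos imply (a player utterance conditioned on a character persona and the running conversation, producing a generated reply) can be sketched in miniature. Everything below is hypothetical: the `AINpc` class and its canned stand-in for a language model are illustrative only, not Nvidia's ACE API, and a real pipeline would also wrap speech-to-text and text-to-speech around this loop.

```python
# Hypothetical sketch of the dialogue half of an AI-NPC pipeline:
# player utterance -> persona-conditioned generation -> NPC reply.
# The "model" is a canned stub; a real system would run a local LLM
# conditioned on the persona and the accumulated conversation history.
from dataclasses import dataclass, field

@dataclass
class AINpc:
    name: str
    persona: str                           # system-style prompt for the character
    history: list = field(default_factory=list)

    def _generate(self, prompt: str) -> str:
        # Stand-in for a local model call; real inference would use
        # self.persona plus self.history, not a template string.
        return f"{self.name}: As a {self.persona}, here's my take on '{prompt}'."

    def reply(self, player_utterance: str) -> str:
        # Keep both sides of the exchange so later turns stay consistent.
        self.history.append(("player", player_utterance))
        line = self._generate(player_utterance)
        self.history.append(("npc", line))
        return line

buck = AINpc(name="Buck", persona="gruff ship mechanic")
print(buck.reply("Paint my ship purple."))
```

The stored `history` is the piece critics implicitly worry about: without it, each reply is generated from scratch and the character has no continuity from one exchange to the next.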
Despite the ambitious vision, early demonstrations of the technology have faced significant criticism:
Unnatural Dialogue: Many observers noted that the AI-generated dialogue sounded stiff, robotic, and lacked the nuance of human-written lines [1].
Lack of Emotional Depth: Critics argue that AI-generated NPCs may struggle to deliver the emotional impact and character depth that skilled writers and voice actors can provide [1].
Inconsistent Performance: In demos, AI companions sometimes failed to respond appropriately to player commands or game situations [3].
Potential Loss of Shared Experiences: There are concerns that AI-generated NPCs could lead to disparate player experiences, potentially diminishing the shared narratives that often unite gaming communities [1].
The introduction of AI NPCs raises questions about the future of game development:
Creative Control: Some worry that relying on AI for NPC interactions could reduce the creative control that writers and designers have over game narratives [1].
Development Efficiency: Proponents argue that AI NPCs could streamline game development, potentially allowing for more expansive and dynamic game worlds [2].
Balance of Technology and Art: The debate highlights the ongoing challenge of balancing technological advancements with the artistic elements of game design [2].
While the current implementation of AI NPCs has received mixed reviews, many acknowledge that the technology is in its early stages. As Nvidia and game developers continue to refine these systems, it remains to be seen how AI will ultimately shape the future of gaming experiences and whether it can overcome the current limitations to deliver truly engaging and emotionally resonant characters [3].