Curated by THEOUTPOST
On Fri, 11 Oct, 12:06 AM UTC
4 Sources
[1]
Call of Duty's new chat moderation AI sees major success: 43% drop in toxicity
In brief: Much speculation has arisen over the possible benefits that technologies like generative AI and machine learning might bring to video games. While image reconstruction has proven useful and generative AI has reportedly entered the production process, AI-based analysis has begun to impact voice chat moderation in competitive gaming spaces.

Activision Blizzard reports that exposure to toxic voice chat in Call of Duty has declined by 43 percent since the beginning of this year. The publisher credits the recent implementation of AI-based moderation for the results, which have convinced it to expand its use when Call of Duty: Black Ops 6 launches on October 25.

The publisher introduced the moderator, ToxMod, when it launched Call of Duty: Modern Warfare III last November. According to ToxMod's website, the system analyzes text transcripts of in-game voice chat and detects keywords based on multiple factors. To tell the difference between trash talk and harassment, ToxMod monitors keywords and reactions to comments, recognizing player emotions and understanding the game's code of conduct. Furthermore, the system estimates the perceived age and gender of the speaker and recipients to better understand each conversation's context.

ToxMod can't ban anyone on its own, but it can quickly flag violations for review by human moderators. Activision then decides whether to warn or mute players, only issuing bans after repeated infractions. The number of repeat offenders in Modern Warfare III and Call of Duty: Warzone has fallen by 67 percent since Activision implemented improvements in June 2024. In July alone, 80 percent of players caught violating voice chat rules didn't re-offend.

All regions except Asia currently support ToxMod, and Call of Duty uses the system to moderate voice chat in English, Spanish, and Portuguese. When Black Ops 6 launches, Activision will add support for French and German. Meanwhile, text-based chat and username moderation expanded from 14 to 20 languages in August. Community Sift has been Call of Duty's text moderation partner since the first Modern Warfare reboot launched in 2019, blocking 45 million messages since Modern Warfare III's November release.

Using AI to moderate player behavior is far less controversial than employing the technology for in-game assets. Late last year, AI-generated art appeared in cosmetic skins for Modern Warfare III. Amid the historic number of gaming industry layoffs occurring around that time, some fear publishers will try to replace artists with AI models.
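The workflow described above can be sketched as a simple flag-and-escalate loop: an automated classifier surfaces suspect lines, a human moderator confirms or dismisses each flag, and enforcement escalates only after repeated confirmed infractions. The Python outline below is an illustrative sketch under assumed names, thresholds, and penalty tiers; it is not Modulate's or Activision's actual implementation.

```python
# Hypothetical sketch of the flag-and-escalate workflow described above.
# An automated classifier flags suspect voice chat lines, a human moderator
# reviews each flag, and enforcement escalates only after repeat offenses.
# Names, thresholds, and penalty tiers are illustrative, not Modulate's or
# Activision's actual API.

from dataclasses import dataclass, field

@dataclass
class Flag:
    player_id: str
    transcript: str
    severity: float  # 0.0 (benign) .. 1.0 (urgent harm), from the classifier

@dataclass
class ModerationQueue:
    review_threshold: float = 0.7                 # only surface likely violations
    offenses: dict = field(default_factory=dict)  # player_id -> confirmed offense count
    pending: list = field(default_factory=list)   # flags awaiting human review

    def submit(self, flag: Flag) -> None:
        """Automated step: queue only high-severity flags for human review."""
        if flag.severity >= self.review_threshold:
            self.pending.append(flag)

    def resolve(self, flag: Flag, violation_confirmed: bool) -> str:
        """Human step: a moderator confirms or dismisses the flag; the tool
        itself never bans anyone."""
        self.pending.remove(flag)
        if not violation_confirmed:
            return "no action"
        count = self.offenses.get(flag.player_id, 0) + 1
        self.offenses[flag.player_id] = count
        if count == 1:
            return "warning"
        if count == 2:
            return "voice chat mute"
        return "ban"  # only after repeated infractions

queue = ModerationQueue()
queue.submit(Flag("player42", "<abusive line>", severity=0.93))
for flag in list(queue.pending):
    print(queue.resolve(flag, violation_confirmed=True))  # -> "warning"
```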
[2]
Ahead of Black Ops 6's Launch, Activision Says Call of Duty's AI-Powered Voice Moderation Has Already Had a Massive Impact on Toxicity - IGN
Activision has delivered a report on the work it's done to combat toxicity in Call of Duty, insisting it's already made a huge impact ahead of the launch of Black Ops 6.

Call of Duty was for years associated with toxic player behavior, both in voice chat and over text. But Activision has worked to reverse the franchise's thorny reputation, launching 2023's Modern Warfare 3 with in-game voice chat moderation powered by AI. Activision is using ToxMod from Modulate, which uses AI to identify toxic speech, including hate speech, discriminatory language, and harassment, in real time and enforce against it. Addressing privacy concerns from the Call of Duty community, Activision has insisted voice chat is only monitored and recorded "for the express purpose of moderation," and that the system "is focused on detecting harm within voice chat versus specific keywords."

"The Disruptive Behavior team knows that hype and passion is part of Call of Duty's DNA," Activision said in a fresh progress update. "Voice and text-based moderation tools in Call of Duty don't target our competitive spirit - it enforces against behavior identified in the Call of Duty franchise Code of Conduct, targeting harassment and derogatory language.

"Similar to Modern Warfare 3, the Call of Duty Code of Conduct will be visible during the initial in-game flow when players first launch core multiplayer modes in Black Ops 6, asking players to acknowledge the Code of Conduct pillars."

Since rolling out improved voice chat enforcement in June, Call of Duty has seen a combined 67% reduction in repeat offenders of voice chat-based offenses in Modern Warfare 3 and Warzone, Activision added. In July 2024, 80% of players who were issued a voice chat enforcement since launch did not re-offend. Overall exposure to disruptive voice chat continues to fall, Activision said, dropping by 43% since January.

At launch, Black Ops 6 will expand voice moderation to French and German, in addition to English, Spanish, and Portuguese. This is well-timed, given one new feature for Black Ops 6 that's sure to test Call of Duty's AI voice chat moderation to its limit. Black Ops 6 has a new Body Shield feature in multiplayer that lets you grab an enemy and hold them in front of you to soak up bullets while firing off a few rounds of your own. But that's not all: it also enables voice chat between the attacker and the victim, which players certainly had fun with during the game's beta weekends.

Call of Duty's text moderation tech, meanwhile, which analyzes text chat traffic in "near" real time, has blocked over 45 million text-based messages in violation of the Call of Duty Code of Conduct since November 2023. Activision said Call of Duty has also implemented a new analysis system for username reports "to enhance efficiency and accuracy, surfacing critical reports to our moderation team for investigation and action."

Activision has used research from the California Institute of Technology (Caltech) to improve its approach here, and has worked with researchers from the University of Chicago Booth School of Business on ways to better identify and combat disruptive behavior. Activision said it's "actively engaged" in research surrounding disruptive behavior and prosocial activities in gaming.
[3]
Activision weighs in on toxicity ahead of Black Ops 6 launch | Digital Trends
Activision is looking to curb the toxic cesspool that is Call of Duty multiplayer with a few features coming in its upcoming game Black Ops 6. While the development team released its voice and text moderation system into Modern Warfare 3 at launch, Black Ops 6 will have a more robust version day one. Voice moderation will be available in five languages -- it already supports English, Spanish, and Portuguese, and will add French and German when Black Ops 6 launches. Text tools can monitor even more, with the ability to work with 20 languages, including Japanese, Turkish, and Romanian.

The voice moderation system, called ToxMod, was developed by Modulate.ai and uses machine learning to monitor voice chat for bad behavior and potential toxicity in real time. It doesn't outright ban players, nor can it make enforcement decisions, but it can flag possible code of conduct violations to a team of real people who can issue bans or suspensions. Call of Duty uses Community Sift, a Microsoft-developed community AI moderation tool, to monitor text chat.

"We built ToxMod to help sift through the proverbial haystack and identify the worst, most urgent harms," Modulate CEO Mike Pappas told GamesIndustry.biz. "This allows moderators to prioritize where they can have the most impact."

Activision says in a new blog post that since rolling out improved voice chat enforcement in June, there's been a "67% reduction in repeat offenders of voice-chat based offenses in Modern Warfare 3 and Call of Duty: Warzone," along with exposure to voice toxicity dropping 43%. It also says that it has blocked over 45 million text messages since last November.

While Call of Duty is arguably the most popular multiplayer shooter series in the world, it's known for community toxicity problems, which has led Activision to release a code of conduct and tools, like ToxMod, to help punish bad actors. The code of conduct says that the studio does not "tolerate bullying or harassment," condemns cheating, and encourages players to report any incidents.
[4]
Black Ops 6 is shaping up to be the least toxic Call of Duty game ever
Activision is doing all it can to reduce toxicity within the Call of Duty community, and with improved features launching in Black Ops 6, a big reduction could be achieved.

Call of Duty is one of the most popular franchises in the history of gaming, and with that popularity it developed what is regarded in the gaming sector as quite a "toxic community." Activision is aware of this and has begun implementing anti-toxicity features such as its voice and text moderation system, which is designed to monitor voice and text chats for bad behavior in real time. As the launch of Black Ops 6 approaches, Activision plans to roll out a new and improved version of the technology, called ToxMod, in the latest title.

ToxMod was developed by Modulate.ai and uses artificial intelligence to monitor voice chat. If it detects an infringement of the Call of Duty code of conduct, it flags the event for a human to deliberate on whether it's a bannable offense. According to Modulate CEO Mike Pappas, who spoke to GamesIndustry.biz, "We built ToxMod to help sift through the proverbial haystack and identify the worst, most urgent harms. This allows moderators to prioritize where they can have the most impact."

Notably, Activision has said the technology has already been rolled out in Modern Warfare 3 and Call of Duty: Warzone, and according to the publisher, there has been a "67% reduction in repeat offenders of voice-chat-based offenses in Modern Warfare 3 and Call of Duty: Warzone." Additionally, Activision wrote in a blog post that exposure to voice toxicity has dropped by 43% and that it has blocked more than 45 million text messages since last November.
Activision reports a 43% drop in toxic voice chat exposure in Call of Duty games, crediting AI-based moderation. The system will expand to more languages with the upcoming release of Black Ops 6.
Activision Blizzard has reported a significant reduction in toxic behavior within the Call of Duty franchise, thanks to the implementation of AI-based moderation tools. As the gaming industry grapples with issues of online harassment and toxic environments, this development marks a notable step forward in creating safer and more enjoyable multiplayer experiences.
At the heart of this initiative is ToxMod, an AI-powered moderation system developed by Modulate.ai. Introduced with the launch of Call of Duty: Modern Warfare III in November 2023, ToxMod analyzes voice chat transcripts in real time, detecting potentially harmful content based on multiple factors [1][2].
The system goes beyond simple keyword detection: to distinguish trash talk from harassment, it monitors reactions to comments, recognizes player emotions, weighs the game's Code of Conduct, and estimates the perceived age and gender of the speaker and recipients to better understand each conversation's context [1]. Activision stresses that the tool focuses on detecting harm within voice chat rather than on specific keywords [2].
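As a rough illustration of weighing several contextual signals rather than keywords alone, the sketch below combines a keyword hit with the target's reaction and the speaker's tone into a single harm score before anything is flagged. The signal names, weights, and threshold are assumptions made for the example, not ToxMod's actual model.

```python
# Hypothetical illustration: combine contextual signals into a harm score
# instead of flagging on keywords alone. The features, weights, and threshold
# are invented for the example and do not reflect ToxMod's real model.

FLAG_THRESHOLD = 0.7

def harm_score(keyword_hit: bool,
               target_reaction: float,    # 0 = laughed it off .. 1 = reported or left the match
               speaker_hostility: float,  # 0 = playful tone .. 1 = aggressive tone
               repeated_at_target: bool) -> float:
    score = 0.0
    if keyword_hit:
        score += 0.35
    score += 0.30 * target_reaction
    score += 0.25 * speaker_hostility
    if repeated_at_target:
        score += 0.10
    return min(score, 1.0)

# Friendly trash talk: same keyword, but the target shrugs it off -> not flagged.
print(harm_score(True, 0.1, 0.2, False) >= FLAG_THRESHOLD)  # False (~0.43)
# Harassment: hostile tone, repeated targeting, target reacts badly -> flagged.
print(harm_score(True, 0.9, 0.9, True) >= FLAG_THRESHOLD)   # True (~0.95)
```

In this toy scoring, the same phrase lands below the flag threshold in friendly banter and well above it in sustained, hostile targeting, mirroring the trash-talk-versus-harassment distinction the sources describe.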
The implementation of ToxMod has yielded remarkable results: exposure to disruptive voice chat has dropped by 43% since January, repeat offenses in Modern Warfare III and Warzone have fallen by a combined 67% since improved voice chat enforcement rolled out in June, and in July 2024, 80% of players issued a voice chat enforcement did not re-offend [1][2].
Currently, ToxMod moderates voice chat in English, Spanish, and Portuguese. With the upcoming release of Call of Duty: Black Ops 6 on October 25, support will expand to include French and German [1][2].
Alongside voice chat moderation, Activision has also bolstered its text-based moderation: Community Sift, Call of Duty's text moderation partner since 2019, analyzes text chat in near real time and has blocked over 45 million messages that violated the Code of Conduct since November 2023, while text and username moderation expanded from 14 to 20 languages in August [1][2].
Activision has also implemented a new analysis system for username reports, enhancing efficiency and accuracy in addressing problematic usernames [2].
Activision emphasizes that the moderation tools are designed to target harassment and derogatory language without stifling the competitive spirit that is part of Call of Duty's DNA [2]. The company has addressed privacy concerns, stating that voice chat monitoring is solely for moderation purposes and focuses on detecting harmful content rather than specific keywords [2].
As Black Ops 6 prepares for launch, Activision continues to refine its approach to combating disruptive behavior: players will be asked to acknowledge the Code of Conduct when they first launch core multiplayer modes, voice moderation will gain French and German support, and the company is drawing on research from Caltech and the University of Chicago Booth School of Business to better identify disruptive behavior and encourage prosocial activity [2].
With these measures in place, Black Ops 6 is poised to potentially become the least toxic Call of Duty game to date, setting a new standard for online multiplayer experiences in the gaming industry [4].