Call of Duty voice chat is famously not great. Activision is hoping AI can fix that.
According to a company blog post on Wednesday, the next major COD release will use AI to observe and report on toxic voice chat in multiplayer matches. Specifically, Activision is deploying ToxMod, a bit of AI tech developed by a company called Modulate, to make sure people aren't super racist while gunning each other down in team deathmatch.
For what it's worth, ToxMod is interesting if nothing else. As noted by PC Gamer, Modulate claims ToxMod can listen to conversational context to determine whether or not something is truly hate speech.
Modulate also claims ToxMod can detect language used to groom players for white supremacist groups. Modern Warfare III will be by far the biggest game ToxMod has been deployed in so far, so it'll be quite a test. One thing to note, however: the software itself does not punish players for hate speech. Instead, it merely reports violations of the Call of Duty code of conduct, and humans at Activision take further action.
Of all the potential uses for AI, this one is less objectionable than most. After all, it's not trying to replicate or replace human creativity; instead, it could spare people from having to listen to hate speech. But that also means there could be real limitations to the tech, or ways to circumvent its filters.
We'll all find out over the next few months, as every Call of Duty player becomes a guinea pig.