Call of Duty Will Use AI To Moderate Voice Chats

Activision has partnered with a company called Modulate to moderate voice chats using an AI technology called ToxMod. According to The Verge, the tool "will work to identify behaviors like hate speech, discrimination, and harassment in real time." From the report: ToxMod's initial beta rollout in North America begins today. It's active within Call of Duty: Modern Warfare II and Call of Duty: Warzone. A "full worldwide release" (it does not include Asia, the press release notes) will follow on November 10th with the release of Call of Duty: Modern Warfare III, this year's new entry in the franchise. Modulate's press release doesn't include too many details about how exactly ToxMod works. Its website notes that the tool "triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context."

The company's CEO said in a recent interview that the tool aims to go beyond mere transcription; it also takes factors like a player's emotions and volume into account in order to differentiate harmful statements from playful ones. It is noteworthy that the tool (for now, at least) will not actually take action against players based on its data but will merely submit reports to Activision's moderators.
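Modulate has not published implementation details, but the flow it describes (triage voice clips, score toxicity from the transcript plus context signals like emotion and volume, then hand flagged clips to human moderators rather than acting automatically) can be sketched roughly. The sketch below is purely illustrative; every name, signal, and threshold is invented and none of it comes from ToxMod itself.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical illustration of the described flow: triage voice clips,
# score toxicity using the transcript plus context signals (emotion,
# volume), and forward flagged clips to human moderators instead of
# taking action automatically. All names and thresholds are invented.

@dataclass
class VoiceClip:
    player_id: str
    transcript: str          # output of a speech-to-text front end
    emotion_score: float     # 0.0 calm .. 1.0 aggressive (assumed signal)
    volume_db: float         # relative loudness of the clip

@dataclass
class ModeratorReport:
    player_id: str
    transcript: str
    toxicity: float
    context: str

HARASSMENT_TERMS = {"slur1", "slur2"}   # placeholder word list

def toxicity_score(clip: VoiceClip) -> float:
    """Blend keyword hits with emotion/volume context so playful trash
    talk scores lower than the same words shouted in anger."""
    keyword_hit = any(term in clip.transcript.lower() for term in HARASSMENT_TERMS)
    base = 0.7 if keyword_hit else 0.1
    context_boost = 0.2 * clip.emotion_score + (0.1 if clip.volume_db > 6.0 else 0.0)
    return min(1.0, base + context_boost)

def triage(clips: List[VoiceClip], threshold: float = 0.6) -> List[ModeratorReport]:
    """Flag clips above the threshold; humans make the final call."""
    reports = []
    for clip in clips:
        score = toxicity_score(clip)
        if score >= threshold:
            reports.append(ModeratorReport(
                player_id=clip.player_id,
                transcript=clip.transcript,
                toxicity=score,
                context=f"emotion={clip.emotion_score:.2f}, volume={clip.volume_db:.1f}dB",
            ))
    return reports  # forwarded to a human moderation queue, per the press release
```

A production system would presumably replace the keyword check with a trained classifier and a real speech-to-text pipeline, but the human-in-the-loop reporting step at the end mirrors what Activision says the tool will do for now.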
  • by TwistedGreen ( 80055 ) on Wednesday August 30, 2023 @08:38PM (#63810820)

    What a colossal waste of computing resources... and good luck appealing it when you get banned for no reason.

  • Goodbye (Score:2, Funny)

    by Lije Baley ( 88936 )

    Well, that's pretty much the end of COD then.

    • This. The entire COD player base could be banned by "analyzing" their emotional state. Forget just being a teenager; the whole point of the game is to kill others. That alone is one hell of an emotional thing to filter out as "normal" behavior, but now you're also trying to flag "hate speech" (what counts as this?), "discrimination" (again, the whole point of the game is to kill others...), and "harassment" (does spawn camping count? How about constantly targeting the weakest player on the opposing team?).
      • I look forward to the goofy new slurs COD players will conjure. Gyat, kill that cheugy bussy!
      • Re:Goodbye (Score:4, Informative)

        by AmiMoJo ( 196126 ) on Thursday August 31, 2023 @04:19AM (#63811412) Homepage Journal

        Great post until it got into the conspiracy theory part.

        They are in this to make money. They know that being called the n word or a homosexual (in the pejorative sense) is not a good experience for players, especially younger ones. They want people to use voice chat because it hooks them in, builds up friendships and communities, and keeps them coming back.

        It's a running joke that voice chat on COD is often just a 12 year old screaming obscenities at you. They calculated that having an LLM babysit them is more profitable, that's all.

        • They are in this to make money. They know that being called the n word or a homosexual (in the pejorative sense) is not a good experience for players, especially younger ones. They want people to use voice chat because it hooks them in, builds up friendships and communities, and keeps them coming back.

          It's a running joke that voice chat on COD is often just a 12 year old screaming obscenities at you. They calculated that having an LLM babysit them is more profitable, that's all.

          A lot of players enjoy the shit talk. Taking it away is likely to negatively affect continued interest in the game / platform.

        • They are in this to make money.

          No, they are not. LLMs require data-center money to operate and maintain. Activision isn't charging for this service, nor do they charge for COD multiplayer, which makes it a cost center. They have to take the money to pay for the LLM out of their own profits. To say nothing of the initial creation costs. Which again, using a generic LLM made for real-world interactions isn't going to work well in the realm of policing chat in a game where the whole point is to kill others and celebrate doing so. They'd n

      • "How about constantly targeting the weakest player on the opposing team?)"

        I guess in other games, 'kill the healer first' will be banned as hate-speech too.

  • Trash talk is part of the Call of Duty experience. People who don't like it should just be politely reminded that there are mute buttons for each player, plus a settings option to turn the voice chat off altogether. Nobody's being forced to listen to it.
    • Call of Duty multiplayer is a *team* game. You kinda need to be able to communicate with your teammates to play it.

  • Blizzard already announced their Defense Matrix system that was intended to record and police voice chat in Overwatch 2. It doesn't work.

    Blizzard is part of Activision.

    • by gweihir ( 88907 )

      Well, except for very specialized areas, "AI" does not work. So no surprise there.

      • Someone somewhere might be able to create an AI that can record millions of sketchy voice chat lines, process them, and filter for banwords. Blizzard/Activision has yet to demonstrate that capability.

  • You have a new thing to play with, and gamers are the worst people to playtest a voice chat system: they will subvert it but not tell you ...

  • 'member when going online, games used to just show a warning saying you're dealing with real people who might say anything, and that it is possible to turn the voice chat off, and that was enough because we weren't all girlymanbabies?
  • Technology that could be used to automatically detect and censor abuse can also be used to automatically detect and censor other speech/chat, like criticism of a government or of specific policies. As such, this is creating a turnkey dictatorship.
  • Thank FSM naughty words won't be used as I'm eviscerating people.
  • I strongly suggest deploying AI chat moderation to regulate the proposals offered by corporate executives to detect and limit the psychotic bullshit they suggest.
  • Just use something external, such as Discord. Using in-game voice only leads to drama.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...