Starting November 10, inappropriate speech may become rarer in the franchise's live voice chats. With the release of the new installment, Call of Duty: Modern Warfare III, Activision has decided to crack down harder on a persistent scourge in this universe: hate speech, harassment, and discrimination.
To do this, the company has partnered with Modulate, a specialist in tools for combating toxic behavior online. In an experimental phase since August 30 in North America with Call of Duty: Modern Warfare II and Call of Duty: Warzone, the "ToxMod" tool will use artificial intelligence to moderate live voice chats.
Help for “real” moderators
On its site, Modulate explains that the tool works in three stages: first, a triage phase in which voice chat data is analyzed; then an analysis phase in which the AI examines the tone, context, and perceived intent of conversations; and finally, an "escalation" phase that allows "real" moderators to step in and deal with problematic behavior.
The AI will therefore not be able to decide on its own to suspend or ban a player for their behavior; it can only flag that player to Activision's teams.
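The triage/analysis/escalation flow described above can be pictured with a short sketch. This is purely illustrative: the function names, the keyword list, the tone score, and the 0.5 threshold are all hypothetical stand-ins, not Modulate's actual ToxMod API. The one property it mirrors from the article is that the AI only fills a review queue, while any sanction remains a human decision.

```python
from dataclasses import dataclass

# Placeholder vocabulary for the cheap first-pass filter (hypothetical).
OFFENSIVE_TERMS = {"slur", "threat"}

@dataclass
class Clip:
    transcript: str
    tone_score: float  # 0.0 (calm) .. 1.0 (aggressive); assumed model output

def triage(clip: Clip) -> bool:
    """Stage 1: inexpensive filter — does the clip deserve a closer look?"""
    return any(term in clip.transcript.lower() for term in OFFENSIVE_TERMS)

def analyze(clip: Clip) -> float:
    """Stage 2: combine keyword hits and tone into a risk score (0..1)."""
    keyword_hits = sum(term in clip.transcript.lower() for term in OFFENSIVE_TERMS)
    return min(1.0, 0.4 * keyword_hits + 0.6 * clip.tone_score)

def moderate(clips: list[Clip], threshold: float = 0.5) -> list[Clip]:
    """Stage 3: escalate risky clips into a queue for human moderators.

    Note: this function never bans anyone — it only flags, matching the
    division of labor the article describes.
    """
    return [c for c in clips if triage(c) and analyze(c) >= threshold]

review_queue = moderate([
    Clip("nice shot, well played", 0.1),
    Clip("that was a threat, watch out", 0.9),
])
```

Here only the second clip survives triage and scores above the threshold, so it alone lands in `review_queue` for a human to judge.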
1 million accounts already sanctioned
For now, AI moderation only covers players who chat in English. However, the game's official site indicates that other languages "will follow later".
Activision also specifies that the "anti-toxicity" moderation measures in place since the launch of COD: Modern Warfare II have already allowed it to restrict voice chat for more than a million accounts.
Of that million, 20% of players reportedly did not reoffend after receiving a first warning. For the most recalcitrant, Activision says it has restricted certain features, in particular the ability to participate in voice or text chat.
Source: BFM TV
