Call Of Duty's New Anti-Toxicity Voice Detection Is Working, More Than 2 Million Accounts Actioned
For the launch of Modern Warfare III, Activision released a new anti-toxicity voice moderation program that can automatically detect toxic speech and take action. Activision now says the program has been highly effective so far, with action taken against more than 2 million accounts for "disruptive" voice chat.
The program originally rolled out in North America in English only, but it has since expanded worldwide (excluding Asia) and added support for Spanish and Portuguese. Voice moderation is also up and running in Modern Warfare II and Warzone.
Activision also shared some findings from the voice moderation program so far. The company said only one in five players reported toxic speech when its own systems detected it, a rate it called "unfortunate." While Activision can take action without players reporting bad speech, direct player reporting remains "critical" to the system working as well as it can, the company said.
"To encourage more reporting, we've rolled out messages that thank players for reporting and in the future, we're looking to provide additional feedback to players when we act on their reports," Activision said.
Since the voice chat moderation system debuted, Activision said Call of Duty games using the program have seen an 8% reduction in repeat offenders. Additionally, Activision reported there was about a 50% reduction in players being exposed to "severe" instances of toxic chat since Modern Warfare III launched in November 2023.
Players detected using offensive speech can have their voice chat muted and face other penalties, such as restricted access to certain social features and the loss of text chat.
Looking ahead, Activision said it plans to keep strengthening the anti-toxicity voice moderation system by improving its tools and adding support for more languages.
"We understand this is ongoing work, but we are committed to working with our community to make sure Call of Duty is fair and fun for all," the company said.
Additionally, Activision updated the language of the Call of Duty code of conduct. Here is the new statement:
"Treat Everyone with Respect
We do not tolerate bullying or harassment, including derogatory comments based on race, gender identity or expression, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin, or the amplification of any person, agenda, or movement that promotes discrimination or violence based on the above.
All members of our community should be treated with dignity and respect.
Communication with others, whether using text or voice chat, must be free of offensive or harmful language. Hate speech and discriminatory language is offensive and unacceptable, as is harassment and threatening another player."
Activision isn't the only gaming company trying to combat toxicity through voice moderation. Xbox, for instance, lets users capture toxic voice chat and share it with Microsoft.