Modern Warfare 3 to Feature Real-Time Voice Chat Moderation via AI to Combat Toxic Behavior, FAQ Released


Tired of the toxic behavior of some people in Call of Duty games? Are you cautious about using voice chat in COD games because of how awful some people can be? Activision seems to think this is a big issue, as the company has announced that it will implement real-time voice chat moderation starting with Call of Duty: Modern Warfare 3.

Teaming up with Modulate, Activision aims to deliver global real-time voice chat moderation. Here’s an explanation of it via the press release:

Call of Duty’s new voice chat moderation system utilizes ToxMod, the AI-Powered voice chat moderation technology from Modulate, to identify in real-time and enforce against toxic speech—including hate speech, discriminatory language, harassment and more. This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.

Players can expect the initial beta of this voice chat moderation tech to begin today in Modern Warfare 2 and Warzone with the launch of Season 5 Reloaded. A full worldwide release (excluding Asia) will then follow alongside Modern Warfare 3 on November 10. Support will begin in English, with additional languages to follow at a later date.

An FAQ was released alongside the announcement.

Call of Duty Voice Chat Moderation FAQ

1. Why has Call of Duty added voice chat moderation?
Call of Duty®’s new voice moderation protects players by proactively identifying toxic behavior and enforcing the Call of Duty Code of Conduct, allowing our community to focus on the fun.

Player reporting is still valuable and available in game for players to communicate instances of toxic or disruptive behavior they may encounter; however, voice chat moderation will increase our ability to identify and enforce against bad behavior that has gone unreported.

2. How does Call of Duty’s voice chat moderation work?
Voice Chat Moderation is managed and operated by Activision and uses the AI-powered model ToxMod from Modulate. This system is integrated into select Call of Duty titles (see below) and is managed by Activision. Voice chat is monitored and recorded for the express purpose of moderation.

Call of Duty’s Voice Chat Moderation system is focused on detecting harm within voice chat versus specific keywords. Violations of the Call of Duty Code of Conduct are subject to account enforcement.

3. What types of disruptive behavior are detected?
The Call of Duty Voice Moderation system moderates based on the existing Call of Duty Code of Conduct. Voice chat that includes bullying or harassment will not be tolerated.

4. What are the penalties for violating the Call of Duty Code of Conduct?
Read the Security and Enforcement Policy for information regarding violations and penalties.

5. Which titles and regions are protected by Call of Duty’s voice chat moderation?
Initial beta rollout of the Call of Duty Voice Chat Moderation system will begin in North America only for Call of Duty®: Modern Warfare® II and Call of Duty: Warzone®.

Global rollout, excluding Asia, will begin with Call of Duty®: Modern Warfare® III on November 10, 2023.

6. What languages are supported?
At initial beta rollout, the Voice Chat Moderation System will analyze voice chat in English. Following the global launch, voice chat moderation will expand to additional languages to be announced later.

7. Can I opt-out of voice chat moderation?
Players that do not wish to have their voice moderated can disable in-game voice chat in the settings menu.

8. I received a notification in-game that I was reported. How can I check my status?
The status of reports can be found in your in-game Notifications. Open the menu in the top right corner of the Home screen and navigate to Notifications (the bell icon). From there, select a Report Status Changed notification, then select View Report Status. The status screen will include the reason for and details of a report, as well as the duration of a penalty, if applicable.

9. Does voice chat moderation enforcement happen in real time?
Detection happens in real time, with the system categorizing and flagging toxic language based on the Call of Duty Code of Conduct as it is detected. Detected violations of the Code of Conduct may require additional reviews of associated recordings to identify context before enforcement is determined. Therefore, actions taken will not be instantaneous. As the system grows, our processes and response times will evolve.

10. Does this system ban “trash-talk” from Call of Duty?
The system helps enforce the existing Code of Conduct, which allows for “trash-talk” and friendly banter. Hate speech, discrimination, sexism, and other types of harmful language, as outlined in the Code of Conduct, will not be tolerated.

11. Does AI enforce violations of the Code of Conduct it detects?
Call of Duty’s Voice Chat Moderation system only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations.

Call of Duty: Modern Warfare 3 releases on November 10 for PS4, PS5, Xbox One, Xbox Series X|S, and PC. You can read about its upcoming beta schedule if you want to play it before launch.
