There has been much speculation about the potential benefits of using technologies like artificial intelligence and machine learning in video games. While image and text generation tools are already useful and generative artificial intelligence has entered the content creation process, AI-based analysis has also begun to influence chat management in the competitive gaming space. One such system is the ToxMod content monitoring system, whose use in Call of Duty has significantly reduced players' verbal violations.
Call of Duty’s official blog reports that the ToxMod content monitoring AI has reduced inappropriate content by 43% since the beginning of this year. Given this success, the same system is likely to be used in Call of Duty: Black Ops 6, which is scheduled for official release on October 25.
Activision Blizzard released Call of Duty: Modern Warfare III last November and unveiled the ToxMod content monitoring system in the game at the same time. According to the ToxMod website, the system analyzes textual transcripts of in-game voice chat.

Performance of the ToxMod content monitoring system
To distinguish trash talk from genuine verbal abuse, ToxMod monitors keywords and reacts to comments, detects player emotions, and takes the norms of in-game behavior into account. To better understand the context of each conversation, the system also tries to estimate the age and gender of the speaker and the listener.
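The combination described above, keyword matching weighted by an emotion signal, can be illustrated with a minimal sketch. Everything here is hypothetical: `check_utterance`, `KEYWORDS`, and the anger threshold are illustrative stand-ins, not ToxMod's actual API or rules.

```python
# Hypothetical sketch of a keyword-plus-emotion flagging step.
# KEYWORDS and the 0.7 threshold are illustrative assumptions.

KEYWORDS = {"insult_a", "slur_b"}  # placeholder deny-list terms

def check_utterance(transcript: str, anger_score: float) -> bool:
    """Flag an utterance when a listed keyword co-occurs with high anger."""
    words = set(transcript.lower().split())
    has_keyword = bool(words & KEYWORDS)
    # A keyword alone is not enough: combining it with an emotion signal
    # makes it less likely that friendly trash talk gets flagged.
    return has_keyword and anger_score > 0.7
```

The point of the two-signal design is precision: a banned word spoken calmly among friends and the same word spoken in anger produce different decisions.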
The service can’t restrict a player’s access on its own, but it can quickly flag violations for review by human moderators. Activision then decides whether to warn players, mute them, or block their access after repeated violations. The number of offenders in Modern Warfare III and Call of Duty: Warzone has reportedly dropped by around 67% since the implementation of the system in June 2024.
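The escalation path described here, warn, then mute, then block on repeated violations, can be sketched as a simple counter per player. The function name, thresholds, and action ladder below are illustrative assumptions, not Activision's actual policy or tooling.

```python
from collections import Counter

# Hypothetical escalation ladder applied after a human moderator
# confirms a flagged violation; the order warn -> mute -> block
# follows the article, but the one-step-per-offense pacing is assumed.
ACTIONS = ["warn", "mute", "block"]

violations: Counter = Counter()

def confirm_violation(player_id: str) -> str:
    """Record a confirmed violation and return the escalated action."""
    violations[player_id] += 1
    # First offense warns, second mutes, further offenses block access.
    index = min(violations[player_id] - 1, len(ACTIONS) - 1)
    return ACTIONS[index]
```

Keeping the AI in a flag-only role and routing the final decision through human review is what lets the ladder stay simple: the counter only advances on confirmed violations, not on raw flags.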



