There has been much speculation about the potential benefits of technologies like artificial intelligence and machine learning for video games. While generative AI has entered the content-creation pipeline, AI-based analysis has also begun to shape chat moderation in competitive gaming. One such system is ToxMod, a content-moderation tool whose use in Call of Duty has significantly reduced verbal abuse among players.
Call of Duty’s official blog reports that the ToxMod moderation AI has reduced inappropriate content by 43% since the beginning of this year. Given this success, the same system will likely be used in Call of Duty: Black Ops 6, which is scheduled for official release on October 25.
Activision Blizzard released Call of Duty: Modern Warfare III last November and unveiled the ToxMod moderation system alongside it. According to the ToxMod website, the system analyzes textual transcripts of in-game voice chat.
Performance of the ToxMod moderation system
To distinguish ordinary trash talk from genuine verbal abuse, ToxMod monitors keywords, analyzes reactions to comments, detects player emotion, and accounts for in-game behavioral norms. To better understand the context of each conversation, the system also attempts to estimate the age and gender of both the speaker and the listener.
The service can’t restrict a player’s access on its own, but it can quickly flag violations for review by human moderators. Activision then decides whether to warn players, mute them, or block their access after repeated violations. The number of offenders in Modern Warfare III and Call of Duty: Warzone has reportedly dropped by around 67% since the system was implemented in June 2024.
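The two-stage workflow described above (automated flagging, then human-decided escalation from warning to mute to block) can be sketched roughly as follows. This is an illustrative simplification, not ToxMod's actual implementation: the function names, the keyword-matching heuristic, and the three-strike escalation are all assumptions for the sake of the example — the real system reportedly uses emotion and context signals far beyond keyword matching.

```python
from dataclasses import dataclass

@dataclass
class Player:
    """Hypothetical player record tracking prior confirmed violations."""
    name: str
    strikes: int = 0

def flag_message(text: str, banned_terms: set[str]) -> bool:
    """Automated stage: flag a chat line for human review.
    Simplified to keyword matching; the real system also weighs
    emotion, reactions, and conversational context."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return not banned_terms.isdisjoint(words)

def moderate(player: Player, flagged: bool, strike_limit: int = 3) -> str:
    """Human-review stage (simplified): the automated flag alone never
    restricts access; escalation goes warn -> mute -> block on repeats."""
    if not flagged:
        return "ok"
    player.strikes += 1
    if player.strikes >= strike_limit:
        return "block"
    return "warn" if player.strikes == 1 else "mute"
```

For example, a player whose messages are flagged three times in a row would progress from a warning to a mute to a block, while unflagged messages leave their record untouched.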
RCO NEWS