Activision is set to use an AI system to monitor voice chats in Call of Duty: Modern Warfare 3.
Call of Duty: Modern Warfare 3 will have less toxic voice chats
Toxic talk can happen in almost any game, whether you are playing PvP or PvE. Some players simply don't like their teammates' actions and express their frustration with abusive language.
The rebooted Modern Warfare series got its official reveal on August 7. Since then, a steady stream of information about Modern Warfare 3 has been emerging in the run-up to its release date, November 10.
Meanwhile, it's been confirmed that Activision is partnering with a company called Modulate to bring an AI voice chat moderation system to Call of Duty: Modern Warfare 3. ToxMod, Modulate's AI moderation system, detects hate speech, discrimination, and harassment in real time and metes out punishment accordingly.
This is a revolutionary technology that has the potential to make Call of Duty: Modern Warfare 3 online matchmaking a much more wholesome and enjoyable experience.
There will be a beta rollout of ToxMod to the existing Call of Duty titles, Modern Warfare 2 and Warzone, starting on August 30. The AI-based moderation system will then launch worldwide with Modern Warfare 3 on November 10.
This AI moderation system could be a game changer. There are millions of Call of Duty players around the world, and it is impossible to rely on conventional means such as player reporting and manual moderation with such a large community.
We hope the ToxMod rollout is successful and doesn't result in unfair bans for Call of Duty players. Speaking about ToxMod, Activision chief technology officer Michael Vance said, "This is a critical step forward to creating and maintaining a fun, fair and welcoming experience for all players." It is clear that Activision is taking toxicity on its servers very seriously.
Call of Duty: Modern Warfare 3 launches on November 10 for PC, PS4, PS5, Xbox One, and Xbox Series X/S.