Toxicity is prevalent in competitive games like the third-person MOBA Predecessor. Developer Omeda Studios knows this and has implemented a new chat moderation system to curb the issue.
Tackling Toxicity
It is no secret that MOBA games have pretty toxic communities. In League of Legends, for example, toxicity was so severe that Riot Games had to add chat commands that let players opt out of receiving messages from others entirely.
Omeda Studios also wanted to curb toxicity in Predecessor, but the company took a different approach: using AI to help moderate in-game chat.
In a recent blog post, the studio said that the vast majority of moderation in the game had been done manually, a method the company admitted was unsustainable and inefficient. So, after testing various anti-toxicity solutions, the studio settled on an industry-leading AI tool, which is now fully implemented in Patch v0.9.
So, how does it work? The AI tool has been trained to recognize a huge list of derogatory and offensive words and phrases, including threats, insults, and slurs. When it detects a player using such language, it automatically flags them and hands out swift penalties based on the severity of the message. All of this is done without manual review, so bad actors are penalized immediately.
To prevent accidental flagging, the system also takes the player's chat history into account and uses that history to decide whether a punishment is warranted.
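To picture the flow described so far (detect, check history, then penalize), here is a rough Python sketch. It is purely illustrative: the keyword list, severity tiers, and function names are placeholders of our own, not anything taken from Omeda's actual tool, which presumably uses a trained model rather than simple keyword matching.

```python
from dataclasses import dataclass, field

# Toy severity tiers; the real tool's categories and thresholds are not public.
SEVERITY = {"insult": 1, "threat": 2, "slur": 3}

# Placeholder phrase list standing in for the trained model's detection step.
KEYWORDS = {"trash player": "insult", "uninstall or else": "threat"}


@dataclass
class PlayerRecord:
    # Hypothetical per-player history the system might consult before punishing.
    past_flags: list[int] = field(default_factory=list)


def classify(message: str) -> int:
    """Return the highest severity found in the message, or 0 if it looks clean."""
    text = message.lower()
    return max(
        (SEVERITY[label] for phrase, label in KEYWORDS.items() if phrase in text),
        default=0,
    )


def moderate(message: str, player: PlayerRecord) -> str:
    """Flag a toxic message automatically and pick a penalty based on severity
    and the player's chat history, with no manual review in the loop."""
    severity = classify(message)
    if severity == 0:
        return "allow"
    player.past_flags.append(severity)
    # A low-severity first offence only gets the message removed;
    # slurs, threats, or a pattern of flags escalate to a chat ban.
    if severity >= 3 or len(player.past_flags) > 2:
        return "remove_message_and_temp_chat_ban"
    return "remove_message"
```

In this sketch, a first-time, low-severity message is simply removed, while repeated flags or severe content trigger a temporary chat ban, mirroring the behavior Omeda describes.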
Now, things can get heated during a competitive match, which may lead to some players spouting toxic messages. The company said that the AI simply removes these messages, and those who typed them will not be banned outright. But if these players keep up their toxic messaging, they may receive a temporary chat ban.
Players who insist on making inappropriate remarks will face longer and more severe punishments. In fact, if things get out of hand, the AI may impose its harshest penalty: repeat offenders can lose access to the game's chat features for good.
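The escalation itself can be pictured as a simple ladder that maps repeat offences to longer chat bans, ending in a permanent one. Again, this is only an illustration; the offence thresholds and ban lengths below are made-up placeholders, since Omeda has not published exact numbers.

```python
from datetime import timedelta
from typing import Optional

# Hypothetical escalation ladder; the actual durations Omeda uses are not public.
ESCALATION = [
    timedelta(0),          # 1st offence: message removed, no ban
    timedelta(hours=24),   # 2nd offence: 24-hour chat ban
    timedelta(days=7),     # 3rd offence: week-long chat ban
]


def chat_ban_duration(offence_count: int) -> Optional[timedelta]:
    """Return the chat-ban length for the nth offence, or None for a permanent ban."""
    if offence_count <= len(ESCALATION):
        return ESCALATION[offence_count - 1]
    return None  # past the ladder, chat access is revoked for good
```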
That said, the company acknowledges that no automated tool is perfect. Since this is just the first iteration of the AI chat moderation system, it may incorrectly flag players from time to time. Omeda Studios assures the community that it will continue to tweak parameters and fine-tune the tool's performance over time.
The company is hoping that with the AI-powered chat moderation system, the community will see a strong improvement in the health of the in-game chat experience, making Predecessor more fun to play.
But what do you think? Could this new system really help curtail toxicity in Predecessor?