Modulate: how AI-assisted voice moderation tools can help combat toxicity

Voice moderation is a sensitive issue. Players expect privacy, but the halcyon days of early, friendly online gaming are long gone. Today, when players interact with strangers in online games, the result is all too often toxic behaviour. Striking a balance between player privacy and the safety of online communities is the challenge facing games studios.

Boston-based start-up Modulate wants to help game companies clean up toxic behaviour in their games with machine learning-based tools that promise to empower moderators and protect players.

Modulate CEO Mike Pappas told GamesIndustry.biz why the company's voice-native moderation tool ToxMod is more effective than older forms of grief reporting, why studios should build robust codes of conduct amid changing online safety regulations, and how Modulate's technology and guidance are helping to make online communities safer.
