Presented by Modulate
The trust and safety team of social gaming platform Rec Room has achieved significant results in reducing toxicity over the past 18 months. Explore the metrics, tools, and strategies they implemented to enhance player experience, boost engagement, and transform the gaming landscape in this VB Spotlight.
Player experience and safety are paramount for game developers. In a recent VB Spotlight, Mark Frumkin, director of account management at Modulate, and Yasmin Hussain, head of trust and safety at Rec Room, discussed their efforts to combat toxicity and the effectiveness of ToxMod, an AI-powered voice chat moderation solution.
Established in 2016, Rec Room is a social gaming platform with over 100 million users. The platform enables real-time interactions through text and voice chat on various devices, fostering an immersive experience through customizable avatars.
Hussain emphasized the critical role of trust and safety within Rec Room to uphold community standards and create a welcoming environment. However, managing real-time voice interactions poses challenges, particularly in addressing inappropriate player behavior.
By implementing continuous voice moderation across all public rooms and experimenting with different responses to rule violations, Rec Room saw a significant reduction in toxic behavior. Strategies such as instant muting proved effective in deterring misconduct and maintaining player engagement.
Combating toxicity one step at a time
Rec Room’s approach involved identifying and targeting specific player cohorts responsible for violations, leading to tailored interventions that yielded positive outcomes. By stacking interventions and refining moderation strategies, the team successfully tackled toxicity issues.
Creating and running trust and safety experiments
Tracking key metrics related to toxicity and player behavior is essential for optimizing moderation strategies. Clear hypotheses, iterative testing, and data analysis are crucial in shaping player conduct and promoting positive community interactions.
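As an illustration of this kind of hypothesis-driven testing, the sketch below compares re-offense rates between two hypothetical intervention cohorts (a warning-only group and an instant-mute group) using a standard two-proportion z-test. The cohort names and all numbers are invented for demonstration; they are not Rec Room's actual data or methodology.

```python
import math

# Hypothetical experiment data (illustrative only): players per cohort
# and how many re-offended within a week of the intervention.
cohorts = {
    "warning_only": {"players": 4000, "reoffended": 720},
    "instant_mute": {"players": 4000, "reoffended": 510},
}

def reoffense_rate(cohort):
    """Fraction of the cohort that violated the rules again."""
    return cohort["reoffended"] / cohort["players"]

def two_proportion_z(a, b):
    """Z statistic for the difference between two re-offense rates."""
    p1, n1 = reoffense_rate(a), a["players"]
    p2, n2 = reoffense_rate(b), b["players"]
    pooled = (a["reoffended"] + b["reoffended"]) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

z = two_proportion_z(cohorts["warning_only"], cohorts["instant_mute"])
print(f"warning-only rate: {reoffense_rate(cohorts['warning_only']):.1%}")
print(f"instant-mute rate: {reoffense_rate(cohorts['instant_mute']):.1%}")
print(f"z statistic: {z:.2f}")  # |z| > 1.96 means significant at the 5% level
```

With these made-up numbers, the instant-mute cohort's lower re-offense rate is statistically significant, which is the kind of evidence a trust and safety team would look for before rolling an intervention out platform-wide.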
Continuous improvement and adaptation are vital, as Rec Room strives to enhance its moderation processes and leverage AI tools like ToxMod for more effective voice moderation. Identifying pro-social behavior and fostering a positive gaming environment are key objectives for the platform.
The future of AI-powered voice moderation
AI technologies like ToxMod play a significant role in enhancing safety and player experience on platforms like Rec Room. By analyzing data, detecting policy violations, and encouraging positive behaviors, AI tools can drive lasting improvements in moderation strategies.
Looking ahead, the focus will be on developing AI capabilities to not only prevent misconduct but also promote positive interactions among players. The evolution of AI-powered moderation signifies new possibilities for maintaining community standards and ensuring a safe gaming environment.
For more insights on toxicity in games, player behavior modification, and the impact of machine learning, watch the VB Spotlight on demand.
Agenda
- How voice moderation works to detect hate and harassment
- Rec Room’s success in voice moderation strategy
- Insights from voice moderation data for game developers
- Impact of reducing toxicity on player engagement
Presenters
- Yasmin Hussain, Head of Trust & Safety, Rec Room
- Mark Frumkin, Director of Account Management, Modulate
- Rachel Kaser, Technology Writer, VentureBeat (Moderator)