League of Legends Reporting System Has Reduced Racism, Sexism, and Homophobia to Only 2% of Matches
The reporting system actually works.
Harassment and verbal abuse are common in competitive online games. As a result, developers often try to monitor their games to reduce instances of harassment, with varying levels of success. Riot Games, developer of League of Legends, appears to have created a system that significantly lowers the amount of racism, sexism, and homophobia that occurs in the game.
In an article on Recode, Jeffrey Lin, Riot's designer of social systems, described how the introduction of League's "Tribunal" has greatly improved the civility of matches in the game. The system works by giving players an open forum to report, view, and discuss instances of harassment in the game.
The Tribunal also allowed Riot to respond swiftly to player reports of verbal abuse in a match. The system can differentiate between sarcasm and malice, and it can respond to passive-aggressive comments. Consequences are tailored to each situation: some actions necessitate penalties, while others trigger only incentives to behave better. Because the reporting is done by other players, the community has a voice in the proceedings.
As a result, instances of harassment have plummeted. As Lin explained, "Incidences of homophobia, sexism and racism in League of Legends have fallen to a combined 2 percent of all games. Verbal abuse has dropped by more than 40 percent, and 91.6 percent of negative players change their act and never commit another offense after just one reported penalty."
This is great news for competitive games, which have often struggled with vitriolic online interactions. Hopefully the lessons from Riot's efforts can be applied to other titles in the future.