Confronting Toxicity in Gaming: Going Beyond “Mute”

This essay, by Tony Xiao, age 15, is one of the Top 12 winners of our Sixth Annual Student Editorial Contest, for which we received 10,509 entries.

We are publishing the work of all the winners and runners-up this week, and you can find them here as they post. Excerpts from some will also be in the special Learning print section on Sunday, June 9.


The recent spate of white nationalist violence has raised concerns about the role online platforms play in the radicalization of attackers. Analysts have noted the disturbing tendency of YouTube algorithms to lead users to extreme content. Others have bemoaned social media’s role in the viral propagation of racially charged fake news. While internet companies are finally starting to respond (Facebook recently announced a ban on white nationalist content), there remains one lesser-mentioned vehicle for racial desensitization: online gaming.

I don’t mean the violent content of online games. Violent content is a boogeyman over-hyped by pundits. I’m referring to the racist, anti-Semitic way gamers are indoctrinated to speak to each other in the depersonalized realm of online competition. The ritual, similar to fraternity hazing, happens something like this:

A new gamer, let’s call him “Joe,” joins a game of Minecraft, a pixelated world-building game with 100 million active players. Joe tells his teammates he’s new to the game. When he drags his team down, his teammates begin to trash-talk him, firing racist, sexist and homophobic insults his way. After this bout of shaming, Joe builds his skill level. Months later, Joe queues up for a game and sees a novice assigned to his team. When his team loses because of the newcomer’s mistakes, Joe insults him with the same script that was used against him months earlier. Joe is now part of the toxic cycle.

Prominent gaming companies like Blizzard and Riot have started creating systems to combat the hate speech rampant in gaming communities. Certain platforms temporarily mute players after instances of racist profanity. But in most cases, these measures are perfunctory, amounting to a slap on the wrist. Players evade censors easily by omitting letters or adding numerals to ethnic slurs written in game chats.

Gaming companies need to step up their efforts by punishing abusive players with meaningful competitive penalties. E-sports can look to an obvious model: real-world sports. Violence on the hockey rink takes a player off the ice for critical game time. Tennis players can be docked points, games or even matches for verbal abuse. In the world of e-sports, a similar dynamic might include lower maximum health, longer skill cool-down periods, or other handicaps. Unless penalties carry real competitive weight, hate speech will continue to flourish.

Players should self-monitor and realize that the racially charged insults they hurl have real-world consequences. But, knowing the culture as it exists now, perhaps that ship has sailed. Such a deeply rooted problem calls for a strong, top-down approach. It’s time the gaming industry understood that it has a responsibility to stem the spread of hate on its platforms.

Works Cited

Moore, Bo. “Major Game Companies Are Teaming Up to Combat Toxicity in Gaming.” PC Gamer, 22 Mar. 2018.

Schiesel, Seth. “The Real Problem With Video Games.” The New York Times, 13 Mar. 2018.

Weill, Kelly. “How YouTube Built a Radicalization Machine for the Far-Right.” The Daily Beast, 17 Dec. 2018.