Autcraft, a Minecraft server for autistic children, was about to celebrate its ninth anniversary when the troll attacked. They sent explicit photos and abusive messages to the autistic children on Autcraft’s social network, wreaking so much havoc that founder Stuart Duncan was forced to shut down the site. Nearly a decade of community history was lost.

For Duncan it was devastating. As an autistic gamer and father of two sons, one of whom is also autistic, he understands how gaming in a supportive community can provide a safe, reassuringly predictable space for an autistic child. Why would the troll do something so heartless? “I imagine that their lives are lacking so much happiness or love that their only sense of power is to go after the most vulnerable,” Duncan tells me.

Stories like these are depressingly common in the world of gaming, where harassment is endemic. Two-thirds of gamers have experienced toxic behaviour in online multiplayer games, according to a study by games company Unity. Anyone who has played an online shooter will be familiar with the abuse that fills your headphones and can escalate from “noob” to “kill yourself” in seconds.

Online gaming forums too are hotbeds of vitriol. “Hate raids” on Twitch — where mobs of trolls target streamers from minority backgrounds with spam and hate speech — have become so prevalent that streamers boycotted the platform last September in protest. Anti-Defamation League research shows that marginalised groups suffer the most, and that online abuse causes real-world harm: 64 per cent of players reported being emotionally affected by attacks, and 11 per cent reported depressive or suicidal thoughts as a result.

Such toxicity is not limited to gaming. It exists all over the internet, particularly on social media, where news feed algorithms promote the most provocative content. But gaming seems to get the worst of it. This is partly because games afford anonymity, which can reduce the empathy, restraint and accountability people feel — a principle known as the “online disinhibition effect”. Meanwhile the internet’s connectivity supplies trolls with vast, eager audiences and lets harassers organise into mobs with ease, catalysing campaigns such as 2014’s Gamergate.

Yet individual trolls are only part of the problem. They thrive in a permissive culture that normalises such behaviour. This is perhaps due to the demographic slant of early gaming — while gamers are now a diverse group, their customs were mostly enshrined in the 1980s and 1990s by teenage boys. In gamer culture, “trash talk” and dark humour are the norm, as modelled by popular streaming personalities. While not every instance is abusive, the line between well-intentioned ribbing and harassment is razor-thin and utterly subjective. If young gamers grow up seeing this language and behaviour accepted as part of the culture, they will replicate rather than question it.

Why aren’t games companies doing more to address this? It might be because developers are weathering their own storms of toxicity. Since 2020 there has been a series of revelations about harassment and abuse within many of gaming’s biggest companies, including Activision Blizzard, Sony, Riot and Ubisoft. How can we expect gamers to behave themselves when developers are just as bad?

So what can actually be done to tackle trolls? The first step lies within the community: the Unity survey found that the majority of players ignore antisocial behaviour when they see it. While it can be hard to call out harassers, ignoring them only normalises and perpetuates such behaviour. Beyond direct confrontation, trolls can also be reported through in-game moderation tools.

Most games include options to mute or block problem players, yet these strategies mask the problem rather than prevent it. Ultimately, the responsibility to stop trolls cannot be dumped on players, who have little power to alter game worlds. The change needs to come from developers.

The fact that the situation has been allowed to get this bad also tells a bigger story about the values of tech corporations. They often create platforms without considering the ethical and safety implications, then avoid addressing problems until they can no longer be ignored. Yet these companies created the spaces and they profit from them. It’s up to them to take responsibility and find a solution.

While we wait, Stuart Duncan isn’t letting his troll win. He’s working 16-hour days to rebuild the Autcraft site on a new platform where he will have more power to tackle abusers himself. Like every gamer, he understands that the majority of the gaming community is characterised by kindness and a passion for play. “It’ll grow all over again,” he says. “It’s just going to take time.”
