
Minecraft needs to invest in more hate moderation, ADL study finds

That’s what the Anti-Defamation League (ADL) recommends: that Minecraft invest in content moderation and create more robust community guidelines. The recommendation comes after the organization spent three months examining anonymized in-game chat data. The study, conducted in collaboration with Take This, the Middlebury Institute of International Studies, and GamerSafer, focused specifically on how Minecraft deals with hate speech.

The researchers chose Minecraft not just because it’s popular (though it certainly is, with 141 million active players), but also because, as the ADL puts it, the “decentralized, player-driven nature of Minecraft Java Edition” offers a novel way to evaluate hate and harassment in gaming spaces. And although the results are specific to Minecraft, the ADL’s recommendations resonate in most online spaces.

Just a taste — Minecraft is huge, with many, many servers where players congregate, so it’s impossible to study the entire universe. Instead, the researchers focused on three servers of different sizes and target audiences:

  • Server 1 contained about 20,000 players, mostly 14 to 18 years old, with strict rule enforcement and a player-to-moderator ratio of about 464:1. This server, the most active, accounted for about 94 percent of the study’s data.
  • Server 2 had around 1,000 players, mostly 15 to 20 years old, and only two moderators, with little to no moderation enforcement.
  • Server 3 had only 400 players, an impressive 41:1 player-to-moderator ratio (a quick headcount estimate follows this list), and extensive, active moderation. Most of the players here were at least 16 years old.
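
For a rough sense of what those ratios mean in headcount terms, here is a quick back-of-the-envelope calculation using only the approximate figures quoted above:

```python
# Rough check: how many moderators do the reported player-to-moderator
# ratios imply? Player counts and ratios are the approximate figures above.
servers = {
    "Server 1": (20_000, 464),  # ~20,000 players at roughly 464:1
    "Server 3": (400, 41),      # ~400 players at roughly 41:1
}
for name, (players, ratio) in servers.items():
    print(f"{name}: ~{players / ratio:.0f} moderators")
# Server 1: ~43 moderators
# Server 3: ~10 moderators
```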

Using GamerSafer’s plugins, the researchers were able to track 458 disciplinary actions against 374 different users. They combined these formal reports with textual analysis of the chat logs to find patterns and unreported hate speech.
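
As a concrete illustration of that kind of cross-referencing, here is a minimal Python sketch that flags chat messages with a crude keyword check and surfaces senders who never appear in the formal reports. The data shapes and the keyword check are hypothetical stand-ins for illustration, not GamerSafer’s plugin or the study’s actual pipeline.

```python
# Hypothetical sketch: flag possibly hateful messages, then keep only those
# whose senders do not already appear in the formal disciplinary reports.
from dataclasses import dataclass


@dataclass
class ChatMessage:
    user_id: str
    text: str


def looks_hateful(text: str, slur_list: set[str]) -> bool:
    """Crude keyword check standing in for a real hate-speech classifier."""
    tokens = text.lower().split()
    return any(tok in slur_list for tok in tokens)


def unreported_incidents(messages, reported_user_ids, slur_list):
    """Return flagged messages whose senders never show up in formal reports."""
    return [
        m for m in messages
        if looks_hateful(m.text, slur_list) and m.user_id not in reported_user_ids
    ]
```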

Bans work — The researchers found that temporary bans — one of the moderators’ primary tools for keeping players in check — are for the most part effective at curbing bad behavior. Temporary bans were used as a moderation technique in 46 percent of all cases examined.

In the three months of logs examined, a whopping 40 percent of all formal reports were filed for “hacking” (exploiting prohibited advantages); 16 percent were for harassment, another 10 percent for hate speech, and 9 percent for sexual misconduct. Of the 1,463,891 messages examined, 2 percent were classified as highly toxic, 1.6 percent as sexually explicit, and 0.5 percent as hateful (many targeting sexuality and gender).
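
For a sense of absolute scale, those shares translate into rough message counts as follows (rough because the published percentages are rounded):

```python
# Converting the reported shares of the 1,463,891 analyzed messages into
# approximate absolute counts. The percentages are rounded in the write-up,
# so these figures are only ballpark estimates.
total_messages = 1_463_891

for label, share in [("highly toxic", 0.02),
                     ("sexually explicit", 0.016),
                     ("hateful", 0.005)]:
    print(f"{label}: ~{total_messages * share:,.0f} messages")
# highly toxic: ~29,278 messages
# sexually explicit: ~23,422 messages
# hateful: ~7,319 messages
```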

Time to invest — The ADL’s conclusions are succinct: human moderation works, and Minecraft should invest more resources in it. Servers with more moderators and stricter policies had the fewest incidents of hate and harassment. The ADL also recommends that Minecraft improve researchers’ access to data so that the effectiveness of moderation can be studied in even more depth.

At a broader level, the ADL recommends that the gaming industry as a whole standardize its moderation reporting practices. This would allow for more extensive research into how to minimize hate speech and harassment in gaming spaces.
