Thanks to the infinite and depressing extent to which covid has kept everyone trapped inside, Discord is more relevant than ever. But as the company revealed in its latest transparency report, that has led to new challenges, as well as better efforts to tackle other challenges that it probably should have put more effort into earlier.
Discord, which is apparently in talks with Microsoft to sell for about 1.3 Bethesdas, published the transparency report today. Amid standard operational information about the second half of 2020, a few details stood out. For one thing, the total number of user reports increased fairly steadily throughout 2020, from 26,886 in January to 65,103 in December, with numbers first ticking up in March. This makes sense; people were trapped in their homes, and as a result, Discord was growing rapidly. Spam resulted in the highest number of account deletions (over 3 million), with exploitative content, including non-consensual pornography, a distant second (129,403) and harassment third (33,615).
Discord also noted that of the reports made, it most frequently took action against issues involving child harm, cybercrime, abuse, exploitative content, and extremist or violent content. “This can be partly explained by the team’s prioritization of issues in 2020 that were most likely to cause real-world harm,” the company said in the transparency report.
In fact, according to the report, Discord removed more than 1,500 servers for violent extremism in the second half of 2020, which it said was “an increase of almost 93% over the first half of the year.” It cited groups like the Boogaloo Boys and QAnon as examples.
“This increase can be attributed to the expansion of our anti-extremism efforts, as well as growing trends in the online extremism space,” the company wrote. “One of the online trends observed in this period was the growth of QAnon. We adjusted our efforts to address the movement and ultimately removed 334 QAnon-related servers.”
Cybercrime server removals similarly skyrocketed over the course of 2020, increasing by 140% from the first half of the year. In total, Discord took down nearly 6,000 servers for cybercrime in the second half of 2020, which it said followed a significant spike in reporting. “More cybercrime spaces than ever were flagged to Trust & Safety, and more were ultimately removed from our site,” Discord wrote.
Discord also emphasized its focus on methods that allow it to “proactively detect and remove the most harmful groups from our platform,” pointing to its anti-extremism efforts as an example, but also acknowledging where it made a mistake.
“We were disappointed to realize that in this period, one of our tools to proactively detect [sexualized content related to minors] servers contained a bug,” Discord wrote. “As a result, there were fewer overall flags for our team. That bug has since been resolved, and we’ve resumed removing the servers the tool surfaces.”
The other issue here is that Discord made a concerted effort to remove QAnon content around the same time as other platforms did — after most of the damage had already been done. While the removal may have been proactive according to Discord’s internal definition, platforms were slow to even behave reactively when it came to QAnon as a whole, leading to real and lasting damage in the United States and around the world. In 2017, Discord also functioned as an important staging ground for the Unite The Right rally in Charlottesville, Virginia, which ultimately led to violence and three deaths. While the platform has tried to clean up its act ever since, it hosted a large amount of abuse and alt-right activity as recently as 2017.
A little transparency is much better than none, but it is worth noting that transparency reports from technology companies often provide little insight into how the most important decisions and priorities are made by the platforms that essentially govern our lives online. Earlier this year, for example, Discord banned the r/WallStreetBets server at the height of the GameStop stonksapalooza. Spectators suspected foul play — outside interference of some kind. Talking to Kotaku, however, two sources made it clear that labyrinthine internal moderation policies ultimately caused Discord to make that decision. Bad timing and poor transparency before and after took care of the rest.
This is just one small example of how this dynamic can play out; many more exist. Platforms may say they are being transparent, but ultimately they are just giving people a bunch of barely contextualized numbers. It’s hard to say what real transparency looks like in the age of all-encompassing technology platforms, but this is not it.