Discord wiped out thousands of criminal and violent extremist servers in 2020

Photograph: Samuel Corum (Getty Images)

Thanks to the infinitely depressing degree to which the pandemic has kept everyone trapped inside, Discord is more relevant than ever. But, as the company revealed in its latest transparency report, that has created new challenges, and sharpened its efforts on other challenges it probably should have been working on sooner.

Discord, which is reportedly in negotiations with Microsoft to sell for about 1.3 Bethesdas, released the transparency report today. Among the standard operational insights about Discord’s second half of 2020, a few details stood out. For one, the overall number of user reports rose fairly steadily across 2020, from 26,886 in January to 65,103 in December, with numbers first ticking up in March. That makes sense; people were trapped in their homes, and Discord was growing rapidly as a result. Spam accounted for the majority of account deletions (more than 3 million), with exploitative content, including non-consensual pornography, a distant second (129,403) and harassment third (33,615).

Discord also pointed out that it acted on reports involving material harmful to children, cybercrime, doxxing, exploitative content, and extremist or violent content at higher rates than other categories. “This can be partially explained by the team’s prioritization in 2020 of issues that were most likely to cause real-world harm,” the company said in the transparency report.

In fact, according to the report, Discord removed more than 1,500 servers for violent extremism in the second half of 2020, which, it said, was “nearly a 93% increase from the first half of the year.” It cited groups like the Boogaloo Boys and QAnon as examples.

“This increase can be attributed to the expansion of our anti-extremism efforts, as well as growing trends in the online extremism space,” the company wrote. “One of the online trends observed in this period was the growth of QAnon. We adjusted our efforts to address the movement, ultimately removing 334 QAnon-related servers.”

Cybercrime server removals also skyrocketed over the course of 2020, increasing 140% from the first half of the year. In total, Discord removed almost 6,000 servers for cybercrime in the second half of 2020, following a significant increase in reports. “More cybercrime spaces than ever were flagged to Trust & Safety, and more were ultimately removed from our site,” Discord wrote.

Discord also emphasized its focus on methods that will “proactively detect and remove the most damaging groups from our platform,” pointing to its efforts against extremism as an example, but also to where it messed up.

“We were disappointed to learn that, during this period, one of our tools to proactively detect [sexualized content related to minors] servers contained a bug,” Discord wrote. “As a result, there were fewer overall flags for our team to review. This error has since been resolved, and we have resumed removing the servers the tool surfaces.”

The other problem here is that Discord made a concerted effort to remove QAnon content around the same time as other platforms, after most of the damage had already been done. While the removal may have been proactive by Discord’s internal definition, platforms on the whole were slow to act even reactively when it came to QAnon, which led to real and lasting damage in the United States and around the world. Back in 2017, Discord also served as a major staging ground for the Unite The Right rally in Charlottesville, Virginia, which ultimately led to violence and three deaths. Although the platform has tried to clean up its act since then, it played host to an abundance of abuse and alt-right activity up until that point.

Some transparency is much better than none, but it’s worth noting that tech companies’ transparency reports often give little insight into how decisions get made at the top levels of the platforms that essentially govern our online lives. Earlier this year, for example, Discord banned the r/WallStreetBets server at the height of the GameStop stonksapalooza. Onlookers suspected foul play, some kind of outside interference. Speaking to Kotaku, though, two sources made it clear that labyrinthine internal moderation policies ultimately led Discord to make that decision. Bad timing and subpar transparency before and after took care of the rest.

That’s just one small example of how this dynamic can play out. Many more exist. Platforms may say they’re being transparent, but ultimately, they’re just handing people a bunch of poorly contextualized numbers. It’s hard to say what real transparency looks like in the age of all-encompassing tech platforms, but this isn’t it.
