Twitch’s first transparency report is here – and long overdue

Twitch today released its first transparency report, detailing its efforts to protect the 26 million people who visit its site daily. When it comes to transparency, the decade-old Amazon-owned service had a lot of catching up to do.

Twitch saw a 40% increase in channels between the beginning and the end of 2020, driven by the pandemic-era popularity of live streaming and video games. That explosive growth, however, is also the company’s biggest challenge when it comes to stamping out harassment and hate. Unlike prerecorded video, live content is often spontaneous and ephemeral. Things just happen, in front of live audiences of thousands or tens of thousands. That can include anything from 11-year-olds streaming themselves playing Minecraft—exposing them to potential predators—to now-banned gaming celebrity Guy “Dr Disrespect” Beahm broadcasting from a public bathroom at E3.

In its new transparency report, Twitch acknowledges this difficulty and, for the first time, offers specific details on how well it moderates its platform. While the findings are encouraging, what Twitch has historically not been transparent about speaks just as loudly.

Twitch early on gained a reputation as a hotbed of toxicity. Women and minorities streaming on the platform received targeted hate from audiences hostile to anyone they believed deviated from gamer stereotypes. Twitch’s vague guidelines around so-called “sexually suggestive” content served as fuel for self-styled anti-boob police to mass-report women on Twitch. Volunteer moderators watched over Twitch’s fast-moving chat to pluck out harassment. And for problem streamers, Twitch relied on user reports.

In 2016, Twitch introduced its AutoMod tool, now enabled by default for all accounts, which blocks viewer messages its AI considers inappropriate. Like other large platforms, Twitch also relies on machine learning to flag potentially problematic content for human review, and it has invested in human moderators to review that flagged content. Still, a 2019 study by the Anti-Defamation League found that nearly half of Twitch users surveyed reported experiencing harassment. And a 2020 report from GamesIndustry.biz quoted several Twitch employees describing how company executives did not prioritize safety tools and dismissed concerns about hate speech.

Throughout that time, Twitch did not have a transparency report to make its policies and internal workings clear to an abused user base. In an interview with WIRED, Twitch’s new head of trust and safety, Angela Hession, says that in 2020, safety was Twitch’s “number one investment.”

Over the years, Twitch has learned that bad-faith harassers can twist its vague community standards, and in 2020 it released updated versions of its “Nudity and Attire,” “Terrorism and Extreme Violence,” and “Harassment and Hateful Conduct” guidelines. Last year, Twitch appointed an eight-person Safety Advisory Council, consisting of streamers, anti-bullying experts, and social media researchers, to develop policies aimed at improving safety, moderation, and healthy streaming habits.

Last fall, Twitch brought on Hession, previously head of safety at Xbox. Under Hession, Twitch finally banned depictions of the Confederate flag and blackface. Twitch is growing fast, she says, and there is a great opportunity for her to imagine what safety there will look like. “Twitch is a service that was created to encourage users to feel comfortable expressing themselves and entertaining one another,” she says, “but we also want our community to always be, and feel, safe.” Hession says Twitch has quadrupled its content moderation staff over the past year.

Twitch’s transparency report serves as a victory lap for its recent moderation efforts. AutoMod or active human moderators covered more than 95 percent of Twitch content during the second half of 2020, the company reports. The number of people reporting harassment via Twitch direct messages decreased by 70 percent over that same period. Enforcement actions rose from 788,000 in early 2020 to 1.1 million in late 2020, which Twitch says reflects its increase in users. User reports also rose during that time, from 5.9 million to 7.4 million, which Twitch again attributes to its growth. The same goes for channel bans, which increased from 2.3 million to 3.9 million.
