Twitch’s first transparency report is here, and long overdue


Twitch today released its first transparency report, detailing its efforts to protect the 26 million people who visit its site every day. When it comes to transparency, the decade-old Amazon-owned service has a lot of catching up to do.

Twitch benefited from a 40 percent increase in channels between early and late 2020, driven by the pandemic-era popularity of both live streaming and video games. But that explosive growth is also the company's biggest challenge when it comes to stamping out harassment and hate. Unlike recorded videos, live content is often spontaneous and fleeting. Things just happen, in front of live audiences of thousands or tens of thousands. That can include anything from 11-year-olds streaming themselves playing Minecraft live, exposing them to potential predators, to now-banned video game celebrity Guy "Dr Disrespect" Beahm broadcasting from a public bathroom at E3.

In its new transparency report, Twitch acknowledges this difficulty and, for the first time, offers specific details on how well it moderates its platform. The findings are encouraging, but what Twitch has historically not been transparent about speaks just as loudly.

Twitch earned an early reputation as a hotbed of toxicity. Women and minorities broadcasting on the platform received targeted hate from audiences hostile toward people they believed deviated from the stereotype of a gamer. Twitch's vague guidelines around so-called "sexually suggestive" content served as fuel for self-proclaimed boob police to report female Twitch broadcasters en masse. Volunteer moderators watched over fast-moving Twitch chat to weed out harassment. And for problem streamers themselves, Twitch relied on user reports.

In 2016, Twitch introduced an AutoMod tool, now enabled by default for all accounts, which blocks messages from viewers that its AI deems inappropriate. Like other big platforms, Twitch also relies on machine learning to flag potentially problematic content for human review, and it has invested in human moderators to review that flagged content. Still, a 2019 study by the Anti-Defamation League found that nearly half of surveyed Twitch users reported experiencing harassment. And a 2020 GamesIndustry.Biz report cited several Twitch employees describing how company executives did not prioritize safety tools and dismissed concerns about hate speech.

During all this time, Twitch did not have a transparency report to make its policies and internal workings clear to a user base that suffered abuse. In an interview with WIRED, Twitch's new head of trust and safety, Angela Hession, says that in 2020, safety was Twitch's "number one investment."

Over the years, Twitch has learned that bad-faith harassers can weaponize its vague community guidelines, and in 2020 it released updated versions of its "Nudity and Attire," "Terrorism and Extreme Violence," and "Harassment and Hateful Conduct" policies. Last year, Twitch also appointed an eight-person Safety Advisory Council, made up of streamers, anti-bullying experts, and social media researchers, to craft policies aimed at improving safety, moderation, and healthy streaming habits.

Last fall, Twitch brought in Hession, formerly head of safety at Xbox. Under Hession, Twitch finally banned depictions of the Confederate flag and blackface. Twitch is on fire, she says, and there is a great opportunity for her to envision what safety looks like there. "Twitch is a service that was created to encourage users to feel comfortable expressing themselves and entertaining each other," she says, "but we also want our community to always be and feel safe." Hession says Twitch has quadrupled its content moderators over the last year.

Twitch's transparency report serves as a victory lap for its recent moderation efforts. AutoMod or active moderators touched more than 95 percent of Twitch content during the second half of 2020, the company reports. The number of people reporting harassment received via Twitch direct messages dropped by 70 percent over that same period. Enforcement actions increased from 788,000 in early 2020 to 1.1 million in late 2020, which Twitch says reflects its growth in users. User reports also increased during this time, from 5.9 million to 7.4 million, which Twitch again attributes to its growth. The same is true of its channel bans, which went from 2.3 million to 3.9 million.
