Today, Twitch released its first-ever transparency report, a lengthy, stat-based look at the platform’s safety initiatives over the past year. It contains some interesting, albeit granular, information about Twitch’s efforts to cut down on hateful conduct, sexual harassment, and even terrorist propaganda.
Some of those numbers are encouraging. The company says it has achieved “a 4X increase in the number of content moderation professionals” over the past year, meaning that users who file reports are now more likely to receive a timely response.
Twitch also pointed to an increase in rule enforcements against reported users and channels: total enforcements rose 41% over the course of the year, with gains across categories like hateful conduct and sexual harassment, violence and gore, nudity, and terrorist propaganda (Twitch claims the last is extremely rare on its platform, though that depends on what you classify as terrorism). The company also highlighted the work of its Law Enforcement Response team, which made over 2,000 reports to the National Center for Missing & Exploited Children in 2020. Twitch, however, continues to struggle with underage users creating channels, leaving themselves open to potential predation.
The report contains a handful of other, similar data sets, most of which paint Twitch in a favorable light. Certainly, they’re a useful measure of Twitch’s growth in these areas, and broadly, the report mirrors similar documentation provided by platforms like Discord, Facebook, and Twitter. The problem with these kinds of reports, however, is that they have a way of appearing to say a lot while revealing very little. Twitch has offered numbers and a small amount of context, but streamers and viewers remain in the dark on major issues that came to light last year.
Twitch’s first transparency report is a start, but streamers want more