Twitch policy update clarifies exactly what it means by 'sexual harassment'

(Image credit: MARTIN BUREAU via Getty Images)

Twitch has issued new "clarifications" to its sexual harassment policy in a bid to make it easier to understand, released alongside an update to its AutoMod moderation tools designed to let streamers curb inappropriate messages in chat.

In a blog post released yesterday, Twitch wrote that its sexual harassment policy "remains largely unchanged," but that it was adding on a chunk of clarifying language to—it hopes—draw clearer boundaries around the kinds of behaviour that aren't accepted on its platform.

The new language gets a lot more particular about exactly what Twitch doesn't want to see in its chats. "We prohibit unwanted comments–including comments made using emojis/emotes–regarding someone's appearance or body, sexual requests or advances, sexual objectification, and negative statements or attacks related to a person's perceived sexual behaviours or activities, regardless of their gender," read the updated guidelines. "We also do not tolerate the recording or sharing of non-consensual intimate images or videos under any circumstances, and may report such content to law enforcement."

In other words, Twitch is now trying to give its audience a robust list of specific behaviours it can use as a reference when deciding what's kosher or not to post in chat, rather than a broader and more vague ban on creepy behaviour. 

The platform goes on to emphasise that it's (as you might expect) really the "non-consensual" element of that stuff it has a problem with, and encourages anyone who thinks they've been targeted for "mutual, consensual comments" to file an appeal. It notes, though, that some comments—like "expressing a desire to commit sexual violence"—are never allowed on Twitch regardless of any other factors.

Twitch is also updating AutoMod, the machine learning gizmo streamers can use to nip inappropriate comments in the bud automatically, to "better combat sexual harassment."

"We’ve developed a new AutoMod category that will allow you to filter out chat messages that could be considered sexual harassment," says Twitch, "This new category will provide an additional layer of protection, and can help to block those messages in the moment, before they show up in chat." 

Streamers will be able to decide how strict AutoMod is about withholding potentially problematic comments. Alongside their mods, they'll be able to review held messages to determine what to do with them. The new tools rolled out yesterday, although they're currently English-only.

Sexual harassment is, of course, absolutely endemic in online spaces, and Twitch is no exception. Numerous streamers—the vast majority of them women—have plenty of horror stories of viewers crossing the line in one way or another, whether it's creepy comments or outright stalking. Tools like these are an unfortunate necessity, though by no means adequate to stomp the problem out entirely. 

Joshua Wolens
News Writer

One of Josh's first memories is of playing Quake 2 on the family computer when he was much too young to be doing that, and he's been irreparably game-brained ever since. His writing has been featured in Vice, Fanbyte, and the Financial Times. He'll play pretty much anything, and has written far too much on everything from visual novels to Assassin's Creed. His most profound loves are for CRPGs, immersive sims, and any game whose ambition outstrips its budget. He thinks you're all far too mean about Deus Ex: Invisible War.