
SXSW's Online Harassment Summit tackled big ideas, but offered few solutions

Caitlin McGarry, Leah Yamshon | March 17, 2016
The event drew a low turnout despite months of controversy.

Like Reddit, Yik Yak’s policy is to take a step back from playing content police, shifting responsibility to its user base. “Communities don’t always know they have this power in their hands to take care of these things,” said Yik Yak cofounder Tyler Droll during a SXSW panel held separately from the Harassment Summit. “A part of that may be on us, but a part of that may be on the community too, of not knowing they can take care of this quickly, of standing by idly and letting that happen.”

But despite the constant name-dropping of Twitter, Reddit, and other platforms where harassment regularly occurs, those companies weren’t part of the conversation, not even Yik Yak, which didn’t participate in the summit even though both founders were in Austin that weekend. The panels were loaded with prominent academics, activists, policy makers, and others experienced in dealing with or studying online harassment, but only Facebook sent a representative.

What’s the solution?

While the people at those companies who can actually make changes weren’t around to hear them, panelists floated a few ideas to curb online harassment.

Online Abuse Prevention Initiative founder Randi Lee Harper said during the gaming and harassment panel that she’s currently working with tech companies, which have to be convinced that harassment is a “quality of content problem” that will send users fleeing and affect their bottom line. Harper said she’s under non-disclosure agreements and couldn’t talk much about the improvements being made, but hinted that the companies she’s working with understand that harassment is a huge problem. She also pointed to the work of Civil Comments, which uses crowdsourced comment moderation, as a model for other platforms.

Platforms need more moderation, panelists suggested.

“Pay community managers well,” Cross said. “Hire lots of them.”

They need more granular privacy settings, too.

Caroline Sinders, an interaction designer for IBM Watson, had a few ideas for Twitter: “What if in moments of harassment you could turn off the comments? What if there was a way to flag a tweet so it couldn’t be embedded?”

Brianna Wu praised Twitter for making strides in the past year to crack down on harassment, most recently by launching its Trust & Safety Council to help the company “strike the right balance between fighting abuse and speaking truth to power.”

Shireen Mitchell, founder of educational nonprofit Digital Sisters, which is geared toward helping women and children from underserved communities, suggested that including a broader spectrum of people on these harassment councils would open up the conversation. “We don’t see much intersection of gender and race at these tables, so we’re not hearing an equal representation of voices,” she said.
