Troll hunter: Twitter cracks down on abuse with new trust and safety group

After a number of high-profile defections from Twitter because of trolling, the micro-blogging service has now unveiled a new Trust and Safety Council to help guide its policies against abuse.
Written by Liam Tung, Contributing Writer

Recently several high-profile public figures have left Twitter, often after being harassed.


Twitter has hatched a new plan to tackle internet trolls and ensure its users can express themselves without fear of facing a torrent of abuse.

The company's answer to its trolling problem is the new Twitter Trust and Safety Council, which initially will consist of 40 organizations from 13 regions.

Inaugural members include the Australian suicide-prevention organization Beyond Blue, the UK's child abuse watchdog the Internet Watch Foundation, and the US National Domestic Violence Hotline.

The council is intended to give Twitter a variety of voices and expertise, helping it "more efficiently and quickly" incorporate those organizations' input as it develops products, policies, and programs, according to Twitter's head of global policy outreach, Patricia Cartes.

"To ensure people can continue to express themselves freely and safely on Twitter, we must provide more tools and policies," Cartes said.

"With hundreds of millions of Tweets sent per day, the volume of content on Twitter is massive, which makes it extraordinarily complex to strike the right balance between fighting abuse and speaking truth," she continued.

Twitter notes on its new Trust and Safety Council page that it "does not tolerate behavior intended to harass, intimidate, or use fear to silence another user's voice".

Its policy is not to remove tweets unless they're in violation of its rules, and the company urges users to call law enforcement if they believe a conflict has turned into a credible threat online or offline.

The new council is intended to give Twitter insights from "safety advocates, academics, and researchers focused on minors, media literacy, digital citizenship, and efforts around greater compassion and empathy on the internet".

It also wants to hear from community groups that focus on preventing abuse, harassment, and bullying, as well as on mental health and suicide prevention.

The move follows a number of high-profile public figures leaving Twitter in recent times, often after being harassed on the platform.

Though Cartes doesn't mention it in her blog post, the new council's launch coincides with Safer Internet Day 2016, which Twitter UK's head of public policy, Nick Pickles, addressed in The Guardian.

Pickles admitted Twitter had "fallen short" in the past when it came to ensuring users can speak out without fear, perhaps referring to its 2013 decision to weaken the blocking feature so that blocked users were merely muted.

Twitter has also been accused of targeting Tor users with requests for phone numbers under its phone-verification system, which was designed to combat trolling accounts.

However, Pickles outlined a number of improvements to the way Twitter handles abuse complaints.

"For example, we took the decision to allow bystander reporting, meaning every Twitter user can report any tweet they see, whether or not it was directed at them specifically. We changed our rules to prohibit the posting of intimate images without consent, and also clarified the rules on hateful conduct," Pickles wrote.

He also noted that Twitter has updated its in-product reporting processes and human review systems to support abuse reports.
