Twitter Inc removed a feature in the past few days that promoted suicide prevention hotlines and other safety resources to users looking up certain content, according to two people familiar with the matter who said new owner Elon Musk ordered it.
After publication of this story, Twitter head of trust and safety Ella Irwin told Reuters in an email that “we have been fixing and revamping our prompts. They were just temporarily removed while we do that.”
“We expect to have them back up next week,” she said.
The removal of the feature, known as #ThereIsHelp, had not been previously reported. At the top of certain searches, it had displayed contact information for support organisations in many countries related to mental health, HIV, vaccines, child sexual exploitation, COVID-19, gender-based violence, natural disasters and freedom of expression.
Its removal heightened concerns about the well-being of vulnerable users on Twitter. Musk has said that impressions, or views, of harmful content have been declining since he took over in October, and he has tweeted graphs showing a downward trend, even as researchers and civil rights groups have tracked an increase in tweets containing racial slurs and other hateful content.
In part due to pressure from consumer safety groups, internet services including Twitter, Google and Facebook have for years tried to direct users to well-known resource providers such as government hotlines when they suspect someone may be in danger.
In her email, Twitter’s Irwin said: “Google does really well with these in their search results and (we) are actually mirroring some of their approach with the changes we are making.”
She added, “We know these prompts are useful in many cases and just want to make sure they are functioning properly and continue to be relevant.”
Eirliani Abdul Rahman, who had been on a recently dissolved Twitter content advisory group, said the disappearance of #ThereIsHelp was “extremely disconcerting and profoundly disturbing.”
Even if it was only temporarily removed to make way for improvements, “normally you would be working on it in parallel, not removing it,” she said.
Washington-based AIDS United, which was promoted in #ThereIsHelp, and iLaw, a Thai group listed for freedom of expression support, both told Reuters on Friday that they were surprised by the feature’s disappearance.
AIDS United said a webpage that the Twitter feature linked to attracted about 70 views a day until December 18. Since then, it has drawn 14 views in total.
Damar Juniarto, executive director at Twitter partner Southeast Asia Freedom of Expression Network, tweeted on Friday about the missing feature and said “stupid actions” by the social media service could lead his organisation to abandon it.
The sources with knowledge of Musk’s decision to order the removal of the feature declined to be named because they feared retaliation. One of them said millions of people had encountered #ThereIsHelp messages.
Twitter launched the first of these prompts about five years ago, and some had been available in more than 30 countries, according to company tweets. In a blog post about the feature, Twitter said it had a responsibility to ensure users could “access and receive support on our service when they need it most.”
Alex Goldenberg, lead intelligence analyst at the non-profit Network Contagion Research Institute, said prompts that had appeared in search results just days earlier were no longer visible by Thursday.
He and colleagues published a study in August showing that monthly mentions on Twitter of certain terms associated with self-harm had increased by more than 500% from about a year earlier, with younger users particularly at risk when exposed to such content.
“If this decision is emblematic of a policy change that they no longer take these issues seriously, that’s extraordinarily dangerous,” Goldenberg said. “It runs counter to Musk’s previous commitments to prioritise child safety.”
Musk has said he wants to combat child sexual abuse content on Twitter and has criticised the previous ownership’s handling of the issue. But he has cut large portions of the teams involved in dealing with potentially objectionable material. (Reuters)