In response to the death of a British teenager, Molly Russell, Instagram has announced that it will now ban all graphic self-harm images.
The social media platform made the decision – which critics called necessary but long overdue – in response to a wave of public anger over the suicide of 14-year-old Molly, whose Instagram account contained large amounts of distressing material about suicide and depression.
After days of mounting pressure on Instagram, culminating in a meeting with the health secretary, Matt Hancock, the head of the social network, Adam Mosseri, conceded that the company had not done well enough and announced that explicit imagery of self-harm would no longer be permitted on the platform.
Mosseri said: “We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable. We will get better and we are committed to finding and removing this content at scale.”
The move follows notable public fury over Molly’s death. Her father, Ian Russell, said Instagram was partly to blame: the family found material relating to suicide and depression when they looked into Molly’s Instagram account after her death.
Instagram announced a series of measures, including the removal of non-graphic images of self-harm from its app and website, in what appeared to be an attempt to draw a line under the reputational crisis for the brand and its parent company, Facebook.
Critics have said that the changes should have been made long ago and remained doubtful whether they would be enough to solve a problem said to have grown unchecked for a decade.
Peter Wanless, chief executive of the NSPCC, said Instagram had taken “an important step”, but that other social networks were also falling short and that legislation would be necessary.
“It should never have taken the life of Molly Russell for Instagram to act. Over the last decade, social networks have proven over and over that they won’t do enough.”
Wanless said it wasn’t enough to wait until “the next tragedy strikes”, calling on the government to act without delay and impose a duty of care on social networks, with tough punishments for those who fail to protect their young users.
“At-risk individuals will not be safe until Facebook takes its role as a global corporation and communications platform more seriously. These changes should have been made years ago.”
Hancock said: “Social media companies need to do more, in particular, to remove material that encourages suicide and self-harm, so I’m going to be asking other social media companies to act.”
“I don’t want people to go on to social media and search for images about suicide only to be directed to yet more of that sort of imagery. They need to be offered help, not more content about suicide.”
Mosseri remorsefully accepted that the change was overdue. He said: “We have not been as focused as we should have been on the effects of graphic imagery on anyone looking at the content.
“That is something that we are looking to correct and correct quickly. It’s unfortunate it took the last few weeks for us to realize that. It’s now our responsibility to address that issue as quickly as we can.”
In an interview on BBC Radio, the digital minister, Margot James, said the government “would have to keep the situation very closely under review to make sure that these commitments are made real, and as swiftly as possible”.
Mosseri added that some self-harm images could be allowed to remain on Instagram. “I might have an image of a scar and say, ‘I’m 30 days clean,’ and that’s an important way to tell my story,” he said.
“That kind of content can still live on the site, but the next change is that it won’t show up in any recommendation services, so it will be harder to find.”
The government is now considering imposing a compulsory code of ethics on tech companies, accompanied by fines for those that fail to comply.
Jeremy Wright, the culture secretary, is due to unveil the government’s proposals at the end of February, a prospect that appears to have helped spur Facebook into quick action.