Instagram promises to remove all graphic self-harm images after the father of a teenager who took her own life in 2017 said the photo-sharing app ‘helped kill my daughter’
- Instagram head Adam Mosseri says graphic self-harm images will be removed
- Ian Russell, whose daughter Molly died aged 14, said Instagram contributed
- Health Secretary Matt Hancock said social media companies need to do more
Instagram has pledged to ban all graphic images of self-harm after the father of a teenager who took her own life said that the app had helped to kill his daughter.
The photo-sharing platform announced a series of changes to its content rules including a ban on graphic images of self-harm and the removal of non-graphic images of self-harm from searches, hashtags, and the explore tab.
Health Secretary Matt Hancock earlier said that social media companies ‘need to do more’ to curb their impact on teenagers’ mental health.
However, Instagram added in its announcement that it will not be entirely removing non-graphic self-harm content, as it does not ‘want to stigmatise or isolate people who may be in distress’.
Head of the social network, Adam Mosseri, said: ‘Nothing is more important to me than the safety of the people who use Instagram. We are not where we need to be on self-harm and suicide, and we need to do more to protect the most vulnerable in our community.
‘I have a responsibility to get this right. We will get better and we are committed to finding and removing this content at scale, and working with experts and the wider industry to find ways to support people when they’re most in need.’
The site’s bosses met the Health Secretary on Thursday afternoon to discuss content on suicide and self-harm.
Before the meeting, Mr Hancock said: ‘Social media companies need to do more, in particular, to remove material that encourages suicide and self-harm, so I’m going to be asking other social media companies to act.
‘I don’t want people to go onto social media and search for images about suicide to get directed to yet more of that sort of imagery. They need help to not post more about suicide.
‘I think that parents around the country and across the world worry about the impact of social media on teenagers’ mental health.’
He added: ‘We’re not going to rest until we tackle this problem.’
Mr Hancock’s meeting with the internet giant came after the father of a teenager who took her own life in 2017 told the BBC that Instagram ‘helped kill my daughter’.
Molly Russell died in 2017 aged 14. Her family found material relating to depression and suicide when they looked at her Instagram account after her death.
Her father, Ian Russell, said that the algorithms used by Instagram enabled Molly to view more harmful content, possibly contributing to her death.
He said: ‘We are very keen to raise awareness of the harmful and disturbing content that is freely available to young people online.
‘Not only that, but the social media companies, through their algorithms, expose young people to more and more harmful content, just from one click on one post.
‘In the same way that someone who has shown an interest in a particular sport may be shown more and more posts about that sport, the same can be true of topics such as self-harm or suicide.’
Mr Russell also told the BBC: ‘I have no doubt that Instagram helped kill my daughter. She had so much to offer and that’s gone.’
The NSPCC said the rule changes marked ‘an important step’, but that social networks were still not doing enough to tackle self-harm.
In response to the news, the charity’s chief executive, Peter Wanless, said: ‘This is an important step by Instagram towards cracking down on self-harm content that no child should ever be exposed to.
‘It shows what can be done, but it should never have taken the death of Molly Russell for Instagram to act. The question is whether it will be enough.
‘Over the last decade social networks have proven over and over that they won’t do enough to design essential protections into their services against online harms including grooming and abuse.
‘We cannot wait until the next tragedy strikes. The Government must legislate without delay and impose a duty of care on social networks, with tough punishments if they fail to protect their young users.’
For confidential support, log on to samaritans.org or call the Samaritans on 116 123.