The Molly Rose Foundation says not enough is being done to tackle harmful content online.
The Molly Rose Foundation says harmful content is being sent to young people online at a higher rate than ever
Author: Claire Boad | Published 46 minutes ago
An internet safety charity has described the new Online Safety Act as a ‘sticking plaster’ over the issue of harm caused by content online.
The Act aims to protect children and adults online, and gives social media companies and search engines legal duties to protect users from harmful and illegal content.
But the Molly Rose Foundation says the Act currently barely scratches the surface of the issue of harmful content.
The charity was set up after 14-year-old Molly Russell took her own life eight years ago after viewing suicide content on social media.
Now, the foundation's Chief Executive, Andy Burrows, has told us that enough is still not being done to prevent young people from accessing this type of content online.
"We know that in the UK we lose a young person to suicide where technology played a role every single week, and this is preventable harm.
"When we see Instagram and TikTok fail to address fundamental issues with their platforms, this is a deliberate choice not to make their platforms safe."
These calls come as new research from the charity found that depression, suicide and self-harm content is still being recommended at a 'vast scale' to social media accounts opened as a 15-year-old girl.
Molly's father, Ian, said he is concerned that the issue is not going away and is, in some ways, getting worse.
"The harmful situation that led in more than a minimal way to the death of my daughter has got worse rather than better.
"If you stray into the rabbit hole of harmful suicide and self-injury content, the platforms will algorithmically recommend more and more of that content."