No child should be given access to that type of content
US multinational tech giant Meta is aware of the harm its content can cause to the most vulnerable of us, yet it is failing to act. The consequences for young people are devastating.
First published: Oct 2022.
The family of Molly Russell have finally received the answers they have spent five years fighting for, following an inquest into her death.
Molly, 14, died in November 2017 after viewing thousands of photos depicting or promoting self-harm and suicide online.
The Russell family have long argued their daughter would still be alive had it not been for the hideous material she saw online.
In a landmark conclusion, senior coroner Andrew Walker declined to record her cause of death as suicide, saying it would not be safe to do so. Instead, he concluded that Molly died from “an act of self-harm while suffering from depression and the negative effects of online content”.
He also issued a Prevention of Future Deaths (PFD) notice – which Molly’s family has asked be sent to Instagram, Pinterest, media regulator Ofcom and the Department for Digital, Culture, Media and Sport – recommending actions to avoid a repeat of the tragedy.
Over a period of two weeks, the inquest examined Molly’s use of social media and heard that, of the 16,300 posts she had saved, shared or liked on Instagram in the months leading up to her death, 2,100 related to depression, self-harm or suicide.
Just a fraction of the Instagram posts she had been able to view were shown in court, and only after a severe warning about their disturbing nature.
Much of the graphic material romanticised acts of self-harm and, because of the platforms’ algorithms, Molly was also exposed to text, images and video clips she had never sought out.
This allowed her to binge-scroll through hundreds of posts that only worsened her mental health, with some of the content even discouraging her from seeking help.
Some of the posts portrayed self-harm and suicide as an inevitable consequence of a condition that could not be recovered from.
No child should be given access to the type of content Molly was able to see.
Executives from Meta – Instagram’s parent company – and Pinterest were called to give evidence in court, and both apologised for the content that had been accessible to her.
While Pinterest conceded the platform was not safe at the time of the teenager’s death, shockingly, Meta argued that the majority of graphic content she viewed was safe for children.
Although it admitted a small number of the posts violated its policies, Meta maintained that the rest were appropriate because they allowed users to “express themselves”.
The brazen comments come after revelations from leaked documents that suggest Meta is aware of the harm its content can cause, yet is failing to act.
Molly’s family have paid the price for the failure of social media companies to protect her. They are now calling on those companies to end the “toxic corporate culture” and start protecting young people instead of monetising their misery.
In the wake of the inquest, the Children’s Commissioner has published research finding that 45% of children aged 8 to 17 have seen content they felt was inappropriate, or that made them feel worried or upset.
Only half of those children reported the harmful content to the platform concerned, and a quarter of those who did saw no action taken as a result.
Good Law Project believes the death of Molly Russell must be a turning point in online safety. For social media platforms to shun responsibility and refuse to act is inconceivable.
The Government also has the opportunity to make the internet a safer space through its long-promised Online Safety Bill, which has yet to return to Parliament after a delay.
We have launched a campaign to tackle the detrimental impact social media can have on young people. We want to force companies to stop using recommender algorithms in contexts where they can expose users to further harmful content.
Alongside the legal action, we are telling the anonymous stories of those who have been affected by social media through a carefully curated website.
— AUTHORS —
▫ Good Law Project, a not-for-profit campaign organisation that uses the law to protect the interests of the public. ⚖️ Good Law Project only exists thanks to donations from people across the UK. If you’re in a position to support their work, you can do so here.
Sources
- Text: This piece was originally published by Good Law Project and republished in PMP Magazine on 14 October 2022, with the author’s consent. | The author writes in a personal capacity.
- Cover: Adobe Stock/myboys.me.