Senior coroner urges review into children's social media access after inquest into the death of teenager Molly Russell
14 October 2022, 14:13 | Updated: 14 October 2022, 20:12
A senior coroner has sent recommendations to social media companies urging the separation of content for adults and children following the inquest into the death of schoolgirl Molly Russell.
Coroner Andrew Walker yesterday sent a Prevention of Future Deaths (PFD) report to social media companies including Meta, Pinterest, Snapchat and Twitter, as well as to the UK Government.
In it, he said there should be a review of the algorithms used by the social media platforms to provide content.
Molly, a 14-year-old from Harrow, north-west London, ended her life in November 2017 after viewing suicide and self-harm content online.
Her death prompted her family to campaign to improve safety online.
In his report, the coroner found six areas of concern during the inquest into her death, including the separation of content for children and adults.
Mr Walker raised concerns over age verification when people sign up for social media accounts, suggested that the provision of age-specific content be considered, and called for a Government review into the use of advertising.
He also recommended a Government review of parent and guardian controls, including parents' access to, and the retention of, material viewed by children.
At the inquest, which was held at North London Coroner’s Court last month, Mr Walker concluded that the schoolgirl died while suffering from the “negative effects of online content”.
The inquest heard that before her death, Molly accessed content from the “ghetto of the online world”, with her family arguing websites such as Pinterest and Instagram recommended posts or accounts that “promoted” self-harm and suicide.
In his evidence, Pinterest’s Judson Hoffman told the inquest the site was “not safe” when Molly used it.
Elizabeth Lagone, an executive at Meta, told the inquest she believed posts Molly saw, which the girl's family say “encouraged” suicide, were safe.
Mr Walker’s report said: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.
“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.
“Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above.”
He said he thought action should be taken to prevent future deaths, adding: “I believe you and/or your organisation have the power to take such action.”
Reacting to his recommendations, Molly’s dad Ian Russell said: “We welcome this report by the coroner, which echoes our concerns about the online dangers Molly was exposed to, and pushed towards by the platforms’ algorithms.
“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users.
“They should think long and hard about whether their platforms are suitable for young people at all.
“The Government must also act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content, and that platforms and their senior managers face strong sanctions if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly.
“I hope this will be implemented swiftly through the Online Safety Bill which must be passed as soon as possible.”
Responding to the PFD report, Meta, Instagram's parent company, said it agreed that “regulation is needed”.
The social media company said it was “reviewing” the findings, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98% of the content we take action on before it’s reported to us.
“We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving.”
Image-sharing service Pinterest also issued a statement in response to the report: “Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.”
Pinterest, Meta, Snapchat and Twitter all have 56 days to produce a timetable of action they propose to take or explain why no action is proposed.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS.