Molly Russell death: Coroner suggests separate platforms for adults and children

14 October 2022, 10:54

Molly Russell inquest. Picture: PA

The 14-year-old died in November 2017 after viewing suicide and self-harm content online.

The father of schoolgirl Molly Russell has urged social media companies not to “drag their feet waiting for legislation”, as a coroner issued recommendations including separate platforms for adults and children.

Coroner Andrew Walker sent a Prevention of Future Deaths report (PFD) to businesses such as Meta, Pinterest, Twitter and Snapchat as well as the UK Government on Thursday, in which he urged a review of the algorithms used by the sites to provide content.

The 14-year-old, from Harrow in north-west London, ended her life in November 2017 after viewing suicide and self-harm content online, prompting her family to campaign for better internet safety.

In his report, Mr Walker identified six areas of concern that arose during the inquest into Molly’s death, including the separation of content for adults and children.

The coroner also voiced concerns over age verification when signing up to the platforms, content not being controlled so as to be age-specific, and algorithms being used to provide content together with adverts.

Other issues included in the report were the lack of access or control for parents and guardians and the absence of capability to link a child’s account to a parent or guardian’s account.

At the inquest held at North London Coroner’s Court last month, the coroner concluded Molly died while suffering from the “negative effects of online content”.

The inquest was told the teenager accessed material from the “ghetto of the online world” before her death, with her family arguing sites such as Pinterest and Instagram recommended accounts or posts that “promoted” suicide and self-harm.

In her evidence, Meta executive Elizabeth Lagone said she believed posts seen by Molly, which her family say “encouraged” suicide, were safe.

Meta’s head of health and wellbeing Elizabeth Lagone gave evidence at the inquest at North London Coroner’s Court (Beresford Hodge/PA)

Pinterest’s Judson Hoffman told the inquest the site was “not safe” when the schoolgirl used it.

In light of the concerns raised, Mr Walker recommended in the PFD that the Government consider reviewing the provision of internet platforms to children.

Other areas highlighted for review included separate platforms for adults and children, age verification before joining a platform, provision of age-specific content, and the use of algorithms to provide content.

The coroner also recommended that the Government review the use of advertising, and parental, guardian or carer controls, including access to material viewed by a child and retention of that material.

Mr Walker’s report said: “I recommend that consideration is given to the setting up of an independent regulatory body to monitor online platform content with particular regard to the above.

“I recommend that consideration is given to enacting such legislation as may be necessary to ensure the protection of children from the effects of harmful online content and the effective regulation of harmful online content.

“Although regulation would be a matter for Government I can see no reason why the platforms themselves would not wish to give consideration to self-regulation taking into account the matters raised above.”

Mr Walker said he believed action should be taken in order to prevent future deaths, adding: “I believe you and/or your organisation have the power to take such action.”

Molly’s father Ian Russell has urged social media companies to take immediate action (PA)

Reacting to the recommendations, Molly’s father Ian Russell said: “We welcome this report by the coroner, which echoes our concerns about the online dangers Molly was exposed to, and pushed towards by the platforms’ algorithms.

“We urge social media companies to heed the coroner’s words and not drag their feet waiting for legislation and regulation, but instead to take a proactive approach to self-regulation to make their platforms safer for their young users.

“They should think long and hard about whether their platforms are suitable for young people at all.

“The Government must also act urgently to put in place its robust regulation of social media platforms to ensure that children are protected from the effects of harmful online content, and that platforms and their senior managers face strong sanctions if they fail to take action to curb the algorithmic amplification of destructive and extremely dangerous content or fail to remove it swiftly.

“I hope this will be implemented swiftly through the Online Safety Bill which must be passed as soon as possible.”

In its response to the PFD report, Instagram’s parent company Meta said it agreed “regulation is needed”.

The social media giant said it was “reviewing” the coroner’s report, adding: “We don’t allow content that promotes suicide or self-harm, and we find 98% of the content we take action on before it’s reported to us.

“We’ll continue working hard, in collaboration with experts, teens and parents, so we can keep improving.”

Pinterest also issued a statement in response to the report: “Pinterest is committed to making ongoing improvements to help ensure that the platform is safe for everyone and the coroner’s report will be considered with care.”

Meta, Pinterest, Twitter and Snapchat all have 56 days to respond with a timetable of action they propose to take or explain why no action is proposed.

By Press Association
