Meta’s ‘bonfire’ of safety policies a danger to children, charity says

28 January 2025, 07:14

Girl using mobile phone. Picture: PA

The Molly Rose Foundation has warned social media risks becoming a haven for harmful content if safety regulation is not bolstered.

Meta’s recent “bonfire of safety measures” risks taking Facebook and Instagram back to where they were when Molly Russell died, the charity set up in her name has warned.

The Molly Rose Foundation said new online safety regulator Ofcom must strengthen incoming regulation to ensure teenagers are protected from harmful content online.

The charity was set up by Molly’s family after her death in 2017 at the age of 14, when she ended her life after viewing harmful content on social media sites, including Meta-owned Instagram.

Molly Russell (Family handout/PA)

Earlier this month, Meta boss Mark Zuckerberg announced sweeping changes to the company’s policies in the name of “free expression”, including plans to scale back content moderation, with the firm ending automated scanning for some types of posts and instead relying on user reports to remove them.

Campaigners called the move “chilling” and said they were “dismayed” by the decision, which has been attributed to Mr Zuckerberg’s desire to forge a positive relationship with new US President Donald Trump.

Andy Burrows, chief executive of the Molly Rose Foundation, said: “Meta’s bonfire of safety measures is hugely concerning and Mark Zuckerberg’s increasingly cavalier choices are taking us back to what social media looked like at the time that Molly died.

“Ofcom must send a clear signal it is willing to act in the interests of children and urgently strengthen its requirements on tech platforms.

“If Ofcom fails to keep pace with the irresponsible actions of tech companies the Prime Minister must intervene.

“Amid a strategic rollback of their safety commitments, preventable harm is being driven by Silicon Valley but the decision to stop it in its tracks now sits with the regulator and Government.”

In a letter sent to Ofcom, the foundation has urged the regulator to strengthen the Online Safety Act by bolstering requirements around content moderation, including requiring firms to proactively scan for all types of intense depression, suicide and self-harm content.

It also urges the regulator to ensure that Meta’s newly loosened policies around hate speech are not allowed to apply to children, and to seek clarification on whether Meta can change its rules without going through its usual internal processes, after reports suggested Mr Zuckerberg made the policy changes himself, leaving internal teams “blindsided”. Ofcom should ensure this cannot happen again, the foundation said.

In a statement, a Meta spokesperson said: “There is no change to how we define and treat content that encourages suicide, self-injury and eating disorders.

“We don’t allow it and we’ll continue to use our automated systems to proactively identify and remove it.

“We continue to have Community Standards, around 40,000 people working on safety and security to help enforce them, and Teen Accounts in the UK, which automatically limit who can contact teens and the types of content they see.”

Earlier this month, Molly’s father Ian, the chairman of the Molly Rose Foundation, told the Prime Minister that the UK was “going backwards” on online safety.

Mr Russell said in a letter to Sir Keir Starmer that Ofcom’s approach to implementing the Online Safety Act has “fundamentally failed to grasp the urgency and scale of its mission”, and that changes were needed to bolster the legislation.

The Molly Rose Foundation has also previously warned that Meta’s approach to tackling suicide and self-harm content is not fit for purpose, after research found the social media giant was responsible for just 2% of industry-wide takedowns of such content.


An Ofcom spokesperson said: “All platforms operating in the UK – including Meta – must comply with the UK’s online safety laws, once in force.

“Under the Online Safety Act, tech firms must assess the risks they pose, including to children, and take significant steps to protect them.

“That involves acting swiftly to take down illegal content – including illegal suicide and self-harm material – and ensuring harmful posts and videos are filtered out from children’s feeds.

“We’ll soon put forward additional measures for consultation on the use of automated content moderation systems to proactively detect this kind of egregious content.

“We are in contact with social media companies, including Meta, about the safety measures they have in place now, and what more they will have to do to comply once the duties are fully in force.

“No one should be in any doubt about Ofcom’s resolve to hold tech firms to account, using the full force of our enforcement powers where necessary.”

By Press Association
