Ofcom’s new online harms rules for social media firms disappoint campaigners

16 December 2024, 12:14

Girl using mobile phone. Picture: PA

Platforms now have until March to comply with the new Online Safety Act rules or face large fines.

The first set of new online safety rules legally requiring social media and other sites to take action against illegal content has been published by Ofcom.

The regulator said platforms now have three months to assess the risk of their users encountering illegal content and to implement safety measures mitigating those risks. Firms that fail to comply with their new duties once they come into force will face enforcement action.

The first set of rules focuses on illegal harms – such as terror, hate, fraud, child sexual abuse and encouraging suicide – but one safety charity has criticised the publication, saying it will allow “preventable illegal harm to continue to flourish”.

Ofcom has the power to fine firms up to £18 million or 10% of their qualifying global turnover under the Online Safety Act – whichever is greater – and in very serious cases can apply for sites to be blocked in the UK.

However, the Molly Rose Foundation, set up by the family of Molly Russell, who ended her life at the age of 14 in 2017 after viewing suicide content on social media, said it was “astonished” and “disappointed” at Ofcom’s first set of codes.

“Ofcom’s task was to move fast and fix things but instead of setting an ambitious precedent these initial measures will mean preventable illegal harm can continue to flourish,” the charity’s chief executive Andy Burrows said.

“While we will analyse the codes in full, we are astonished and disappointed there is not one single targeted measure for social media platforms to tackle suicide and self-harm material that meets the criminal threshold.

“Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life.

“Today makes clear that there are deep structural issues with the Online Safety Act. The Government must commit to fixing and strengthening the regime without delay.”

Ofcom chief executive Dame Melanie Dawes said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.

“The safety spotlight is now firmly on tech firms and it’s time for them to act.

“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.

“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”

Technology Secretary Peter Kyle said the publication of the first set of codes under the Online Safety Act was a “significant step” in making online spaces safer.

“This Government is determined to build a safer online world where people can access its immense benefits and opportunities without being exposed to a lawless environment of harmful content.

“Today we have taken a significant step on this journey.

“Ofcom’s illegal content codes are a material step-change in online safety meaning that from March, platforms will have to proactively take down terrorist material, child and intimate image abuse, and a host of other illegal content, bridging the gap between the laws which protect us in the offline and the online world.

“If platforms fail to step up the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.

“These laws mark a fundamental reset in society’s expectations of technology companies.

“I expect them to deliver and will be watching closely to make sure they do.”

By Press Association
