Q&A: Ofcom, the Online Safety Act, and codes of practice for social media

16 December 2024, 14:04


Ofcom has published the first set of new rules to regulate social media.

Online safety regulator Ofcom has published its first set of codes and guidance under the Online Safety Act, setting out the duties tech firms must comply with regarding illegal harms.

The landmark safety laws will require platforms to put a range of safety measures in place which Ofcom says will help better protect users – including better moderation, built-in safety tools, clear ways to report harmful content and clear accountability to senior staff over safety compliance issues.

Here is a closer look at Ofcom’s announcement and the wider legislation.

– What is the Online Safety Act?

Passed in late 2023, the Online Safety Act is the UK’s first major legislation to regulate social media, search engine, messaging, gaming, dating, pornography and file-sharing platforms.

At its core, the Act places a range of new safety duties on sites, which will compel them to protect users from illegal and other harmful content.

Platforms must do so by putting robust safety features in place to prevent such content appearing on their sites in the first place, and by acting swiftly to remove it when it does.

The new duties will be set out in a range of codes of practice and other guidance published by Ofcom over the coming months, with each one focusing on a specific content area.

The Act gives Ofcom the power to fine firms that fail to meet these duties – potentially up to billions of pounds for the largest sites – and, in the most serious cases, to seek a court order blocking access to a site in the UK.

– What has Ofcom published now?

The regulator has released its first codes of practice, which specifically focus on illegal harms online.

This is content such as that linked to terrorism, hate, fraud, child sexual abuse and assisting or encouraging suicide, Ofcom says.

The codes are designed to help platforms comply with the new rules by setting out best practice on the measures and structures they should have in place by the time the duties are expected to come into force in three months’ time.

The largest platforms will be expected to do the most to protect users, in particular children.

The codes of practice outline that sites should have senior staff accountability for safety, have strong moderation and reporting tools in place, as well as robust measures to protect children from abuse and exploitation online.

The first set of codes also calls for measures to tackle pathways to online grooming, the use of automated tools to detect child sexual abuse material, and steps to protect women and girls, identify fraud and remove terrorist accounts.

– What has been the response?

While many have welcomed steps being taken to better regulate social media, some campaigners have expressed their disappointment at Ofcom’s approach, arguing it has not been forceful enough.

The Molly Rose Foundation, which was set up by the family of Molly Russell, the 14-year-old who ended her life in 2017 after viewing suicide content on social media, said it was “astonished” and “disappointed” with Ofcom’s codes, adding there is not “one single targeted measure” for social media sites to “tackle suicide and self-harm material that meets the criminal threshold”.

Maria Neophytou, acting chief executive at the National Society for the Prevention of Cruelty to Children, said the charity was “deeply concerned that some of the largest services will not be required to take down the most egregious forms of illegal content, including child sexual abuse material”.

She said Ofcom’s proposals will “at best lock in the inertia to act and at worst create a loophole which means services can evade tackling abuse in private messaging without fear of enforcement”.

– What has Ofcom said about the codes?

Dame Melanie Dawes, Ofcom chief executive, said the introduction of the Online Safety Act meant that sites would no longer be “unregulated, unaccountable and unwilling to prioritise people’s safety over profits”.

“The safety spotlight is now firmly on tech firms and it’s time for them to act,” she said.

“We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”

Ofcom has said it will continue to roll out further codes and proposals in 2025, including more on the response to child sexual abuse material.

– What happens next?

Tech firms now have until March to start putting Ofcom’s proposals into place on their sites, to ensure they comply with those aspects of the Online Safety Act as it begins to come into force.

Meanwhile, Ofcom has said it will continue to publish more codes of practice in the early months of next year on a range of other harms included in the Act, including guidance for pornography publishers expected in January, guidance on protecting women and girls in February, and details on additional protection for children around harmful content promoting suicide, self-harm and eating disorders in April.

By Press Association
