AI action plans should be slowed until safeguards for children in place – NSPCC

28 January 2025, 00:04

A child using a laptop (Picture: PA)

The children’s charity has called for statutory safeguards around generative AI to help protect youngsters.

The Government should slow down its artificial intelligence action plans until a statutory duty of care for children is in place around the technology, a leading charity has said.

The NSPCC said generative AI is already being used to create sexual abuse images of children, and urged the Government to consider adopting specific safeguards into legislation to regulate AI.

The charity added that it had also found more than three-quarters of the public (78%) would prefer more robust safety checks on new generative AI tools, even if that meant the launch of such products was delayed.

A new study commissioned by the NSPCC also found that 89% of those asked had some level of concern around AI and its potential safety for children.

The charity said it had been receiving reports from children about AI through Childline since 2019.

Earlier this month, the Prime Minister announced plans to boost the AI industry in the UK, and to increase its use in daily life, starting with the civil service, as part of wider Government plans to help grow the economy.

The US has also announced major investment plans in the technology, interest in which was sparked by the launch of the generative AI chatbot ChatGPT in late 2022 and which has since grown into a key innovation area of the tech sector.

Chris Sherwood, the NSPCC’s chief executive, said: “Generative AI is a double-edged sword.

“On the one hand it provides opportunities for innovation, creativity and productivity that young people can benefit from; on the other it is having a devastating and corrosive impact on their lives.

“We can’t continue with the status quo where tech platforms ‘move fast and break things’ instead of prioritising children’s safety.

“For too long, unregulated social media platforms have exposed children to appalling harms that could have been prevented.

“Now the Government must learn from these mistakes, move quickly to put safeguards in place and regulate generative AI, before it spirals out of control and damages more young lives.

“The NSPCC and the majority of the public want tech companies to do the right thing for children and make sure the development of AI doesn’t race ahead of child safety.


“We have the blueprints needed to ensure this technology has children’s wellbeing at its heart, now both Government and tech companies must take the urgent action needed to make generative AI safe for children and young people.”

The AI Action Summit, an international conference, is due to take place in Paris next month.

Derek Ray-Hill, interim chief executive at the Internet Watch Foundation, which seeks out and helps remove child sexual abuse imagery from the internet, said existing laws, as well as future AI legislation, must be made robust enough to ensure children are protected from being exploited by the technology.

“Artificial intelligence is one of the biggest threats facing children online in a generation, and the public is rightly concerned about its impact,” he said.

“While the technology has huge capacity for good, at the moment it is just too easy for criminals to use AI to generate sexually explicit content of children – potentially in limitless numbers, even incorporating imagery of real children. The potential for harm is unimaginable.

“AI companies must prioritise the protection of children and the prevention of AI abuse imagery above any thought of profit. It is vital that models are assessed before they go to market, and rigorous risk mitigation strategies must be in place, with protections built into closed-source models from the outset.

“The upcoming AI Bill is a key opportunity to introduce safeguards for models to prevent the generation of AI-generated child sexual abuse material, and child sexual abuse laws must be updated in line with emerging harms, to prevent AI technology being exploited to create child sexual abuse material.”

By Press Association
