
10 March 2025, 11:36 | Updated: 12 March 2025, 08:16
Online Safety Day is today, Monday 10th March from 7am to midnight on LBC, available across the UK on Global Player on your smart speaker, iOS or Android device; on DAB digital radio and TV, at LBC.co.uk and in London on 97.3 FM.
LBC can reveal there's been a four-fold increase in the number of AI-generated child sexual abuse images discovered online over the last twelve months.
The majority of these images are surfacing on the clear web, and, in many cases, AI is being used to manipulate images of previous abuse victims.
LBC has been speaking exclusively with the Internet Watch Foundation for Online Safety Day.
Interim CEO of the IWF, Derek Ray-Hill, told LBC: “It would be no exaggeration to say that child sexual abuse material is proliferating at an almost uncontrollable rate online, particularly with the advent of artificial intelligence and the ability for people to access it relatively cheaply.”
The Internet Watch Foundation is one of only a handful of organisations around the world with the power to proactively search for harmful material, but the challenge is enormous.
“We don't know the full quantum of child sexual abuse material that's out there in the online universe. No one does. We only know what we encounter, what we identify and what we remove. What we do know is that in the last year we have seen a fourfold increase of images that involve the use of AI technology. So, we see both trends growing at an alarming rate.”
Through AI manipulation, offenders are using pictures of previous victims of child sexual abuse to create new scenarios and new abuse images, in some cases tailoring those images to the desires of the person buying them. Last year, Hugh Nelson, 27, from Bolton, was jailed for 18 years for doing exactly that.
Derek Ray-Hill added: “The really horrific nature of this crime is that you can be a victim, your perpetrator can be identified, found and punished, serve their punishment, and you can still be being re-victimised years later online as a result of those original images and online content of your abuse being shared with perpetrators around the world.
“Now that you have artificial intelligence, you have the ability to recreate artificial images based on that original abuse, and we are talking about a very sick community of people who exist in the dark shadows of the online universe and are a form of collector. So, the second you recreate an image of someone who is a historic victim, you give them another sick collector's item.”
Hannah Swirsky, Head of Policy at the Internet Watch Foundation, told LBC: “Over the last two years, our analysts have seen a real development in AI-generated child sexual abuse material. We've now got to the point where it's very difficult to distinguish whether an image is a photograph or whether it's AI generated.
“It's also really important to note that it's now becoming a lot more visible on the clear web, so this isn't just in the darkest corners of the internet. As well as on the dark web, we're seeing these images being shared on the clear web.
“You see a whole range of different types of AI-generated child sexual abuse. Sometimes famous children's faces are being used and turned into child sexual abuse images, or nudifying tools are used to nudify an image of a child and turn it into an abusive and illegal image. There are also images of known victims or survivors: offenders are using those images, and using AI, to create new images or new scenarios of those victims and survivors.
"We’re seeing images of real children going through nudification tools, so this could be your own child or another child that their images are put on a website and it's an innocent image of them fully clothed, but AI tools are being used to to nudify that child for example, and this can lead in some cases to sexual extortion, which is a really concerning and worrying trend, where children are being blackmailed.
"So, even though it might not be a real image of that child, offenders and criminals are still attempting to blackmail children for threat of sharing those images with their peers, for example.”
The Internet Watch Foundation has written to the Prime Minister to express concern about a number of loopholes it has identified within the Online Safety Act. Hannah said: “So I think for us, while the OSA is a really important and landmark piece of legislation, the challenge is always going to be keeping up with the pace at which technology is developing, and the legislative process often takes a lot longer.
“So, for example, when it comes to private communications, we need more legislation to explicitly cover them, because we know it's often through private messaging that offenders might be grooming children and then coercing them in order to generate child sexual abuse material.”
CEO Derek Ray-Hill added: “There's no doubt that more people will be safer online as a result of the Online Safety Act. But by its very nature, it's taken years to consult on and years to get through Parliament, which is correct and constitutional. But it does mean it's out of date. The Prime Minister can influence the legislative agenda to prioritise improvements to the OSA that will remove the 'Safe Harbour' and 'technically feasible' aspects.
“Ten years ago Keir Starmer, as Director of Public Prosecutions, took a stand, giving the IWF unprecedented powers to proactively hunt down child sexual abuse imagery online. Now, we need him to act decisively again. The new regulations we’ve all worked so hard to bring in threaten to leave gaping loopholes for criminals to exploit.
“The Safe Harbour principle effectively says that if you're trying to follow the rules, even if you let things slip through your fingers or you're not making every advancement, you're effectively okay.
"That gives major technology platforms, some of the world's most sophisticated technology platforms, with some of the most successful development teams in the world, a get out of jail clause, quite literally, that is unacceptable. These people should be setting the bar higher.
"These development teams that are developing the next great add on to a messenger service need to be developing the same to keep children safe online to identify and remove child sexual abuse material and all sorts of other online harm. The safe harbour principle takes the pressure off them, doing so.
"The second thing is we need to remove the clause that makes it technically feasible. We want to ensure that the platforms don't get to decide whether or not they're meeting the bar.”