More than 250 UK celebrities are victims of deepfake pornography, probe finds
21 March 2024, 20:54
Channel 4 news presenter Cathy Newman watched footage of her own image superimposed on to pornography and said it felt like a violation.
More than 250 British celebrities have been victims of deepfake pornography, according to an investigation by Channel 4 News.
Among them is the channel’s news presenter, Cathy Newman, who said she felt violated on watching digitally altered footage in which her face was superimposed on to pornography using artificial intelligence (AI).
The broadcaster, which aired its investigation on Thursday evening, said it analysed the five most visited deepfake websites and found that 255 of the almost 4,000 famous individuals listed were British, all but two of them women.
In her report, Newman watched the deepfake footage of herself and said: “It feels like a violation.
“It just feels really sinister that someone out there who’s put this together, I can’t see them, and they can see this kind of imaginary version of me, this fake version of me.
“You can’t unsee that. That’s something that I’ll keep returning to.
“And just the idea that thousands of women have been manipulated in this way, it feels like an absolutely gross intrusion and violation.
“It’s really disturbing that you can, at a click of a button, find this stuff, and people can make this grotesque parody of reality with absolute ease.”
Channel 4 News said it contacted more than 40 celebrities for the investigation, all of whom were unwilling to comment publicly.
The broadcaster also said it found that more than 70% of visitors arrived at deepfake websites using search engines like Google.
Advances in AI have made it easier to create digitally altered and fake images.
Industry experts have warned of the danger posed by AI-generated deepfakes and their potential to spread misinformation, particularly in a year that will see major elections in many countries, including the UK and the US.
Earlier in the year, deepfake images of pop star Taylor Swift were posted to X, formerly Twitter, and the platform blocked searches linked to the singer after fans lobbied the Elon Musk-owned platform to take action.
The Online Safety Act makes it a criminal offence to share, or threaten to share, a manufactured or deepfake intimate image or video of another person without his or her consent, but it does not criminalise the creation of such content.
In its investigation, Channel 4 News claimed the individuals most commonly targeted by deepfake pornography are women who are not in the public eye.
Newman spoke to Sophie Parrish, who started a petition calling for a change in the law after the person who created digitally altered pornography of her was detained by police but faced no further legal action.
She told the PA news agency in January that she was sent Facebook messages from an unknown user, which included a video of a man masturbating over her and using a shoe to pleasure himself.
“I felt very, I still do, dirty – that’s one of the only ways I can describe it – and I’m very ashamed of the fact that the images are out there,” she said.
Tory MP Caroline Nokes, who is chairwoman of the Women and Equalities Committee, told Channel 4 News: “It’s horrific… this is women being targeted.
“We need to be protecting people from this sort of deepfake imagery that can destroy lives.”
In a statement to the news channel, a Google spokesperson said: “We understand how distressing this content can be, and we’re committed to building on our existing protections to help people who are affected.
“Under our policies, people can have pages that feature this content and include their likeness removed from Search.
“And while this is a technical challenge for search engines, we’re actively developing additional safeguards on Google Search – including tools to help people protect themselves at scale, along with ranking improvements to address this content broadly.”
Ryan Daniels, from Meta, said in a statement to the broadcaster: “Meta strictly prohibits child nudity, content that sexualises children, and services offering AI-generated non-consensual nude images.”
Elena Michael, a campaigner from the group NotYourPorn, told Channel 4 News: “Platforms are profiting off this kind of content.
“And not just porn companies, not just deepfake porn companies, social media sites as well. It pushes traffic to their site. It boosts advertising.”