Change is finally on the horizon as UK to criminalise sexually explicit 'deepfakes' - and victims deserve nothing less

7 January 2025, 19:28 | Updated: 7 January 2025, 19:42

The UK Government announced plans to criminalise the creation of sexually explicit deepfakes. Picture: Alamy

By Kelsey Farish

Imagine waking up to find a hyper-realistic, sexually explicit image of yourself circulating online—one you never consented to, and that isn’t even real.


This is the horrifying reality for victims of AI-generated intimate image abuse, also known as “deepfake porn.”

But today, 7 January 2025, the UK Government has announced its plans to criminalise the creation of such sexually explicit deepfakes.

This sort of content began with "face-swapping" videos of famous actresses such as Scarlett Johansson, Gal Gadot and Emma Watson in 2017, but has since spread from quiet corners of the dark web to mainstream social media and messaging apps.

And terrifyingly, anyone – not just famous Hollywood stars – can be victimised.

Fairly convincing deepfakes can be made with just one or two photographs and inexpensive mobile apps, while more sophisticated videos made with desktop software can rival the output of Hollywood studios.

With the technology becoming better and easier to use, it’s no wonder that intimate image abuse is spreading like wildfire.

Research suggests that deepfake pornography makes up 98% of all deepfake videos online, and of these, virtually all of the individuals depicted are women or girls.

Unfortunately, deepfakes are often created anonymously and rapidly shared across a globalised Internet which is increasingly resistant to regulation.

Even with the best laws on the statute books, effective solutions require international co-operation and the ability to hold tech companies accountable.

At the same time, the law must balance protection with freedom of expression.

The technology itself is not always harmful: filmmakers, educators, artists and others can use AI depictions of individuals for legitimate and creative purposes.

This highlights the importance of both intent and consent: satire and artistic expression must remain protected, while non-consensual sexual exploitation must be unequivocally condemned.

This legal complexity has meant that the law has struggled to keep pace with technological advances, leaving victims with little to no protection against non-consensual sexual deepfakes.

But now, thankfully things are starting to change.

The new offences, to be introduced as part of the upcoming Crime and Policing Bill, aim to create a holistic legal framework for tackling image-based sexual abuse.

They include specific offences for taking or recording intimate images without consent, replacing outdated voyeurism laws under the Sexual Offences Act 2003.

Amongst other reforms, the proposed protections will be expanded to apply to images of adults, as existing legislation already covers similar behaviours involving children under 18.

Additionally, the changes will criminalise related actions, such as installing hidden camera equipment with the intent to take intimate images without consent.

Of course, criminalising this deplorable behaviour is only part of the solution: the law requires effective and practical enforcement mechanisms.

The Online Safety Act 2023 requires certain Internet platforms to take proactive measures against harmful content, which means social media companies have a legal duty to prevent the spread of non-consensual sexual deepfakes.

Failure to act can result in substantial fines and enforcement action by Ofcom, shifting the burden of responsibility away from victims and onto the tech companies.

Legislation like this sends a powerful message that the creation of non-consensual sexual deepfakes is no longer a legal grey area.

The journey to effective enforcement will be challenging, but change is finally on the horizon—and victims deserve nothing less.

________________

Kelsey Farish is a media and entertainment solicitor who specialises in AI-generated content and publicity laws.

LBC Views provides a platform for diverse opinions on current affairs and matters of public interest.

The views expressed are those of the authors and do not necessarily reflect the official LBC position.

To contact us email views@lbc.co.uk