Peers urge ministers to step up efforts to criminalise deepfake abuse
13 December 2024, 14:54
The calls came as the House of Lords considered the Non-Consensual Sexually Explicit Images and Videos (Offences) Bill.
Ministers have been urged to speed up efforts to criminalise deepfake abuse amid warnings women can no longer choose who owns their naked image.
Baroness Owen of Alderley Edge has proposed criminalising the creation and solicitation of intimate images of people made without their consent.
She has tabled a proposed law in response to concerns over how technology is aiding the abuse of women, with so-called “nudify” apps allowing users to create fake nude images or videos of other people through generative artificial intelligence.
Lady Owen’s Non-Consensual Sexually Explicit Images and Videos (Offences) Bill, which would apply to England and Wales, aims to create new offences, with those found guilty facing a fine, up to six months in prison or both.
The courts would also be able to order the deletion and destruction of physical and digital images.
Labour’s general election manifesto stated the party would ban the creation of sexually explicit deepfakes, but Lady Owen expressed disappointment at ministers for not supporting her Bill.
Justice minister Lord Ponsonby of Shulbrede offered to meet peers to discuss the Bill and how the Government can “try and meet the objectives of this Bill through other legislation”.
Introducing the Bill, Lady Owen told the House of Lords: “I believe in a woman’s right to choose. The right for her to choose what she does with her own body. The right for her to choose who owns her naked image.
“With the dawn of AI technology, women have lost this ability. A woman can no longer choose who owns an intimate image of her.
“Technology has made it possible for them to be created by anyone, anywhere, at any time, regardless of whether she consents.
“This Bill will return power to where it belongs – the hands of each individual woman.”
Lady Owen added: “Deepfake abuse is the new frontier of violence against women and the non-consensual creation of a woman’s naked image is an act of abuse.”
The peer said research had found that one app processed 600,000 images in its first three weeks, while the largest site “dedicated to deepfake abuse” receives 13.4 million hits every month.
She added: “It’s a disproportionately sexist form of abuse, with 99% of all sexually explicit deepfakes being of women.
“Women are sick and tired of their images being used without their consent to misrepresent, degrade and humiliate them.”
Peers heard how survivors experience “untold trauma, anxiety and distress” as a result of deepfakes, with Lady Owen adding: “All women are forced to live under the ever-present threat that anyone can own sexually explicit content of them.
“The current law is a patchwork of legislation that cannot keep pace and means we’re forever playing catch-up whilst the abuse of women races ahead in a technological revolution of degradation.”
Lady Owen said the measures in her Bill, if approved by Parliament, would take effect as soon as the Bill received royal assent.
The former Downing Street special adviser added: “The victims of intimate image abuse have waited long enough.
“Given the rapid proliferation of this abuse, every day that we delay is another day when women have to live under this ever-present threat.”
Lady Owen said she had met ministers to discuss her Bill, adding: “I’m disappointed by their response suggesting they will not support this vital Bill and their apparent willingness to delay on legislating on image-based abuse.”
Liberal Democrat peer Baroness Grender said the Bill is “essential”, adding: “Women can’t suffer delay on this issue.”
The Bill was given a second reading and will undergo further scrutiny at a later date, although it is unlikely to become law in its current form without Government support.