Peers challenge police use of artificial intelligence

31 March 2022, 08:52 | Updated: 25 July 2023, 11:52

Facial recognition technology in use in London. Picture: PA

The Lords Justice and Home Affairs Committee raised concerns about a lack of oversight for new technologies used by law enforcement agencies.

Law enforcement agencies’ use of artificial intelligence and facial recognition technology is not subject to proper oversight and risks exacerbating discrimination, peers have warned.

New technologies were being created in a “new Wild West”, with the law and public awareness failing to keep pace with developments, a parliamentary committee said.

The Lords Justice and Home Affairs Committee warned that the lack of oversight meant “users are in effect making it up as they go along”.

A board detailing facial recognition technology in use in Leicester Square, London (Kirsty O’Connor/PA)

The cross-party group said AI had the potential to improve people’s lives but could have “serious implications” for human rights and civil liberties in the justice system.

“Algorithms are being used to improve crime detection, aid the security categorisation of prisoners, streamline entry clearance processes at our borders and generate new insights that feed into the entire criminal justice pipeline,” the peers said.

Scrutiny was not taking place to ensure that new tools were “safe, necessary, proportionate and effective”, the committee said.

“Instead, we uncovered a landscape, a new Wild West, in which new technologies are developing at a pace that public awareness, government and legislation have not kept up with.”

Police forces and other agencies were buying equipment in a “worryingly opaque” market, with details of how systems work kept secret due to firms’ insistence on commercial confidentiality.

The peers also had concerns about AI being used in “predictive policing” – forecasting crime before it happened.

There was a danger it could make problems of discrimination worse by embedding in algorithms the “human bias” contained in the original data.

Professor Karen Yeung, an expert in law, ethics and informatics at the University of Birmingham, told the committee that “criminal risk assessment” tools were not focused on white-collar crimes such as insider trading, due to the lack of data, but were instead focused on the kind of crimes for which there was more information.

Prof Yeung said: “This is really pernicious. We are looking at high-volume data that is mostly about poor people, and we are turning it into prediction tools about poor people.

“We are leaving whole swathes of society untouched by those tools.”

The peers called for a mandatory register of algorithms used in criminal justice tools, a national body to set standards and certify new technology, and new local ethics committees to oversee its use.

Baroness Hamwee, the Liberal Democrat chairwoman of the committee, said: “What would it be like to be convicted and imprisoned on the basis of AI which you don’t understand and which you can’t challenge?

“Without proper safeguards, advanced technologies may affect human rights, undermine the fairness of trials, worsen inequalities and weaken the rule of law. The tools available must be fit for purpose, and not be used unchecked.”

By Press Association
