UK police warn of bias in AI tools

Legal pitfalls and the potential to point resources in the wrong area make law enforcement wary of the tech


UK police have said that a lack of consistent guidelines for automated technology has them wary of its potential to "amplify" prejudices, according to a report by the Centre for Data Ethics and Innovation (CDEI).

Data analytics can recommend police actions such as stop and search, which could exacerbate biases, the report found. Machine-learning tools trained on existing records may inherit the prejudices of the officers who made those arrests, learning to suspect certain demographics more frequently than others.

Software can also predict crime hot-spots and advise police to send officers there, leading to more arrests but at the cost of reduced policing in other locations.

"We pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there's more policing going into that area," one anonymous interviewee said.


The CDEI urged police forces to consider how algorithmic bias may impact their allocation of resources in this way.

The study also warned that algorithmic fairness is not merely a matter of data bias, but must be considered within a broader "operational, organisational, and legal context" to prevent analytics from dominating decision-making.

"Officers often disagree with the algorithm," said one of the study's participants. "The point where you don't get that challenge, that's when people are putting that professional judgement aside."

Without such professional judgement, the CDEI anticipates a spike in discrimination cases brought by those who were scored "negatively" by automated policing tools.

"Given the lack of any government policy for police use of data analytics, it means that police forces are going to be reluctant to innovate," said Alexander Babuta, research fellow at the Royal United Services Institute.

Andy Davies, consultant for the Police and Intelligence Services at SAS UK and Ireland, added: "[AI] should not be used as a standalone solution. Rather, this new technology should complement the work of the emergency services by providing them with the necessary insights that they need to make informed, and potentially life-saving decisions."


Earlier this year, the ICO's executive director for technology policy and innovation, Simon McDougall, proposed that GDPR principles be applied to AI, including the right to object to profiling and the right to dispute decisions made exclusively by a machine.

The Royal United Services Institute will publish a final report for the CDEI in early 2020, which will include a Code of Practice for the use of artificial intelligence in policing.
