UK police warn of bias in AI tools

Legal pitfalls and the risk of directing resources to the wrong areas make law enforcement wary of the tech


UK police have said that a lack of consistent guidelines for automated technology has them wary of its potential to "amplify" prejudices, according to a report by the Centre for Data Ethics and Innovation (CDEI).

Data analytics can recommend police actions such as stop and search, which could exacerbate biases, the report found. Machine-learning tools that populate their datasets from existing records may inherit prejudices from the arresting officers, teaching the systems to suspect certain demographics more frequently than others.


Software can also predict crime hot-spots and advise police to send officers there, leading to more arrests but at the cost of policing in other locations.

"We pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there's more policing going into that area," one anonymous interviewee said.

The CDEI urged police forces to consider how algorithmic bias may impact their allocation of resources in this way.

The study also warned that algorithmic fairness is not merely a matter of data bias, but must be considered within a broader "operational, organisational, and legal context" to prevent analytics from dominating decision-making.

"Officers often disagree with the algorithm," said one of the study's participants, "The point where you don't get that challenge, that's when people are putting that professional judgement aside."


Without such professional judgement, the CDEI anticipates a spike in discrimination cases brought by those who were scored "negatively" by automated policing tools.


"Given the lack of any government policy for police use of data analytics, it means that police forces are going to be reluctant to innovate," said Alexander Babuta, research fellow at the Royal United Service Institute.

Andy Davies, consultant for the Police and Intelligence Services at SAS UK and Ireland, added: "[AI] should not be used as a standalone solution. Rather, this new technology should complement the work of the emergency services by providing them with the necessary insights that they need to make informed, and potentially life-saving decisions."

Earlier this year, the ICO's executive director for technology policy and innovation, Simon McDougall, proposed that GDPR principles be applied to AI, including the right to object to profiling and the right to dispute decisions made exclusively by a machine.

The Royal United Services Institute will publish a final report for the CDEI in early 2020, which will include a Code of Practice for the use of artificial intelligence in policing.

