UK police warn of bias in AI tools

Legal pitfalls and the potential to point resources in the wrong area make law enforcement wary of the tech


UK police have said that a lack of consistent guidelines for automated technology has them wary of its potential to "amplify" prejudices, according to a report by the Centre for Data Ethics and Innovation (CDEI).

Data analytics can recommend police actions like stop and search, which could exacerbate biases, the report found. Machine-learning tools that use existing records to populate their datasets may derive prejudices from the arresting officers, teaching them to suspect certain demographics more frequently than others.

Software can also predict crime hotspots and advise police to send officers there, leading to more arrests in those areas but at the cost of policing in other locations.

"We pile loads of resources into a certain area and it becomes a self-fulfilling prophecy, purely because there's more policing going into that area," one anonymous interviewee said.


The CDEI urged police forces to consider how algorithmic bias may impact their allocation of resources in this way.

The study also warned that algorithmic fairness is not merely a matter of data bias, but must be considered within a broader "operational, organisational, and legal context" to prevent analytics from dominating decision-making.

"Officers often disagree with the algorithm," said one of the study's participants. "The point where you don't get that challenge, that's when people are putting that professional judgement aside."

Without such professional judgement, the CDEI anticipates a spike in discrimination cases brought by those who were scored "negatively" by automated policing tools.

"Given the lack of any government policy for police use of data analytics, it means that police forces are going to be reluctant to innovate," said Alexander Babuta, research fellow at the Royal United Services Institute.

Andy Davies, consultant for the Police and Intelligence Services at SAS UK and Ireland, added: "[AI] should not be used as a standalone solution. Rather, this new technology should complement the work of the emergency services by providing them with the necessary insights that they need to make informed, and potentially life-saving decisions."

Earlier this year, the ICO's executive director for technology policy and innovation, Simon McDougall, proposed that GDPR principles be applied to AI, including the right to object to profiling and the right to dispute decisions made exclusively by a machine.

The Royal United Services Institute will publish a final report for the CDEI in early 2020, which will include a Code of Practice for the use of artificial intelligence in the police force.

