UK police machine learning trials go unregulated

Report says there is a lack of clear guidance on the use of algorithms

The use of machine learning by UK police forces to support decision-making is in its infancy, and there is little research examining how the use of an algorithm influences officers' decision-making in practice.

That is according to defence think tank the Royal United Services Institute (RUSI), which noted there is a limited evidence base on the efficacy and efficiency of different systems, their cost-effectiveness, their impact on individual rights, and the extent to which they serve valid policing aims.

The report, titled "Machine Learning Algorithms and Police Decision-Making Legal, Ethical and Regulatory Challenges", said that there is a lack of clear guidance and codes of practice outlining appropriate constraints governing how police forces should trial predictive algorithmic tools.

"This should be addressed as a matter of urgency to enable police forces to trial new technologies in accordance with data protection legislation, respect for human rights and administrative law principles," the report's authors said.

It added that while machine learning algorithms are currently being used for limited policing purposes, there is potential for the technology to do much more, and the lack of a regulatory and governance framework for its use is concerning.

"A new regulatory framework is needed, one which establishes minimum standards around issues such as transparency and intelligibility, the potential effects of the incorporation of an algorithm into a decision-making process, and relevant ethical issues," the report said.

It said a formalised system of scrutiny and oversight, including an inspection role for Her Majesty's Inspectorate of Constabulary and Fire and Rescue Services, is necessary to ensure adherence to this new framework.

There are also issues over procurement of such systems. The report said that such procurements should "explicitly require that it be possible to retroactively deconstruct the algorithm in order to assess which factors influenced the model's predictions", along with a requirement for the supplier to be able to provide "an expert witness who can provide details concerning the algorithm's operation if needed, for instance in an evidential context".

The report also called for a collaborative, multidisciplinary approach to address the complex issues raised by the use of machine learning algorithms for decision-making.

"At the national level, a working group consisting of members from the fields of policing, computer science, law and ethics should be tasked with sharing real-world' innovations and challenges, examining operational requirements for new algorithms within policing, with a view to setting out the relevant parameters and requirements, and considering the appropriate selection of training and test data," the report said.

It added that it is essential that the officers using machine learning technology are sufficiently trained to do so in a fair and responsible way and "are able to act upon algorithmic predictions in a way that maintains their discretion and professional judgement".
