ICO neuters UK police use of facial recognition technology

Forces must assess data protection risks and eradicate racial bias from the software or face GDPR action

Man using facial recognition on phone

Live facial recognition (LFR) deployments by police forces across the UK have been dealt a blow after the UK data regulator confirmed its use falls under the umbrella of data protection.

In the midst of ongoing rows between law enforcement agencies and digital rights campaigners, the Information Commissioner's Office (ICO) has clarified LFR "is a potential threat to privacy" and a "high priority area".


The data regulator has advised police forces using facial recognition to carry out a full data protection impact assessment (DPIA), which must then be updated for each deployment. This is due to the sensitive nature of the processing, the volume of people affected, and the intrusion that can arise.

Forces must then submit these assessments to the ICO for consideration prior to any discussions between the two parties as to how the privacy risks can be mitigated. Any violations will be adjudicated under the General Data Protection Regulation (GDPR) and the Data Protection Act (DPA) 2018.

"We understand the purpose is to catch criminals," the Information Commissioner Elizabeth Denham said in a blog post. "But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives.


"I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR."


A small number of police forces, including the Met Police and South Wales Police, have been trialling facial recognition in public spaces for more than a year. But the effectiveness of the software has come under heavy scrutiny, particularly after research found the failure rate can be as high as 98%.
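Figures like a 98% failure rate often reflect base-rate arithmetic: when genuine watchlist targets are a tiny fraction of the people scanned, even a reasonably accurate system generates mostly false alerts. The sketch below illustrates this with invented numbers (crowd size, target count, and accuracy rates are all assumptions for demonstration, not figures from any real deployment):

```python
# Illustrative only: why LFR "failure rates" among alerts can approach 98%.
# All figures below are invented assumptions, not real deployment data.

crowd = 50_000    # people scanned at an event (assumed)
targets = 10      # genuinely wanted individuals present (assumed)
tpr = 0.80        # true positive rate of the system (assumed)
fpr = 0.008       # false positive rate per non-target (assumed)

true_alerts = targets * tpr                # correct matches
false_alerts = (crowd - targets) * fpr     # innocent people wrongly flagged
share_false = false_alerts / (true_alerts + false_alerts)

print(f"{share_false:.0%} of alerts are false matches")  # roughly 98%
```

With so few real targets in the crowd, the false alerts swamp the true ones, which is why accuracy claims about the underlying algorithm and the failure rate observed in the field can differ so sharply.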

Several reports alluding to high inaccuracy rates, as well as concerns about legality, led the ICO to launch a probe into ongoing trials towards the end of last year. This investigation is ongoing and is also pending the outcome of a legal challenge against South Wales Police made in May.

"Legitimate aims have been identified for the use of LFR," Denham continued, adding the ICO has learned a lot from its deep dive into how LFR works in practice.

"But there remain significant privacy and data protection issues that must be addressed, and I remain deeply concerned about the rollout of this technology."


The ICO argues LFR differs significantly from CCTV, and that facial recognition systems haven't yet resolved potential instances of racial bias, whereby more false positive matches are generated for certain ethnic groups.
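The kind of disparity the ICO describes is typically measured by comparing false positive rates across demographic groups in audit data. A minimal sketch of that comparison, using entirely invented per-group tallies (the group names and counts are illustrative assumptions, not real audit figures):

```python
# Hypothetical illustration of a per-group false positive rate comparison.
# The group names and tallies are invented for demonstration purposes.

def false_positive_rate(false_matches: int, non_targets: int) -> float:
    """Fraction of people not on the watchlist who were wrongly flagged."""
    return false_matches / non_targets

# Invented match-log tallies from a fictional LFR deployment
groups = {
    "group_a": {"false_matches": 8,  "non_targets": 10_000},
    "group_b": {"false_matches": 35, "non_targets": 10_000},
}

for name, g in groups.items():
    rate = false_positive_rate(g["false_matches"], g["non_targets"])
    print(f"{name}: false positive rate = {rate:.2%}")
```

If the rates diverge markedly between groups, as in this fabricated example, the system is producing exactly the kind of unequal treatment the regulator wants forces to rule out before deployment.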

In addition to full DPIAs, police forces must produce a bespoke 'appropriate policy document' setting out why, where, when and how LFR is being used. They must also ensure the algorithms within the software do not treat the race or sex of individuals unfairly.

This aspect of deployments, however, may be beyond the reach of individual police forces given the software powering the technology is developed by a third-party vendor.

Use of such technology in the criminal justice system, particularly artificial intelligence (AI) and facial recognition technology, has come under fire for bias and discrimination in both the UK and the US.

For example, according to one report, the Met Police's counterparts in New York were abusing facial recognition technology to arrest people when CCTV images were too unclear to identify suspects.

In extreme cases, New York Police Department (NYPD) officers took high-resolution pictures of a suspect's celebrity doppelgänger to generate a match against another license database.

