Campaigners block use of facial recognition and ICE access to data

A long-running programme in San Diego has been halted amid privacy concerns and claims of bias

Activists have claimed victory in a long-running battle with US law enforcement to stop one of the most sophisticated and far-reaching facial recognition programmes.

A facial recognition system used by more than 30 agencies across San Diego, California, will be suspended on 1 January 2020, according to a new agenda published by the San Diego Association of Governments (SANDAG). The technology is deployed by police officers via body-worn cameras and handheld devices.

Campaigners with the Electronic Frontier Foundation (EFF) have also celebrated the fact that SANDAG had disabled US Immigration and Customs Enforcement (ICE) access to law enforcement databases and computer systems.

EFF had previously lobbied to restrict ICE access to law enforcement databases, especially after it emerged that immigration agents were using facial recognition technology.

San Diego’s programme was launched in 2012, and provided more than a thousand facial recognition devices including phones and tablets to numerous agencies. EFF claims that officers conducted more than 65,000 scans with these devices between 2016 and 2018.

The suspension of the programme means SANDAG will not renew its contract with the vendor, FaceFirst, when the agreement expires in March next year.

The campaigners’ opposition to facial recognition centres on the argument that the technology entrenches and exacerbates historical biases, particularly against ethnic minority communities.

There have also been claims that the technology itself is inaccurate, which can leave systems open to abuse.

In New York, for instance, police officers reportedly misused the technology to facilitate arrests in investigations where CCTV images were too unclear to identify a suspect.

The technology has also been trialled periodically in the UK over the last few years, with police forces across the country keen to take advantage of its benefits for law enforcement.

The Information Commissioner’s Office (ICO) stepped into the debate earlier this year, however, to warn police forces that the data protection risks must be assessed. The ICO added that any software deployed must guarantee that racial biases are eliminated. 
