
Campaigners block use of facial recognition and ICE access to data

A long-running programme in San Diego has been halted amid privacy concerns and claims of bias

Activists have claimed victory in a long-running battle with US law enforcement to stop one of the most sophisticated and far-reaching facial recognition programmes.

A facial recognition system used by more than 30 agencies across San Diego, California will be suspended on 1 January 2020, according to a new agenda published by the San Diego Association of Governments (SANDAG). The technology is deployed by police officers via body-worn cameras and handheld devices.

Campaigners with the Electronic Frontier Foundation (EFF) have also celebrated the fact that SANDAG had disabled US Immigration and Customs Enforcement (ICE) access to law enforcement databases and computer systems.

EFF had previously lobbied to restrict ICE access to law enforcement databases, especially after it was revealed that immigration agents were using facial recognition technology.

San Diego’s programme was launched in 2012 and provided more than a thousand facial recognition devices, including phones and tablets, to numerous agencies. EFF claims that officers conducted more than 65,000 scans with these devices between 2016 and 2018.

The suspension of the programme also means SANDAG will not renew its contract with the vendor, FaceFirst, when the agreement expires in March next year.

The campaigners’ opposition to facial recognition centres on the argument that historical biases are entrenched and exacerbated by the technology’s use, particularly against ethnic minority communities.


There have also been claims that the technology itself is inaccurate, which can leave systems open to abuse.

In New York, for instance, police officers reportedly abused the technology to facilitate arrests in cases where CCTV images were too unclear to identify a suspect.

The technology has also been trialled periodically in the UK over the last few years, with police forces across the country keen to take advantage of its benefits for law enforcement.

The Information Commissioner’s Office (ICO) stepped into the debate earlier this year, however, to warn police forces that the data protection risks must be assessed. The ICO added that any software deployed must guarantee that racial biases are eliminated. 

