Campaigners block use of facial recognition and ICE access to data

A long-running programme in San Diego has been halted amid privacy concerns and claims of bias

Activists have claimed victory in a long-running battle with US law enforcement to stop one of the most sophisticated and far-reaching facial recognition programmes.

A facial recognition system used by more than 30 agencies across San Diego, California, will be suspended on 1 January 2020, according to a new agenda published by the San Diego Association of Governments (SANDAG). The technology is deployed by police officers via body-worn cameras and handheld devices.

Campaigners with the Electronic Frontier Foundation (EFF) have also celebrated the fact that SANDAG had disabled US Immigration and Customs Enforcement (ICE) access to law enforcement databases and computer systems.

EFF had previously lobbied to restrict ICE access to law enforcement databases, especially after it was revealed that immigration agents were using facial recognition technology.

San Diego’s programme was launched in 2012, and provided more than a thousand facial recognition devices including phones and tablets to numerous agencies. EFF claims that officers conducted more than 65,000 scans with these devices between 2016 and 2018.

Suspension of the programme means SANDAG will not renew its contract with the vendor, FaceFirst, when the agreement expires in March next year.

The campaigners' opposition to facial recognition centres on the argument that applying the technology entrenches and exacerbates historical biases, particularly against ethnic minority communities.

There have also been claims that the technology itself is inaccurate, which can leave systems open to abuse.

In New York, for instance, police officers reportedly misused the technology to facilitate arrests when CCTV images were too unclear to identify suspects in particular investigations.

The technology has also been trialled periodically in the UK over the last few years, with police forces across the country keen to take advantage of its benefits for law enforcement.

The Information Commissioner’s Office (ICO) stepped into the debate earlier this year, however, to warn police forces that the data protection risks must be assessed. The ICO added that any software deployed must guarantee that racial biases are eliminated. 

