
Facial recognition tech used by UK police "breaches privacy"

Civil rights group Liberty has told the Court of Appeal that these systems are racially biased and contravene data protection laws

Automated facial recognition (AFR) systems will "radically" alter the way the UK is policed, the Court of Appeal has been told.

Lawyers for civil rights group Liberty have argued that these systems are racially biased and contravene data protection laws, according to The Guardian.

The legal challenge concerns the use of facial recognition by South Wales Police to conduct mass surveillance. Despite concerns that the technology showed signs of racial bias and repeatedly produced inaccurate results, the force reportedly captured 500,000 images during public trials in 2018.

Among those scanned was Cardiff resident Ed Bridges, who was captured by the technology while out Christmas shopping. In September 2019, a judicial review sided with South Wales Police, but Liberty, which represents Bridges, has brought the case to the Court of Appeal.

"If AFR is rolled out nationally, it will change radically the way that Britain is policed," Dan Squires QC said in a submission to the court. "Connected to a database with the right information, AFR could be used to identify very large numbers of people in a given place at a given time - for example, those present at a protest that the police are monitoring.

"Given the proliferation of databases operated by the police and other public authorities, the exponential increase in information held by public bodies and the ever-increasing practice of sharing that information between public bodies, it is not difficult to imagine that police forces nationally could soon - if they cannot already - have access to photographs of the vast majority of the population."

The legal challenge comes just weeks after a number of tech companies announced plans to either ditch or suspend facial recognition services in light of the Black Lives Matter protests.

At the same time, a collective of more than 1,000 researchers, academics and experts in the field of artificial intelligence (AI) has penned an open letter protesting a research paper in Springer that claims to use neural networks to "predict criminality".

"This upcoming publication warrants a collective response because it is emblematic of a larger body of computational research that claims to identify or predict "criminality" using biometric and/or criminal legal data," the open letter read.

"Such claims are based on unsound scientific premises, research, and methods, which numerous studies spanning our respective disciplines have debunked over the years."
