ICO tells police to curb their use of facial recognition tech
Information Commissioner says police need to 'stop to take a breath'
UK police forces need to curb their use of facial recognition technology until a code of practice can be established, the Information Commissioner has said.
The UK's data regulator has called for the government to introduce new laws to mitigate the risk that the technology presents, arguing that current rules do little to address ethical and legal issues associated with live facial recognition.
The recommendations follow an ICO investigation, launched in December 2018, that was sparked by public concern over the various live facial recognition trials around the country.
Information Commissioner Elizabeth Denham said that the investigation raised serious concerns about the use of a technology that relies on huge amounts of sensitive personal information.
"It is right that our police forces should explore how new techniques can help keep us safe," Denham said in a blog post. "But from a regulator's perspective, I must ensure that everyone working in this developing area stops to take a breath and works to satisfy the full rigour of UK data protection law.
"Moving too quickly to deploy technologies that can be overly invasive in people's lawful daily lives risks damaging trust not only in the technology but in the fundamental model of policing by consent. We must all work together to protect and enhance that consensus."
The revelation earlier this year that facial recognition technology was being used at London's King's Cross raised serious concerns about data privacy and regulation, particularly as private developers were able to install the technology without notifying the public.
The ICO said it found the current combination of laws, codes and practices relating to live facial recognition inadequate to drive the ethical and legal approach needed to manage the risk the technology presents. Without a statutory code, it said, continued deployment would increase the likelihood of legal failures and ultimately undermine public confidence.
The ICO argued that a statutory code of practice is necessary to give both the police and the public a clear understanding of when and how live facial recognition systems can be used in public spaces.
The ICO said it will be liaising with the Home Office, the Investigatory Powers Commissioner, the Biometrics Commissioner, the Surveillance Camera Commissioner and policing bodies on how to progress its recommendation for a statutory code of practice.
"The government supports the police as they trial new technologies to protect the public, including facial recognition, which has helped them identify, locate and arrest suspects that wouldn't otherwise have been possible," a Home Office spokesperson told IT Pro.
"The High Court recently found there is a clear and sufficient legal framework for the use of live facial recognition technology. We are always willing to consider proposals to improve the legal framework and promote public safety and confidence in the police."
03/12/2018: The ICO ready to take on the police over facial recognition
The information watchdog has launched an investigation into the UK police's use of facial recognition technology (FRT), which has been trialled around the country.
Information Commissioner Elizabeth Denham has opened an inquiry into the use of the technology after expressing concern over its legality and effectiveness, according to The Daily Telegraph.
Earlier in the year, a report called 'Face Off: The lawless growth of facial recognition technology' suggested London's Met Police had a failure rate of 98% when trialling the technology at the Notting Hill Carnival, misidentifying 95 innocent people.
The same force admitted it had stored the biometric data of 102 innocent people for 30 days, but still went on to plan and use the technology throughout the year. It was later revealed that the Met had not made a single arrest as a result of its use of facial recognition.
South Wales Police also used the technology with poor results, misidentifying 2,400 innocent people and storing their data for a year without their knowledge.
Multiple human rights campaign groups, including Big Brother Watch, which presented a report to Parliament in May, have voiced concerns about the accuracy of the technology used by law enforcement.
Around the same time, Denham wrote a blog post on the ICO website expressing concerns about the legality of the police's use of facial recognition software.
"How does the use of FRT in this way comply with the law?" she wrote. "How effective and accurate is the technology? How do forces guard against bias? What protections are there for people that are of no interest to the police? How do the systems guard against false positives and the negative impact of them?"
Denham said she had set out these concerns in letters to the Home Office and the National Police Chiefs' Council (NPCC), and that she would consider legal action if they were not addressed.