ICO: Use of facial recognition tech in public spaces deeply concerning
ICO investigations found most companies using the technology tended to be "immature" in their approach to data protection
Information Commissioner Elizabeth Denham has described the use of live facial recognition (LFR) technology in public spaces as "deeply concerning", and has called on companies not to deploy the technology simply to "save money".
The statement comes after the Information Commissioner's Office (ICO) conducted six investigations into “planned or actual use of LFR in public places” for reasons ranging from public safety concerns to personalised advertising.
The regulator found that organisations planning to or already deploying the technology tended to be “immature” in their data protection “compliance considerations”.
“Our investigations found that controllers often gave insufficient consideration to the necessity, proportionality, and fairness of the use of LFR systems and failed to be sufficiently transparent,” Denham stated in an Information Commissioner’s Opinion notice.
This has caused Denham to become “deeply concerned about the potential for live facial recognition (LFR) technology to be used inappropriately, excessively or even recklessly”.
“When sensitive personal data is collected on a mass scale without people’s knowledge, choice or control, the impacts could be significant. We should be able to take our children to a leisure complex, visit a shopping centre or tour a city to see the sights without having our biometric data collected and analysed with every step we take,” she added.
The newly-published Information Commissioner's Opinion includes guidance for organisations considering implementing LFR into their practices, asking controllers to “not use LFR simply because it is available, it improves efficiency or saves money, or is part of a particular business model or proffered service”.
However, Denham did acknowledge that the technology may offer benefits.
“LFR has the potential to do significant good – helping in an emergency search for a missing child, for example,” she said.
Therefore, organisations should work on “building public trust and confidence” in the way the data is used, so that “the benefits derived from the technology can be fully realised”.
“Without trust, the benefits the technology may offer are lost,” Denham stated.
While the Opinion published today is mostly targeted towards private companies, it builds on the ICO’s 2019 Opinion into the use of LFR by police forces.
Ten months after that report was published, South Wales Police's use of facial recognition technology was deemed unlawful, with the Court of Appeal ruling that the deployment breached human rights, data protection, and equality laws.
The use of live facial recognition in public spaces also hit the headlines in 2019, when private developer Argent was found to have used the technology at its 67-acre King's Cross Central development site.
London Mayor Sadiq Khan questioned the need for the technology at the time, adding that public spaces "should be open for all to enjoy and use confidently and independently, avoiding separation or segregation".