Sadiq Khan concerned over facial recognition at King's Cross

The mayor of London has written to the developer calling for clearer regulation of the technology

Facial recognition

The mayor of London has written a letter to the King's Cross Central development firm questioning its use of facial recognition technology.

Sadiq Khan's correspondence to Argent, the developer in charge of the 67-acre site, asked for reassurances that the technology was being used legally. The mayor also called for new laws outlining how facial recognition can be used.

On Tuesday, Argent said that its use of facial recognition was to "ensure public safety", but the company has not clarified how long the technology has been in operation, the legal basis for its use, or what systems it has in place to protect the data it collects.

"I've written to the CEO of the King's Cross Development to raise my concerns about the use of facial recognition technology across the site," he tweeted. "All of London's public spaces should be open for all to enjoy and use confidently and independently, avoiding separation or segregation."

In a letter seen by the Guardian, the mayor requests more information about how exactly the technology is being used and for reassurance that the company is liaising with government ministers and the Information Commissioner's Office (ICO) to ensure its use is fully compliant with the law.


The ICO said on Tuesday that it is investigating the use of facial recognition technology in public spaces, focusing on law enforcement and private sector organisations.

Like Khan, the UK's data watchdog has called for stronger guidelines on the use of facial recognition, and its statement was echoed by MP David Davis.

"This is yet another demonstration of the pressing need to regulate facial recognition tech," he tweeted. "The use of this by private companies, with no clear oversight or accountability, is a serious intrusion on the privacy of citizens going about their everyday lives."

13/08/2019: ICO to investigate King's Cross facial recognition use

The Information Commissioner's Office (ICO) is investigating the use of automatic facial recognition technology as concerns grow over its potentially illegal use at the King's Cross development site in London.

The UK's data regulator warned that businesses using the surveillance technology needed to demonstrate its use was "strictly necessary and proportionate" and had a legal basis.

On Monday, the owners of the King's Cross site defended the use of the technology, saying it was "in the interest of public safety and to ensure that everyone who visits has the best possible experience".

The 67-acre site houses 50 buildings, including offices of tech giants such as Google, but ironically it is the landowner, rather than the tech firms, that has admitted to deploying the controversial surveillance technology.

The ICO said it is currently looking at the use of the technology by law enforcement in public spaces and by private sector organisations, including where they are partnering with police forces. The regulator said it will consider taking action where it finds non-compliance with the law.

"Since new data protection laws came into effect on 25 May 2018 there are extra protections for people," an ICO spokesperson said. "These require organisations to assess and reduce the privacy risks of using new and intrusive surveillance technologies like automatic facial recognition.


"Organisations wishing to automatically capture and use images of individuals going about their business in public spaces need to provide clear evidence to demonstrate it is strictly necessary and proportionate for the circumstances and that there is a legal basis for that use.

Concerns were also raised by the surveillance camera commissioner, Tony Porter, who called on ministers to introduce robust and transparent legislation to protect the rights and privacy of the general public.

While its ethical and legal use is under review, the accuracy of the software has also been questioned by privacy campaigners. Last year, the Met Police came under heavy criticism for deploying the technology at the Notting Hill Carnival, coming away with zero arrests and a 98% inaccuracy rate. Despite the lack of success, the Home Office continued with the tests and is even considering using the technology to find missing persons.

