Met Police rolls out controversial facial recognition tech in London

Today will see eight-hour trials take place in the British capital

London's Metropolitan Police is testing its facial recognition technology in the capital today. This week's trials are the seventh the Met has run in its bid to get the controversial technology off the ground.

Testing began yesterday and continues today, running for eight hours each day.

Rather than scanning people without their consent, the Met is inviting people to take part in the testing on a voluntary basis. Critics of the scheme have been vocal, suggesting that people who turn down the invites will look "suspicious".

Bigger grievances, meanwhile, pertain to issues of privacy. Big Brother Watch, a prominent civil liberties advocacy group, has denounced the technology as "authoritarian, dangerous and lawless," stating that it constitutes a "breach of fundamental rights to privacy and freedom of speech and assembly".

If you're keen to avoid being roped into the testing, you'll want to steer clear of central London. The trials are being conducted in the capital's tourist hubs, with Soho, Piccadilly Circus and Leicester Square cited as prime locations.

Be wary of pranksters, too; the Met has stated that only clearly uniformed officers will be carrying out the trials, and it maintains that anyone who asks not to be scanned will not be marked down as "suspicious".

For its part, the Met is keen to forge ahead with the trials. Information Commissioner Elizabeth Denham has pointed to the technology's potential, writing in a blog post that it could bring "significant public safety benefits". If approved for wider use, the system would allow officers to identify people wanted by the police, with the software doing most of the heavy lifting.

Previous trials, however, have a mixed record, and that's putting it diplomatically: a trial back in July produced a grand total of zero arrests, which was probably just as well, given that a May study found the software returned false positives in up to 98% of cases.

In the meantime, Big Brother Watch is ploughing on in its role as a watchdog; it recently took to Twitter to name, shame and snap one of the facial recognition vans conducting the trials today. "This looks an awful lot like covert surveillance," the group commented on a picture of the seemingly innocuous plain green van.

This is one of the facial recognition vans out in London today at Cambridge Circus. This looks an awful lot like covert surveillance. Also plain clothed officers hanging around. pic.twitter.com/JNGINzvR6A

Big Brother Watch (@bbw1984) December 17, 2018
