Met Police rolls out controversial facial recognition tech in London

Today will see eight-hour trials take place in the British capital

London's Metropolitan Police is testing its facial recognition technology in the capital today. This week's trials mark the Met's seventh attempt to get the controversial technology off the ground.

The testing took place yesterday and continues today over a period of eight hours each day.

Rather than scanning people without their consent, the Met is inviting people to take part in the testing on a voluntary basis. Critics of the scheme have been vocal, suggesting that people who turn down the invites will look "suspicious".

Bigger grievances, meanwhile, pertain to issues of privacy. Big Brother Watch, a prominent civil liberties advocacy group, has denounced the technology as "authoritarian, dangerous and lawless," stating that it constitutes a "breach of fundamental rights to privacy and freedom of speech and assembly".

If you're keen to avoid being roped into the testing period, you'll want to avoid central London. The areas where the tests are being conducted fall within the tourist hubs of the capital, with Soho, Piccadilly Circus and Leicester Square touted as prime locations.

Be wary of pranksters, too; the Met has confirmed that only clearly uniformed officers will be carrying out the trials, while maintaining that anyone requesting not to be scanned will not be marked down as "suspicious".

For its part, the Met is keen to forge ahead with the trials. Information Commissioner Elizabeth Denham has sung the facial recognition system's praises, positing in a blog post that it could bring "significant public safety benefits". If approved for use, the system would allow the software to identify people wanted by the police, with the technology doing most of the heavy lifting.

However, previous trials have shown something of a mixed record. And that's us being diplomatic: a trial back in July led to a grand total of zero arrests, which is probably a good thing, since a May study found the software flagged false positives in up to 98% of cases.

In the meantime, Big Brother Watch is ploughing on in its role as a watchdog; it recently took to Twitter to name, shame and snap one of the facial recognition vans conducting the trials today. "This looks an awful lot like covert surveillance," the group commented on a picture of the seemingly innocuous plain green van.

This is one of the facial recognition vans out in London today at Cambridge Circus. This looks an awful lot like covert surveillance. Also plain clothed officers hanging around. pic.twitter.com/JNGINzvR6A

Big Brother Watch (@bbw1984) December 17, 2018
