MPs call for halt to live facial recognition trials
The Science and Technology Committee has renewed its concerns over bias, privacy and inaccurate systems
MPs have called for a halt to trials of live facial recognition (LFR) technology by law enforcement until regulations are in place that address issues such as bias and data retention.
The House of Commons Science and Technology Committee has said there should be no further trials of LFR, citing concerns over its accuracy and capacity to infringe upon the privacy of UK citizens.
"There is growing evidence from respected, independent bodies that the 'regulatory lacuna' surrounding the use of automatic facial recognition has called the legal basis of the trials into question," the Committee said in a report. "The Government, however, seems to not realise or to concede that there is a problem."
It added that these concerns were raised a year ago and that it is disappointed by the Home Office's lack of action: "We reiterate our recommendation from our 2018 report that automatic facial recognition should not be deployed until concerns over the technology's effectiveness and potential bias have been fully resolved."
The Committee is now calling on the government to issue a moratorium on the current use of LFR and to suspend all trials until a legislative framework has been introduced. This should also include guidance on trial protocols and an evaluation system.
"The UK government should learn from the Scottish government's approach to biometrics and commission an independent review of options for the use and retention of biometric data that is not currently covered by the Protection of Freedoms Act 2012," the report said. "This process should culminate in legislation being brought forward that seeks to govern current and future biometric technologies."
Live, or automated, facial recognition is a system used alongside CCTV: images of crowds or events captured by cameras are matched against faces stored in a database, using machine learning-powered image processing. Possible matches are flagged and then verified via a different method of identification. Matched images, regardless of accuracy, can therefore be saved for a number of weeks.
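In outline, the matching step works by comparing a numeric "embedding" of the captured face against embeddings of faces on a watchlist, flagging any whose similarity exceeds a threshold. The sketch below is purely illustrative (the function name, threshold value and random vectors standing in for real model embeddings are all assumptions, not any force's actual system):

```python
import numpy as np

def flag_matches(probe, database, threshold=0.8):
    """Compare a probe face embedding against a watchlist of embeddings.

    Returns indices of database entries whose cosine similarity to the
    probe meets the threshold -- these are the 'possible matches' that a
    human operator would then verify by other means.
    """
    probe = probe / np.linalg.norm(probe)
    db = database / np.linalg.norm(database, axis=1, keepdims=True)
    similarities = db @ probe  # cosine similarity per watchlist entry
    return [i for i, s in enumerate(similarities) if s >= threshold]

# Toy example: three stored "faces" (random vectors standing in for
# embeddings produced by a face-recognition model).
rng = np.random.default_rng(0)
watchlist = rng.normal(size=(3, 128))
# A probe that is a slightly noisy copy of entry 1 should match it.
probe = watchlist[1] + rng.normal(scale=0.05, size=128)
print(flag_matches(probe, watchlist))  # -> [1]
```

The threshold choice is exactly where the accuracy debate lies: set it too low and the system floods operators with false positives of the kind reported in the trials; set it too high and genuine matches are missed.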
This has been a point of contention, particularly with the Metropolitan Police, whose trial at the Notting Hill Carnival resulted in 98% inaccurate matches, and South Wales Police, which stored 2,400 images of misidentified people for a whole year. The latter is currently the subject of an ongoing legal challenge brought by the civil rights group Liberty.
The Home Office, which awarded £2.6 million in funding for the South Wales Police trials, has made its stance on LFR quite clear. Just over a week ago, the Home Secretary Sajid Javid backed police use of the technology despite the failings, but acknowledged the need for fixed regulations. His Office has suggested there is public support for LFR to identify potential terrorists and people wanted for serious offences. However, earlier this month, the ICO waded into the argument by pointing out that LFR falls under data protection laws.
The data regulator advised that police forces using the technology must carry out a full data protection impact assessment (DPIA), which must then be updated for each deployment. Forces must submit these assessments to the ICO for consideration before the two parties discuss how the privacy risks can be mitigated.
Although facial recognition has created a clear divide in the UK, with privacy campaigners, MPs, the ICO and the Home Office clashing over the subject, the debate remains open. In America, however, the technology is facing outright rejection. In May, the city of San Francisco banned its government agencies from using it and, most recently, Orlando law enforcement dropped Amazon's Rekognition platform because it did not have the infrastructure to support the system.