Amazon halts police use of its facial recognition tech
Company hopes that one-year moratorium will be enough for Congress to introduce additional legislation
Amazon is to pause police use of its AWS facial recognition software, Rekognition, following two weeks of intense protests against police brutality sparked by the murder of George Floyd on 25 May.
The tech giant announced that it would be implementing a one-year moratorium on the technology, with an exception for organisations such as the International Center for Missing and Exploited Children, which use Rekognition to help rescue victims of human trafficking.
The use of facial recognition software by police forces has long been marred by allegations of racial bias, which many human rights organisations argue has a disproportionately adverse effect on non-white people. Amazon Rekognition, in particular, has been criticised for struggling to identify the gender of individuals with darker skin.
In a statement on its blog, Amazon said that “governments should put in place stronger regulations to govern the ethical use of facial recognition technology. We hope this one-year moratorium might give Congress enough time to implement appropriate rules, and we stand ready to help if requested.”
It's currently unclear how many police forces use Amazon's facial recognition technology, although Rekognition marketing does make it clear that it works well with body-worn cameras.
Microsoft has yet to ban police use of its facial recognition technology. However, speaking to IT Pro earlier in the year, the company revealed that it had rejected applications from US police forces in the past.
"A police force in the United States came to us and they use body cams, which is considered to be a very good way to create accountability and transparency," explained John Frank, vice president of EU Government Affairs at Microsoft, at a press event in March. "We came to the conclusion that the cameras, and the variable situations in which they’d be used, would not produce good enough images to be able to have an accurate system. And we know that in the United States there's over-policing of racial minorities and that racial minorities would be most at risk of having a false positive from the system."
"I do think that police forces want to do the right thing," he added. "But they need to understand [the technology]. This goes back to the disclosures that the companies make about what it's designed for and what it's capable of doing."
IBM decided to scrap its general-purpose facial recognition and analysis software suite earlier this week, citing similar ethical concerns. The U-turn came after years of effort developing the AI-powered tools, and was driven by fears that the technology could be used for purposes that go against IBM’s principles of trust and transparency.
“We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies,” CEO Arvind Krishna outlined in a letter to the US Congress.
Nicole Ozer, technology and civil liberties director of the American Civil Liberties Union of Northern California, welcomed Amazon’s decision.
“Face recognition technology gives governments the unprecedented power to spy on us. We urge Microsoft and other companies to join IBM, Google, and Amazon in moving towards the right side of history,” Ozer said in a statement.