Rank One Computing to address misuse of its facial recognition technology

Rank One Computing responds to the wrongful arrest of Michigan resident Robert Williams

Rank One Computing, a provider of facial recognition services, says it will take action to address the misuse of its technology by law enforcement officials.

The company is doing so after its software was used in the first known wrongful arrest based on facial recognition technology in the US.

In January 2020, Michigan resident Robert Williams was arrested for shoplifting in downtown Detroit. Williams, who is black, was arrested after law enforcement connected him to the crime through a facial-recognition search that matched surveillance footage of the theft to his driver’s license photo.

During the interrogation, an officer pointed to the surveillance image used to arrest Williams and asked whether the man in the photo was him. Williams said it was not.

Though an officer admitted, “The computer must have gotten it wrong,” Williams was held for several more hours before being released later that evening. According to Wayne County prosecutor Kym Worthy, police also lacked the corroborating evidence needed to arrest Williams for the crime.

Worthy added in a statement, “This case should not have been issued based on the (police) investigation, and for that we apologize. This does not in any way make up for the hours that Mr. Williams spent in jail.”

Rank One Computing chief executive Brendan Klare shared in an email to Reuters that as a result of the wrongful arrest, the company “will add a legal means to revoke any use of our software that violates our Code of Ethics, and conduct a technical review of additional safeguards we can incorporate into our software to prevent any potential for misuse.”

The ACLU of Michigan has since announced it will lodge a complaint against Detroit police on Williams’ behalf. Per the ACLU: “Study after study has confirmed that face recognition technology is flawed and biased, with significantly higher error rates when used against people of color and women.”

Law enforcement’s use of facial recognition technology has come under fire recently. Microsoft and Amazon both announced they would no longer sell facial-recognition technology to police, decisions that followed nationwide protests demanding an end to law enforcement tactics that unfairly target black people in the US.
