MIT research finds ethnic and gender bias in Amazon Rekognition

However, AWS said MIT's testing was "ill-advised" and its software not used in the way it was intended


The Massachusetts Institute of Technology (MIT) has found that Amazon's Rekognition facial recognition platform may not identify race or gender accurately or fairly. According to the university's report, tests conducted on the technology showed Rekognition was less effective at identifying some races and genders than others.

For example, it mistakenly identified some pictures of women as men, and the problem was most pronounced with pictures of darker-skinned women, where it reached the wrong conclusion 31% of the time, compared with an error rate of 1.5% for Microsoft's rival software.

However, AWS said its software isn't true "facial recognition" but "facial analysis". It is designed to identify facial expressions rather than ethnicity or gender, the company said, which is why it performs less accurately than its competitors on such tests.

"[F]acial analysis [is] usually used to help search a catalog of photographs," Dr. Matt Wood, general manager of deep learning and AI at AWS said in a statement to VentureBeat. "[F]acial recognition is a distinct and different feature from facial analysis and attempts to match faces that appear similar. This is the same approach used to unlock some phones, or authenticate somebody entering a building, or by law enforcement to narrow the field when attempting to identify a person of interest."

He added that it is "ill-advised" to use the software in the way MIT did because it was not designed to identify criminals. In tests of the latest version of its software, AWS said it used photos from parliamentary websites and found no false positive matches at the 99% confidence threshold.

However, it seems the company needs to communicate its software's purpose to its shareholders better, as some of them have asked Amazon to stop selling its facial recognition service because they believe it violates civil rights.

