MIT research finds ethnic and gender bias in Amazon Rekognition

However, AWS said MIT's testing was "ill-advised" and that its software was not used in the way it was intended


The Massachusetts Institute of Technology (MIT) has found that Amazon's Rekognition facial recognition platform may not identify race or gender accurately or fairly. A report by the research university said its tests found that Rekognition was less effective at identifying some races and genders than others.

For example, it mistakenly identified some pictures of women as men, and this was more prevalent when it was presented with pictures of darker-skinned women: it made this error 31% of the time, compared with an error rate of 1.5% for Microsoft's rival software.

However, AWS said that the feature MIT tested isn't true "facial recognition" but "facial analysis", which is designed to identify attributes such as facial expressions rather than ethnicity or gender, and that this explains why it fared worse than its competitors.

"[F]acial analysis [is] usually used to help search a catalog of photographs," Dr. Matt Wood, general manager of deep learning and AI at AWS said in a statement to VentureBeat. "[F]acial recognition is a distinct and different feature from facial analysis and attempts to match faces that appear similar. This is the same approach used to unlock some phones, or authenticate somebody entering a building, or by law enforcement to narrow the field when attempting to identify a person of interest."

He added that it is "ill-advised" to use facial analysis in the way that MIT did, because it has not been designed for identity matching. AWS also said that in tests of the latest version of its software, using data from parliamentary websites, it found no false positive matches at the 99% confidence threshold.
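The confidence threshold AWS mentions can be illustrated with a short sketch: a matching service scores each candidate face by similarity, and only candidates at or above the threshold count as matches, so a stricter threshold suppresses false positives at the cost of missing weaker true matches. The function and the scores below are hypothetical illustrations, not the actual Rekognition API or its output.

```python
# Hypothetical sketch of threshold-based match filtering; the data and
# function are illustrative, not real Rekognition API calls or results.

def filter_matches(candidates, threshold=99.0):
    """Keep only candidates whose similarity score meets the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

candidates = [
    {"face_id": "a", "similarity": 99.4},
    {"face_id": "b", "similarity": 97.8},  # plausible, but below a 99% bar
    {"face_id": "c", "similarity": 85.1},
]

# At a 99% threshold, only the strongest candidate counts as a match.
print(filter_matches(candidates))
```

Lowering the threshold admits more candidates, which is why a high setting like 99% is recommended when a false positive is costly, such as in law enforcement use.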

However, it seems the company needs to better communicate its software's purpose to its shareholders, some of whom have asked it to stop selling its facial recognition service over concerns that it violates civil rights.

