MIT research finds ethnic and gender bias in Amazon Rekognition

However, AWS said MIT's testing was "ill-advised" and its software not used in the way it was intended


The Massachusetts Institute of Technology (MIT) has found that Amazon's Rekognition facial recognition platform may not identify race or gender accurately or fairly. According to the researchers, tests conducted on the technology showed Rekognition to be less effective at identifying some races and genders than others.

For example, it mistakenly identified some pictures of women as men, and the error was most prevalent with pictures of darker-skinned women: Rekognition misclassified these 31% of the time, compared with an error rate of 1.5% for Microsoft's rival software.

However, AWS said that its software isn't true "facial recognition" software but "facial analysis": it is designed to identify facial attributes and expressions rather than ethnicity or gender, which, the company argued, is why it fared worse than its competitors in the tests.

"[F]acial analysis [is] usually used to help search a catalog of photographs," Dr. Matt Wood, general manager of deep learning and AI at AWS said in a statement to VentureBeat. "[F]acial recognition is a distinct and different feature from facial analysis and attempts to match faces that appear similar. This is the same approach used to unlock some phones, or authenticate somebody entering a building, or by law enforcement to narrow the field when attempting to identify a person of interest."

He added that it is "ill-advised" to use the software in the way that MIT did, because it has not been designed to identify criminals. AWS also said that in tests of the latest version of the software, using photographs drawn from parliamentary websites, it recorded no false positive matches at the 99% confidence threshold.
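The "confidence threshold" Dr. Wood refers to is a cut-off a client applies to the similarity scores a matching service returns, discarding any candidate match below it. A minimal sketch of that filtering step is below; the function name, field names, and data are invented for illustration and only loosely mimic the shape of a Rekognition face-matching response:

```python
# Hypothetical sketch of applying a 99% confidence threshold to
# face-match results. The response structure and values below are
# invented; they only approximate Rekognition's output shape.

def filter_matches(face_matches, threshold=99.0):
    """Keep only candidate matches at or above the confidence threshold."""
    return [m for m in face_matches if m["Similarity"] >= threshold]

# Invented example data: two candidate matches with similarity scores.
response = {
    "FaceMatches": [
        {"Similarity": 99.4, "Face": {"FaceId": "a"}},
        {"Similarity": 87.2, "Face": {"FaceId": "b"}},
    ]
}

accepted = filter_matches(response["FaceMatches"])
print([m["Face"]["FaceId"] for m in accepted])  # ['a']
```

Raising the threshold trades recall for precision: at 99%, borderline matches like the 87.2% candidate above are rejected rather than reported, which is why AWS recommends a high threshold for law-enforcement use cases.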

However, it seems the company needs to communicate its software's purpose more clearly to its shareholders, some of whom have asked Amazon to stop selling the facial recognition service on the grounds that it violates civil rights.

