MIT research finds ethnic and gender bias in Amazon Rekognition

However, AWS said MIT's testing was "ill-advised" and that its software was not used as intended

Face being scanned by facial recognition tech

Researchers at the Massachusetts Institute of Technology (MIT) have found that Amazon's Rekognition facial recognition platform may not identify race or gender accurately or fairly. A report from the institute said tests conducted on the technology found that Rekognition was less effective at identifying some races and genders than others.

For example, it mistakenly identified some pictures of women as men, and this was more prevalent with pictures of darker-skinned women: it made this mistake 31% of the time, compared with an error rate of 1.5% for Microsoft's rival software.


However, AWS said that the feature MIT tested isn't true "facial recognition" but "facial analysis": it is designed to detect facial attributes such as expressions, not to establish ethnicity or gender, which is why it fared worse than its competitors in the tests.

"[F]acial analysis [is] usually used to help search a catalog of photographs," Dr. Matt Wood, general manager of deep learning and AI at AWS said in a statement to VentureBeat. "[F]acial recognition is a distinct and different feature from facial analysis and attempts to match faces that appear similar. This is the same approach used to unlock some phones, or authenticate somebody entering a building, or by law enforcement to narrow the field when attempting to identify a person of interest."


He added that it is "ill-advised" to use the software in the way that MIT did because it has not been designed to identify criminals. However, AWS said that in its own tests of the latest version of the software, using facial images from parliamentary websites, it recorded no false positive matches at the 99% confidence threshold.
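The 99% confidence threshold AWS mentions means a candidate match is only reported when the model's similarity score meets or exceeds that figure, which trades recall for fewer false positives. As a conceptual sketch in Python (the scores, names, and helper function here are hypothetical, not AWS's actual implementation):

```python
def filter_matches(candidates, threshold=99.0):
    """Keep only candidate matches whose similarity score (a percentage)
    meets or exceeds the confidence threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# Hypothetical similarity scores for three candidate faces
candidates = [
    {"name": "person_a", "similarity": 99.4},
    {"name": "person_b", "similarity": 97.8},
    {"name": "person_c", "similarity": 85.2},
]

# At the 99% threshold, only the strongest candidate is reported as a match
print(filter_matches(candidates))
```

Raising the threshold from a common default (Rekognition's documentation uses 80% as an example) to 99% discards borderline matches like the 97.8% score above, which is how a stricter setting reduces false positives.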

However, it seems the company needs to better communicate its software's purpose to its shareholders, some of whom have asked it to stop selling the facial recognition service on the grounds that it violates civil rights.
