AI recruitment tool pulled by Amazon for sex bias
Recruitment software was not favourable towards women
Amazon has been forced to scrap an artificial intelligence system used by the company for recruitment after it was found the technology was biased against female applicants.
The AI tech was first developed at Amazon in 2014 as a way of quickly filtering out most candidates and surfacing the top five people for a role. However, by 2015 the company realised that the system was not rating applicants in a gender-neutral way.
The problem lay in how the system was trained. It was fed CVs submitted to the company over a 10-year period so it could detect patterns in past recruiting. Most of those applications came from men. According to a report from Reuters, the system taught itself that male candidates were preferable to women. It downgraded CVs if it found words such as "women's" and penalised graduates of all-female colleges.
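The mechanism at work here is straightforward: if historical hiring data is skewed towards one group, any model scoring terms by their correlation with past hiring outcomes will absorb that skew. The toy sketch below illustrates the idea with entirely made-up CV snippets and labels (it is not Amazon's system or data): a term such as "women's" that appears only on rejected CVs ends up with a negative weight.

```python
from collections import Counter

# Hypothetical toy data: CV snippets labelled by past hiring outcome
# (1 = hired, 0 = rejected). The hired pool is dominated by one group,
# so tokens correlated with the other group correlate with rejection.
historical_cvs = [
    ("executed project captured metrics", 1),
    ("executed deployment captured requirements", 1),
    ("led project executed rollout", 1),
    ("captain women's chess club led project", 0),
    ("women's coding society captured metrics", 0),
]

def token_weights(data):
    """Score each token: hire rate among CVs containing it, minus the base rate."""
    base = sum(label for _, label in data) / len(data)
    counts, hires = Counter(), Counter()
    for text, label in data:
        for tok in set(text.split()):
            counts[tok] += 1
            hires[tok] += label
    return {tok: hires[tok] / counts[tok] - base for tok in counts}

weights = token_weights(historical_cvs)
# "women's" appears only on rejected CVs, so its weight is negative;
# "executed" appears only on hired CVs, so its weight is positive.
```

The point of the sketch is that no token is labelled by gender anywhere in the data; the bias emerges purely from the imbalance in the historical outcomes, which is why removing individual terms (as Amazon reportedly tried) does not remove the underlying pattern.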
While Amazon recoded the software to make the AI neutral to these particular terms, this did not guarantee that the technology would not find other ways of discriminating against women, the report said.
The team, based in Amazon's Edinburgh engineering hub, created 500 models focused on specific job functions and locations. The system was also taught to recognise around 50,000 terms that appeared on past candidates' CVs.
The technology learned to assign little importance to skills common across IT applicants, favouring terms more commonly found on male engineers' CVs, such as "executed" and "captured," the report said.
The model used in the AI system had other problems that led to unqualified candidates being recommended for a variety of unsuitable jobs.
This eventually led to Amazon pulling the plug on the team as executives "lost hope" in the project, anonymous sources told Reuters. The tool could not be relied upon on its own to sort candidates.
The firm now uses a "much-watered-down version" of the recruiting engine to carry out "rudimentary chores".