Amazon Reportedly Killed an AI Recruitment System Because It Couldn’t Stop the Tool from Discriminating Against Women

Machine learning, one of the core techniques in the field of artificial intelligence, involves teaching automated systems to devise new ways of doing things by feeding them reams of data about the subject at hand. One of the big fears is that biases in that data will simply be reinforced in the resulting AI systems, and Amazon seems to have just provided an excellent example of that phenomenon.

According to a new Reuters report, Amazon spent years working on a system for automating the recruitment process. The idea was for this AI-powered system to be able to look at a collection of resumes and name the top candidates. To achieve this, Amazon fed the system a decade’s worth of resumes from people applying for jobs at Amazon.

The tech industry is famously male-dominated and, accordingly, most of those resumes came from men. So, trained on that selection of information, the recruitment system began to favor men over women.

According to Reuters’ sources, Amazon’s system taught itself to downgrade resumes with the word “women’s” in them, and to assign lower scores to graduates of two women-only colleges. Meanwhile, it decided that words such as “executed” and “captured,” which are apparently deployed more often in the resumes of male engineers, suggested the candidate should be ranked more highly.
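The mechanism behind this is straightforward to illustrate. Below is a minimal, purely hypothetical sketch (not Amazon's actual system) of how a word-scoring model trained on historically skewed hiring outcomes ends up penalizing the token "women's" and rewarding "executed" — the data, labels, and scoring function are all invented for illustration:

```python
from collections import Counter
import math

# Hypothetical toy data: past resumes labeled by historical hiring outcome.
# Because past hires skewed male, "women's" happens to appear mostly in
# rejected resumes, while verbs like "executed" appear in accepted ones.
hired = [
    "executed backend migration",
    "captured requirements executed rollout",
    "executed performance tuning",
]
rejected = [
    "women's chess club captain",
    "women's coding society lead",
    "organized community outreach",
]

def token_scores(pos_docs, neg_docs, alpha=1.0):
    """Log-odds of each token appearing in hired vs. rejected resumes,
    with add-alpha smoothing. Positive = pushes a resume's rank up."""
    pos = Counter(t for d in pos_docs for t in d.split())
    neg = Counter(t for d in neg_docs for t in d.split())
    vocab = set(pos) | set(neg)
    n_pos, n_neg = sum(pos.values()), sum(neg.values())
    return {
        t: math.log((pos[t] + alpha) / (n_pos + alpha * len(vocab)))
           - math.log((neg[t] + alpha) / (n_neg + alpha * len(vocab)))
        for t in vocab
    }

scores = token_scores(hired, rejected)
# The model has "learned" the historical skew, not merit:
print(scores["women's"] < 0)   # True: token is penalized
print(scores["executed"] > 0)  # True: token is rewarded
```

Nothing in the code mentions gender; the bias enters entirely through the labels, which is why such systems are hard to fix by simply deleting obvious words.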

Source: Fortune
