
What We Should Take Away From Amazon’s Embedding Bias in Its Algorithm and Stopping the Project

Oct 11, 2018
This article is part of a series called Opinion.

A while ago I wrote an article warning recruiters about embedding their bias in their selection algorithms. The problem, as I described, is that we all have a bias, usually an unconscious one. Just look around your department right now. Is it a good representation of your community in terms of gender, age, ethnicity, and sexual preference? Probably not. And that’s probably not because your recruiters are racist. It’s because we all carry unconscious bias, and in many cases it is woven into our greater society.

When a boy fails his math test, he is told he needs to try harder. When a girl gets the same grade, she hears, “It’s hard, isn’t it?”

When a man takes the lead, he’s a leader; when a woman does, she’s bossy. This has its effect on the number of women in IT and the number of female leaders.

I’m not saying everything needs to be in perfect balance. I am saying we shouldn’t widen the gap further by embedding these biases in our recruitment process!

Amazon’s Algorithm Was Sexist

So if we train our algorithms on human behavior, they will have our human flaws. Amazon found this out almost three years ago. It tried to fix the problem; that didn’t work, so it abandoned the project altogether, Reuters reports (see discussion on the ERE Facebook group).

In short: Amazon tried to get an algorithm to rate people on their resumes. The basic idea was: put in 100 CVs and get back the five best. Rate candidates with stars, the way books are rated on Amazon. And use the company’s own historical selection data as the basis for those ratings.

The algorithm saw that far more men than women had been selected at Amazon, so it rated features associated with women lower. Amazon could have seen this one coming, to be honest. And like I said before, I don’t think the people at Amazon are sexist. There are many more male developers in the world, so it’s logical that Amazon has more male employees, especially in technology. But the algorithm rated “chair of women’s chess club” lower than “chair of chess club,” for example.
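To make the mechanism concrete, here is a minimal sketch, emphatically not Amazon’s actual system, of how a resume scorer trained on historical hiring decisions inherits the bias baked into those decisions. It uses scikit-learn, and all resumes and outcomes are fabricated toy data:

```python
# A toy resume scorer trained on past hiring decisions (fabricated data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# "Historical" resumes and whether the candidate was hired. Because past
# hiring skewed male, the token "women" appears only on rejected resumes.
resumes = [
    "chair of chess club, java developer",          # hired
    "captain of robotics team, java developer",     # hired
    "chair of women's chess club, java developer",  # rejected
    "women's coding society, java developer",       # rejected
]
hired = [1, 1, 0, 0]

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# The model turns a historical imbalance into an explicit penalty
# on a gendered term.
idx = vec.vocabulary_["women"]
print("weight for 'women':", model.coef_[0][idx])  # negative
```

Nothing in this sketch is malicious; the model simply learned that “women” correlated with rejection in its training data, which is exactly the trap Amazon fell into.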

That specific case was easily fixed, but the problem ran much deeper. Women use different language than men do, so the way they described their jobs on their resumes differed as well. It proved impossible to fix the bias by adjusting the algorithm.
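Continuing the sketch above, this is why simply deleting gendered words doesn’t work: the bias migrates to proxy words. Reuters reported that Amazon’s models favored verbs like “executed” and “captured,” which appeared more often on male engineers’ resumes; the toy data below is fabricated to mimic that pattern:

```python
# Even with gendered tokens removed from the vocabulary, a model can latch
# onto correlated proxy words (fabricated data mimicking verb-style bias).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "executed migration to java, captured requirements",  # hired
    "executed redesign, captured key metrics",            # hired
    "led migration to java, gathered requirements",       # rejected
    "led redesign, gathered key metrics",                 # rejected
]
hired = [1, 1, 0, 0]

# "Debias" by excluding gendered words from the vocabulary entirely.
vec = CountVectorizer(stop_words=["women", "womens"])
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

for word in ("executed", "led"):
    idx = vec.vocabulary_[word]
    print(word, model.coef_[0][idx])  # "executed" positive, "led" negative
```

You can keep deleting words, but as long as the training labels encode a biased outcome, the model will find another route to it.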

Amazon has now decided to scrap the project, and as far as I’m concerned it should be praised for this … both for trying and for scrapping. We can all learn from this.

Can We Select Based on Resumes?

I think the first question we need to ask is whether a resume is actually a good way to select the right candidate.

No.

A CV tells me what you have done, for whom, and for how long. It leaves out the two best predictors of future success: how well did you do, and under what circumstances did you do well?

No Problem with AI

There is no problem with selecting people with the help of AI. You just need the right data to select people on. First of all, you cannot use data that has proven racist or sexist in the past, like resumes. Only if your current population of employees shows no hiring bias on race, gender, sexual orientation, or age (age especially; it’s important to know the ages of recent hires) can you train your AI on your current behavior.
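Before trusting that data, test it. Here is a minimal sketch of an adverse-impact audit on historical selection rates; the groups and counts are fabricated, and the four-fifths rule applied here is a common US guideline, not the only valid test:

```python
# Audit historical selection rates for adverse impact (fabricated counts).
from collections import Counter

# (group, was_hired) pairs, e.g. drawn from last year's applicant log.
applicants = [("men", True)] * 80 + [("men", False)] * 120 \
           + [("women", True)] * 20 + [("women", False)] * 80

hired = Counter(group for group, was_hired in applicants if was_hired)
total = Counter(group for group, _ in applicants)

rates = {group: hired[group] / total[group] for group in total}
best = max(rates.values())

for group, rate in rates.items():
    ratio = rate / best
    flag = "OK" if ratio >= 0.8 else "ADVERSE IMPACT"  # four-fifths rule
    print(f"{group}: selection rate {rate:.0%}, ratio {ratio:.2f} -> {flag}")
```

If any group fails this check, that history is not safe training data, no matter how clever the model.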

We can, however, train our AI on new data that we haven’t used before. Think assessments. Think aptitude tests.

Conclusion

We should all thank Amazon for helping us realize our flaws. If even the genius developers at Amazon cannot build an AI that selects based on CVs without being sexist, we can be pretty sure none of us can. We should look at better ways of hiring.
