How to Reduce AI Bias in Hiring

How to reduce AI bias in hiring. If this title causes you some discomfort, that’s probably a good thing. We tend to view our computer tools and algorithms as free of the destructive biases we have as humans. In reality, our algorithms often mirror the unconscious biases we already have—case in point, AI in human resources.

[Related Article: Apple Pay Card’s Credit Determining AI: Gender Biased?]

Implementing AI in the hiring process was supposed to eliminate the unconscious biases we have against others that prevent good people from getting jobs. Instead, AI doubled down on those same unconscious biases. Here are three things you must consider when implementing AI in your hiring strategy in order to reduce AI bias.

Examine Data Carefully

Building a machine learning model is the first step in bringing AI into your hiring process. We tend to see data as an objective set of information, but machines take the same shortcuts humans do when learning. If the training data isn't diverse, the results can be disastrous.

For example, Amazon discovered that its much-touted hiring software quickly learned that male applicants tended to have more experience than female applicants in the tech sector. Because of that pattern, it began filtering out women's resumes pre-emptively as a shortcut, nixing resumes that mentioned all-women's colleges, for example.

This isn't the data's fault; it's up to your human team to ensure that the training data doesn't accidentally teach machines something else. Diversity for diversity's sake isn't good enough either: both your positive and negative training examples must be diverse, and you need frequent checks to ensure you aren't replicating our unconscious biases in algorithm form.

Do this: Stress the need for better data and build continual checkpoints into your hiring pipeline.
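One way to make those checkpoints concrete is to audit historical screening outcomes by group before (and after) training. The sketch below is a minimal, hypothetical example: it assumes you can label each past applicant with a demographic group and whether they were advanced, and it applies the "four-fifths rule" heuristic used in US employment-selection guidance, which flags any group whose selection rate falls below 80% of the best-performing group's rate.

```python
# Minimal sketch of a bias checkpoint for hiring data.
# Assumptions: `examples` is a list of (group, advanced) pairs you have
# assembled from historical screening decisions; group labels and the
# 0.8 threshold are illustrative, not a compliance standard.

from collections import defaultdict

def selection_rates(examples):
    """Compute the fraction of applicants advanced, per group."""
    advanced = defaultdict(int)
    total = defaultdict(int)
    for group, was_advanced in examples:
        total[group] += 1
        if was_advanced:
            advanced[group] += 1
    return {g: advanced[g] / total[g] for g in total}

def four_fifths_check(examples, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times
    the highest group's rate -- a red flag worth human review."""
    rates = selection_rates(examples)
    best = max(rates.values())
    return [g for g, rate in rates.items() if rate < threshold * best]

# Hypothetical historical outcomes: group A advanced 2 of 3,
# group B advanced 1 of 4.
history = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False), ("B", False)]

print(four_fifths_check(history))  # → ['B']
```

A check like this is cheap to run at every retraining, which is exactly what "continual checkpoints" means in practice: the flag doesn't prove bias, but it tells your human team where to look.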

Work with Companies that Explicitly Favor Diversity

Since many of you won't be building an in-house solution (even if that's the dream), keep one eye on a prospective vendor's track record and the other on diversity. Data alone isn't enough to block hiring bias, so you'll need more than a "the data speaks for itself" approach.

Quite a few companies are taking these horror stories seriously, hoping not to become the next embarrassing headline. Ask prospective vendors what they're doing explicitly to target bias, and pay close attention. If a company doesn't have a concrete answer, it may be a sign to move on to the next.

You might consider a small but growing number of consulting companies that work to ensure your AI-integrated hiring solutions are free of bias. Plus, adding diversity to your hiring team for fresh perspectives helps ensure that your own blind spots are covered.

Do this: Favor explicit diversity solutions with companies you work with and within your own company.

Consider an Augmented Perspective

We've seen, over and over, the concept of "augmented" intelligence rather than "artificial" intelligence. Your tools should make your HR team better, not replace it. Hiring algorithms can be set to ignore explicit demographic information, for example, but your human team should notice if the flagged resumes all look alike and find out why.
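Ignoring explicit demographic fields is the easy half of that advice; the harder half is noticing proxies (like an all-women's college name) that still encode them. The sketch below, with hypothetical field names, drops explicit demographic fields and then flags any remaining feature value that appears almost exclusively with one demographic group in historical data, so a human reviewer can investigate.

```python
# Minimal sketch: strip explicit demographics, then surface likely proxies.
# Assumptions: resumes are flat dicts of categorical features; field names
# ("gender", "college", etc.) and the 0.9 purity cutoff are illustrative.

from collections import defaultdict

DEMOGRAPHIC_FIELDS = {"gender", "age", "ethnicity"}

def strip_demographics(resume):
    """Remove explicit demographic fields before the model sees them."""
    return {k: v for k, v in resume.items() if k not in DEMOGRAPHIC_FIELDS}

def proxy_candidates(resumes, demographic="gender", purity=0.9):
    """Flag (field, value) pairs seen almost exclusively with one
    demographic group -- candidates for human review, not auto-removal."""
    counts = defaultdict(lambda: defaultdict(int))
    for resume in resumes:
        group = resume.get(demographic)
        for field, value in strip_demographics(resume).items():
            counts[(field, value)][group] += 1
    flags = []
    for (field, value), by_group in counts.items():
        total = sum(by_group.values())
        if total >= 2 and max(by_group.values()) / total >= purity:
            flags.append((field, value))
    return flags
```

The point of returning candidates rather than silently deleting features is exactly the augmented approach: the tool narrows the search, and the humans decide what a correlation actually means.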

In Amazon's case, it was the human team who noticed that the recommended resumes were overwhelmingly male and zeroed in on why. Instead of assuming the data was objective and going with it, Amazon scrapped the whole project because it just wasn't ready yet.

Even companies working in the AI-driven hiring space don't recommend hiring based solely on the algorithm. Human recruiters must keep their skills sharp, deciding to give certain out-of-the-box candidates a chance and continuing to refine the still-tricky process of finding the perfect match. AI-driven hiring initiatives can help recruiters and hiring teams work more strategically overall, increasing a company's chances of finding the right candidate, not just the typical one.

Do this: Allow your tools to make your hiring process better and your humans to make your tools better.

AI with a Human Perspective

Our tools are not quick fixes for hiring problems. AI can help solve your hiring team's myopic approach to resumes, but it takes higher-order thinking to spot a problem in the data your machines learn from.

The augmented approach could help place AI in perspective, ensuring not only that your team’s implicit bias doesn’t derail your hiring process but that your tools don’t come back to haunt you. The human element is a vital part of handling AI properly for now, and just like all other AI applications, it’s not meant to replace human problem-solving.

Consider this basic four-step approach to implementing AI in your hiring process:

  1. Watch your data like a hawk
  2. Contract and partner with diversity minded organizations
  3. Let the tools make you better, and—here’s the key—vice versa.
  4. Find the right candidates by expanding your pool strategically.

[Related Article: 9 Common Mistakes That Lead To Data Bias]

Happy hiring!

Elizabeth Wallace, ODSC

Elizabeth is a Nashville-based freelance writer with a soft spot for startups. She spent 13 years teaching language in higher ed and now helps startups and other organizations explain - clearly - what it is they do. Connect with her on LinkedIn here: https://www.linkedin.com/in/elizabethawallace/