Using tech to mitigate implicit bias in hiring
February 24, 2021
Viktor Nordmark
Everyone is biased. It's a fact of life, and it has implications in every area of our lives. Bias can be both conscious and unconscious, but most biases are unintended and stem from deep-seated social beliefs that are difficult to overcome.
As a recruiter, and a fellow human in charge of people's futures, you have to stay remarkably open-minded to avoid slipping into unhelpful beliefs of your own. That's no easy feat, especially since we're unaware of our own biases most of the time. Luckily, for everyone's sake, humans are pretty good at building advanced tech that can help save us from ourselves, including tech that reduces the influence of bias on hiring decisions. Here's how technology can support you in reducing your own biases during recruitment.
What are biases?

A cognitive bias is an inclination or prejudice for or against a person, group, or concept, especially in a way considered unfair. It's a flaw in cognition that leads a person to base their reasoning on personal beliefs rather than on facts, logic, or assessments. Biases have started to surface in many areas of society, so to get a grip on what they are, let's look at a few that are closely tied to recruiting:


Gender bias

A set of implicit biases that discriminate on the basis of gender. One example is the assumption that women are less suited for jobs that require high intellectual ability.


Law of the instrument

An over-reliance on familiar tools or methods that makes you less inclined to try new approaches, even when they prove better than what you're used to. It often surfaces when deciding whether to trust a computer with hiring decisions.


Confirmation bias

The tendency to search for and focus only on information that supports your existing view. For example, if you're hiring for a company and already have a preconception of what kind of person they need, you have, without knowing it, disregarded several suitable candidates.

We are slowly starting to understand the immense negative impact biases have on our quest for a just society, and that's based only on the biases we're conscious of. The biases that have not yet risen to the surface are called implicit biases, and they harm people on a daily basis. Those are the ones each of us is responsible for uncovering within ourselves, so that we don't accidentally create unjust situations.

How do your own implicit biases negatively affect your candidates?

There are many biases at work during the pre-employment phase. Even though you may feel you're looking solely at a person's competence, experience, and skills, we can assure you that your mind is making up all sorts of reasons why you should hire one person over another. Let's dive beneath the surface of implicit biases to see what they are, and then look at the kinds of technological solutions you can use to minimize their impact.


The Contrast Effect

The main goal of screening CVs is to compare them to the job description and find the candidate who matches it best. But as you work through them one after the other, you'll involuntarily begin to compare the resumes to each other. You're no longer looking for the candidate best suited to the job; you're looking for the best candidate relative to all the others. This is called the Contrast Effect, and it unfortunately spills over into other biases. Once you've found your "perfect candidate," what did you base that on? Perhaps the similarity bias, i.e., you picked the person most similar to yourself. Or maybe it was the affinity bias, the one that whispers that you should choose the person who grew up in the same neighborhood as you.

Countered by the AI Screening Effect

One way around this is to use an AI-powered screening tool that screens applications for you, focusing solely on each candidate's competence, experience, and skills. The difference is that you would involuntarily take the candidate's age, ethnicity, and gender into account and judge them by that, whereas the AI considers only job-relevant information. No one has yet managed to build a completely unbiased AI, but it will most likely be less biased than you are, and therefore better at judging people solely on their competence, experience, and skills rather than on their looks, age, gender, or ethnicity.
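To make the idea concrete, here is a minimal sketch in Python of what a demographic-blind screening step could look like: identifying fields are stripped before anything reaches the scoring function, so the ranking can only depend on job-relevant data. The field names, weights, and scoring logic are illustrative assumptions, not Hubert's actual implementation.

```python
# Minimal sketch of demographic-blind screening (illustrative only).
# All field names, weights, and thresholds are hypothetical.

DEMOGRAPHIC_FIELDS = {"name", "age", "gender", "photo", "ethnicity"}
REQUIRED_SKILLS = {"python", "sql", "data analysis"}

def anonymize(application: dict) -> dict:
    """Drop fields that could reveal who the candidate is."""
    return {k: v for k, v in application.items() if k not in DEMOGRAPHIC_FIELDS}

def score(application: dict) -> float:
    """Score on skills and experience only."""
    skills = {s.lower() for s in application.get("skills", [])}
    skill_match = len(skills & REQUIRED_SKILLS) / len(REQUIRED_SKILLS)
    experience = min(application.get("years_experience", 0) / 5, 1.0)
    return 0.7 * skill_match + 0.3 * experience

applications = [
    {"id": 1, "name": "A. Andersson", "age": 52,
     "skills": ["Python", "SQL"], "years_experience": 8},
    {"id": 2, "name": "B. Berg", "age": 24,
     "skills": ["Python", "Data analysis"], "years_experience": 2},
]

# Rank anonymized applications; demographic data never reaches score().
ranked = sorted((anonymize(a) for a in applications), key=score, reverse=True)
for app in ranked:
    print(app["id"], round(score(app), 2))
```

The point of this structure is that the blindness is enforced by the pipeline itself rather than by the recruiter's self-discipline: whatever model replaces the toy score() function simply never sees the demographic fields.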

The Attractiveness Bias

This is the tendency to believe that beautiful people are more intelligent, and thereby more successful, than less attractive people. We're all wired to appreciate good looks, and even though beauty preferences vary, some people have faces that are considered classically beautiful. The sheer fact that you find someone beautiful, not necessarily that you're attracted to them, is enough to make you vouch for that person a little extra. So, how do you shield yourself from beauty?

Countered by Intelligent Chatbots

You could ask applicants in the job ad not to attach a photo of themselves, but you could also place an intermediary between you and the candidate, such as an intelligent, conversational chatbot. Candidates then apply directly to the chatbot platform rather than to the recruiter. After the chatbot has done a semantic match and decided which candidates to move forward with, it conducts a written interview with each of them. The chatbot does not consider the candidates' age, looks, name, gender, or ethnicity. Instead, it creates a shortlist of the most favorable candidates based on their assessed knowledge, experience, and skills. Although the chatbot can't be completely unbiased, it will be more so than a human recruiter, and it's still essential that you read the shortlisted results with fresh eyes and a pinch of healthy skepticism.
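Sketched in code, the flow described above (apply, semantic match, written interview, shortlist) might look something like the Python below. The stage names, thresholds, and scoring placeholders are assumptions for illustration; they are not Hubert's actual platform or API.

```python
# Rough sketch of a chatbot screening flow: apply -> semantic match
# against the job ad -> written interview -> shortlist based on answers.
# Names, thresholds, and scoring are made up for illustration.

from dataclasses import dataclass, field

@dataclass
class Application:
    candidate_id: str
    free_text: str                               # what the candidate wrote when applying
    answers: dict = field(default_factory=dict)  # interview question -> written answer
    match_score: float = 0.0
    interview_score: float = 0.0

def semantic_match(text: str, job_ad: str) -> float:
    """Stand-in for a real semantic similarity model: simple term overlap."""
    ad_terms, app_terms = set(job_ad.lower().split()), set(text.lower().split())
    return len(ad_terms & app_terms) / max(len(ad_terms), 1)

def assess_interview(app: Application, questions: list) -> float:
    """Stand-in for an assessment model scoring the written answers."""
    answered = [q for q in questions if app.answers.get(q, "").strip()]
    return len(answered) / max(len(questions), 1)

def shortlist(apps, job_ad, questions, match_threshold=0.3, top_n=5):
    selected = []
    for app in apps:
        app.match_score = semantic_match(app.free_text, job_ad)
        if app.match_score >= match_threshold:
            app.interview_score = assess_interview(app, questions)
            selected.append(app)
    # Ranking uses only the assessed answers, never names, photos, or voice.
    return sorted(selected, key=lambda a: a.interview_score, reverse=True)[:top_n]
```

The essential design choice is that only the candidate ID, the written answers, and the scores travel onward to the recruiter, which is what keeps looks, voice, and other identity signals out of the shortlist.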



The new workforces

We're not using conversational chatbots as an example only because we're building one, but because they are the closest thing we currently have to unbiased recruitment. A written interview, conducted and assessed by an AI, is an effective way of reducing the number of biases in play. Apart from the biases already mentioned around gender, age, and ethnicity, you also avoid the less apparent biases connected to disability, voice, and speech. Since you can't see the person, not even in a photo, candidates get a fairer chance regardless of any disabilities. And voice and speech reveal a lot about a person; remove them both, and you avoid revealing gender, age, accent, and any speech impairments.

The people programming AIs for recruiting are hellbent on finding ways to make them operate with as little bias as possible. If, or rather when, we succeed, we'll start to notice shifts in workforces all around the globe. We'll see more people of different ages, ethnicities, and abilities working together, and everyone will be given a more equal chance from the start.
