Preventing human and AI bias when recruiting

Author: Victor Assad

Many pundits and HR leaders are raising legitimate concerns about bias in Artificial Intelligence platforms used for recruiting. These concerns are well founded, but let’s not forget a hiring bias that’s even more pervasive: the biased hiring decisions humans unwittingly make every day.

We can correct human and AI bias. Here is my analysis:

For decades, industrial and organizational psychologists have conducted empirical studies to identify the best methods for screening job candidates in ways that reduce human bias.

The problem is that most companies and executives ignore these findings and recommendations and instead trust their own instincts.

Research dating back 65 years shows that first impressions in the initial four minutes of an interview play a dominant role in shaping an interviewer’s decision, establishing a bias that colors all subsequent interviewer-interviewee interaction.[i] In many ways, the interview is a search for negative information, and just one unfavorable impression is followed by a rejection 90 percent of the time.[ii] First impressions are influenced considerably by nonverbal cues, such as direct eye contact, smiling, attentive posture, smaller interpersonal distance, and a direct body orientation.[iii] A more recent study found that first impression judgments are made in the first ten seconds, and the hiring manager spends the rest of the interview trying to confirm the first impression.[iv]

Bias against women and people of color is real.

Research shows that we tend to hire people who look like us: male bankers tend to hire more men, and female teachers tend to hire more women.[v] Male and female stereotypes are still prevalent in today’s workforce despite the progress made by women.[vi] These stereotypes tend to hurt women more than men.

An American Economic Review study in 2004 found that job seekers whose resumes carried so-called white-sounding names received 50 percent more callbacks for interviews, while applicants with names such as Jamal or Lakisha, which are perceived as black-sounding, received fewer callbacks. That racial gap is uniform across occupation, industry, and employer size.[vii] A study on gender bias in recruiting released this year found similar results: resumes with female-sounding or gender-neutral names tended to be rejected more often, even though all the resumes in the study were identical except for the names and email addresses.[viii]

Companies can increase their ability to hire excellent candidates by using a combination of the best screening methods. Based on empirical research, pairing a general mental aptitude test with a structured interview improves the ability to hire great candidates. A general mental aptitude test combined with a personality test that measures integrity and conscientiousness also helps select better job candidates. Next in line are the following: job knowledge tests (such as a software programming test), work sampling, job tryouts for less skilled roles (which usually include a ninety-day probation period), peer ratings (which you almost never see when hiring an external candidate), a training and experience behavioral consistency model (today more commonly called a job competency model), and reference checks.[ix]

I have successfully used structured interviews, general mental aptitude tests, personality tests, and work sampling to improve selection decisions and reduce human bias. AI promises to further improve selection decisions and to significantly improve the job candidate experience.

In researching my upcoming book, Hack Recruiting, I was impressed by how artificial intelligence can significantly improve the administration of recruiting, communications with job applicants, and the speed of finding job candidates online.

I saw how the AI chatbot AllyO, with its digital automation of applicant tracking systems and ongoing text messaging with job candidates, can do something recruiting departments never dreamed of: dramatically shorten recruiting cycles, save costs, and earn a net promoter score of 90 or above among rejected job candidates.

I saw how the recruiting AI platform of ThisWay Global, which was under development for three years to create its thorough ontologies, was able to find qualified job candidates in seconds that my clients had not found after weeks of searching. ThisWay Global founder Angela Hood, who was interviewed for Hack Recruiting, recommends that her tool, AI4JOBS, be used with its screens that cloak the names of job candidates to guard against human bias. Angela also believes that, in time, AI platforms will leapfrog the work of a previous generation of I/O psychologists to improve our understanding of the job competencies that drive the performance of our best employees, without biases.
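Name cloaking of this kind is straightforward to prototype. The sketch below is not ThisWay Global's implementation; it simply assumes candidate profiles arrive as plain dictionaries with hypothetical "name" and "email" fields, and swaps the identifying fields for an anonymous alias before a human reviewer sees the profile.

```python
import uuid

def cloak_candidate(profile: dict):
    """Return an anonymized copy of a candidate profile plus a private identity map.

    Assumes a simple dict with hypothetical 'name' and 'email' fields; the real
    fields in any given ATS or AI platform will differ.
    """
    alias = f"Candidate-{uuid.uuid4().hex[:8]}"
    cloaked = dict(profile)
    cloaked["name"] = alias
    cloaked["email"] = f"{alias.lower()}@screening.example"
    # Keep the alias-to-identity mapping separate so identities can be restored
    # only after the screening decision is made.
    identity_key = {alias: {"name": profile["name"], "email": profile["email"]}}
    return cloaked, identity_key

# Example usage with a made-up candidate record.
candidate = {
    "name": "Lakisha Washington",
    "email": "lakisha.w@example.com",
    "skills": ["Python", "SQL"],
    "years_experience": 6,
}
anonymous_profile, identity_key = cloak_candidate(candidate)
print(anonymous_profile)  # reviewers see only the alias and job-relevant fields
```

The point of the design is simple: reviewers never see names or email addresses, and the identity map lives elsewhere until a decision has been made.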

Human resources leaders need to be wary of digital technology to make sure it is effective and does not have biases embedded in its algorithms. Amazon’s admission that it stopped using an algorithm to select software engineers because it screened out qualified female applicants for providing long answers to questions is a warning to us all. How did the algorithm learn how to screen? From its human, mostly male, programmers.[x]

Companies that tout the use of facial recognition, tone-of-voice, and body-motion-detection technologies should also draw our scrutiny, since the academic research to date suggests these technologies quickly lose their accuracy for candidates who are not white men.[xi]

When considering an AI platform to help find and select talent, be sure of the following:

  1. They are committed to eliminating bias in job searches in their own algorithms and have taken concrete steps to do so.
  2. They have taken the time to thoroughly develop ontologies for their AI platforms.
  3. Be wary of facial recognition, tone-of-voice, and body-movement-detection technologies used in screening. These are new and evolving technologies. Follow me as I monitor the independent academic research on these technologies.
  4. In the years ahead, as more aggregate empirical data is gathered, look for and demand transparency on the effectiveness and anti-bias measures of AI platforms, published on their websites, in whitepapers, or in academic journals. (One such measure is sketched after this list.)
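
One concrete anti-bias measure worth asking any vendor to report is the adverse impact ratio behind the EEOC's four-fifths rule: each group's selection rate divided by the highest group's rate, with ratios below 0.8 flagged for closer review. Here is a minimal sketch; the group names and counts are made up purely for illustration and are not drawn from any study cited above.

```python
def adverse_impact_ratios(selections):
    """Compute each group's selection rate relative to the highest-rate group.

    `selections` maps group name -> (number selected, number of applicants).
    Under the EEOC four-fifths rule, a ratio below 0.8 warrants closer review.
    """
    rates = {group: selected / applicants
             for group, (selected, applicants) in selections.items()}
    top_rate = max(rates.values())
    return {group: rate / top_rate for group, rate in rates.items()}

# Illustrative, made-up counts: (selected, applicants) per group.
ratios = adverse_impact_ratios({
    "group_a": (48, 120),   # 40% selection rate
    "group_b": (30, 110),   # about 27% selection rate
})
for group, ratio in ratios.items():
    status = "review" if ratio < 0.8 else "ok"
    print(f"{group}: impact ratio {ratio:.2f} ({status})")
```

A vendor that publishes numbers like these, alongside how they were gathered, makes the transparency in item 4 something you can actually verify.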

Small and large companies can reduce the biases of their recruiters and hiring managers, improve their speed to hire, and reduce their hiring costs by following the long-standing research of I/O psychologists and wisely selecting competent, ethical AI platforms.

Don’t be afraid of AI because of the stumbles of early adopters.

The AI tools that marketing departments have been using for years to improve company brands and find new customers are now available to HR.

It is time HR learns to use them wisely.
