“Recruiter admitted today he cannot find anyone to hire because companies are rejecting all the qualified candidates based on AI”
“Had a call today with a hiring manager at a large pharma company who said, ‘We have tons of openings for X, but there are no available candidates for X because we can only hire people who have a very, very specific type of experience with [sic] is non-existent. Meanwhile we have lots of qualified people with related skills which are directly transferable, but I’m not allowed to hire them. So I have a ton of unfillable openings.’”
Contrary to the poster’s headline, this is not a problem with AI. This is a problem with the recruiter’s supervisor. If you tell the AI (or a standard Boolean search) that transferable skill Y is acceptable alongside the elusive skill X, you’ll find people.
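To make that concrete, here is a minimal sketch of the kind of screen involved. The skill names and rules are invented for illustration; no real ATS is being quoted. The point is simply that accepting transferable skills is one line of logic, not a limitation of the technology:

```python
# Hypothetical resume screen. Skill names are illustrative only.
REQUIRED = {"regulatory affairs"}            # the "very, very specific" skill X
TRANSFERABLE = {"clinical operations",       # related, directly transferable skills Y
                "quality assurance"}

def passes_screen(candidate_skills):
    """Accept candidates with the exact required skill OR a transferable one."""
    skills = {s.lower() for s in candidate_skills}
    return bool(skills & REQUIRED) or bool(skills & TRANSFERABLE)

print(passes_screen(["Regulatory Affairs"]))    # exact match
print(passes_screen(["Clinical Operations"]))   # transferable match
print(passes_screen(["Graphic Design"]))        # neither
```

Drop the `TRANSFERABLE` clause, as the hiring manager's supervisor effectively did, and the same code produces the "no available candidates" result the Redditor complained about.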
More than a decade ago, Wharton professor Peter Cappelli shared a story about a company that received 29,000 applicants for a single position, yet its applicant tracking system rejected every one of them. Again, the problem was the recruiter's criteria, not the system.
AI is like any computer program: when you put garbage in, you get garbage out.
Is AI Rejecting Candidates?
Almost surely, as ATSes have jumped on the AI bandwagon. But when they do, the vendors don't explain exactly how candidates are selected. James Zou, a computer science professor at Stanford, explained that with AI, and specifically ChatGPT, “These are black-box models. So we don’t actually know how the model itself, the neural architectures, or the training data have changed.”
This should concern talent acquisition professionals and hiring managers because they remain legally responsible for their hiring decisions, yet they don't know precisely how the AI is making them.
Writing in Forbes, leadership professor Benjamin Laker states: “The use of AI-mediated job interviews can also lead to a ‘ghosting’ effect. Candidates who have been successful during interviews may never hear back from the employer, leaving them anxious and uncertain about their prospects. This phenomenon is an enormous problem in the job market today, and it has consequences that go beyond frustration, according to research.”
That sounds terrible, but Laker doesn't explain why anxiety over ghosting from AI-powered recruiting is any different from the ghosting that recruiters and hiring managers have been doing for decades without AI technology. Much like entering incomplete search criteria, ghosting is a human problem, not an AI problem: a human decided not to respond to candidates after a chatbot interviewed them, or whatever the AI's role was.
How Are Recruiters Using AI?
Robyn J. Grable, CEO of Talent Ascend, uses AI in her business to help match former military people, among others, with civilian jobs. “AI can match candidates and careers across millions of data points in the blink of a human eye,” she explains. “This gives recruiters and talent acquisition teams the advantage of knowing how well a candidate fits the skills needs across the organization, not just one job/one resume at a time. Their time is more productive and fulfilling.”
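Matching "across millions of data points" usually reduces to scoring the overlap between each opening's skill profile and each candidate's, for every candidate against every job at once. Here is a minimal, hypothetical sketch of that idea using Jaccard similarity; it is not Talent Ascend's actual algorithm, and all names and skills are invented:

```python
# Toy skills-based matcher. Scoring method and data are illustrative assumptions.
def match_score(job_skills, candidate_skills):
    """Jaccard similarity: shared skills divided by all skills mentioned."""
    job, cand = set(job_skills), set(candidate_skills)
    if not job | cand:
        return 0.0
    return len(job & cand) / len(job | cand)

def rank_candidates(job_skills, candidates):
    """Score every candidate against the opening, best matches first."""
    return sorted(candidates.items(),
                  key=lambda kv: match_score(job_skills, kv[1]),
                  reverse=True)

candidates = {
    "veteran_1": ["logistics", "leadership", "supply chain"],
    "veteran_2": ["communications", "it support"],
}
print(rank_candidates(["supply chain", "logistics"], candidates))
```

Run this for every open requisition rather than a single job, and you get the "whole organization, not one job/one resume at a time" view Grable describes.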
One of Grable’s associates, U.S. Army veteran Scott Stafford, adds, “I think using AI to find candidates for job opportunities is a great way to get the right people matched with the right opportunities!”
On that note, LinkedIn, the largest database of potential candidates, uses AI technology to help find “recommended matches.” Hari Srinivasan, VP of Product at LinkedIn, explained that not only do LinkedIn’s AI features help recruiters write messages, they will find the people for you to message. Srinivasan said:
“Most recently, we’ve used AI to update our recommended matches feature so that you can see real time personalized recommendations based on your hiring activities, candidates’ job seeking activities, and key information from your job posts, helping you to discover up to 10% more new and qualified candidates that you’re less likely to find on your own. And we’ve found that if you use recommended matches — rather than Recruiter Search alone — candidates are 35% more likely to accept your InMails.”
On the other hand, Joan Kennedy, a senior talent acquisition partner, has a more negative view. “When we think about using AI in hiring,” Kennedy says, “often the focus is toward tech giants like Amazon, but in the U.S., the largest number of companies are small-to-medium-sized businesses. Not everyone hires high volume or can risk branding themselves as chatbot impersonal. Often it’s a poor candidate experience. Another concern I have is letting an algorithm perpetuate bias and discrimination in hiring decisions, which research supports [that AI] does.”
Kennedy isn’t alone in her concerns. In September, Nature published a study about AI in recruiting, which concluded: “The findings suggest that AI-enabled recruitment has the potential to enhance recruitment quality, increase efficiency, and reduce transactional work. However, algorithmic bias results in discriminatory hiring practices based on gender, race, color, and personality traits.”
So while the Redditor's concern about AI in recruiting is misplaced, that doesn't mean AI technology in recruiting is problem-free. Ultimately, the problem is not knowing how the AI arrives at its decisions, and not even having a human to blame.