There has been a lot of ink spilled on hiring processes that use AI for end-to-end recruiting that fall short by magnifying the biases inherent in the data record. With a more fine-grained view of the process, targeted AI applications and tools applied and implemented with an eye toward equity and efficiency could actually help move our hiring practices in a positive direction. Here’s how you can take a fresh look at rebuilding your hiring process with the latest AI tools.
Automate and Standardize Outreach to Eliminate Explicit Biases
“I got one of your email alerts last night,” the user’s email began. I had just rolled out a new search algorithm whose functions included matching jobs to candidates and sending email alerts. Because the algorithm was new and still in the early days of matching, I was one of the lucky individuals who received emails directly from users, emails that, like the typical high-volume feedback loop, were filled with invective and detailed several specific reasons why I should quit my job. But this 3 a.m. email was from a single mother of two working two jobs who said she had applied for and actually started a new job that paid more. It allowed her to quit her bartending position and see her kids every day.
To me, the power of a good AI system is that it is tireless and systematic. It doesn’t skip over people because of their perceived gender, race, religion, or national origin, because of how they look, or simply because they have an unusual name. People often do. On everything from responding to emails to forwarding resumes, people bring a lot of biases to the equation, with the result that opportunities are not even presented to all potential candidates, while employers lose out on great talent.
Counterbalance Implicit Biases and Perceived “Risky Hires”
In many instances, our biases are implicit; you and I are often unaware we even have them. The same skill or accomplishment that is perceived as boastful or arrogant on a woman’s resume is touted to propel a man forward in his career. This type of implicit bias is often accentuated when jobs seem to go against gender norms: women applying for outdoor or dangerous occupations like truck driving and forklift operation, or, even more perilously, math.
AI can have a strong role to play here in two ways.
First, AI can quantify the matched elements, such as relevant certifications, experience, and education, into a rolled-up summary indicator or score, usually computed by valuing the sub-elements consistently with role- and industry-specific weights (something humans have difficulty doing even when unbiased and handed a large calculator).
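A minimal sketch of such a rolled-up score, assuming invented element names and hand-picked weights purely for illustration (a real system would derive the weights from role and industry data):

```python
# Hypothetical role-specific weights; in practice these would be
# derived from role and industry data, not hand-picked.
ROLE_WEIGHTS = {
    "certifications": 0.25,
    "experience": 0.40,
    "education": 0.20,
    "skills": 0.15,
}

def match_score(candidate_scores: dict) -> float:
    """Roll up per-element sub-scores (each 0.0-1.0) into one
    consistently weighted summary score on a 0-100 scale."""
    total = sum(
        ROLE_WEIGHTS[element] * candidate_scores.get(element, 0.0)
        for element in ROLE_WEIGHTS
    )
    return round(100 * total, 1)

print(match_score({"certifications": 1.0, "experience": 0.8,
                   "education": 0.5, "skills": 0.9}))
```

Because every candidate is scored with the same weights, two resumes are valued by identical rules rather than by each reviewer’s mood or priors.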
Second, AI can showcase and help visualize the areas where one candidate matches the role and where they don’t. These techniques can help mitigate a person’s implicit biases in selecting one candidate over another. The visual presentation is key. If you can look at two candidates from 10 feet away and immediately see that one is a better match or that they are both equal, it helps overcome your implicit biases even if you’re not conscious of them.
Standardize Your Interview Processes
This is a variant of implicit bias, but it has specific implications for women trying to land that amazing opportunity. For most people, phone or in-person interviews are stressful: you are trying to showcase your best achievements rapidly while striking a balance between warm likeability and cool confidence. For women, the hurdles (and the advice out there) are farcical and contradictory. Sell yourself, but don’t appear needy. Wear makeup, but look like your natural self. Dress conservatively, but be stylish. A personal favorite: “combine niceness with insistence” and be “relentlessly pleasant.”
A standardized, intelligent interview process that you can complete on your own time can help you walk through your accomplishments at your own pace. In some cases, you may even get a chance at a re-do if you flubbed an answer or made an error. This doesn’t apply to case interviews, skill tests, or coding challenges where timed responses are part of the test. But in many other cases, a considered, thoughtful response to intelligent and relevant interview questions would help many women, and men, who do not deal well with high-stakes interviews but might be excellent at their jobs.
Wake Up to Wage Inequity and Negotiation Biases
Today’s wage estimates from the U.S. Bureau of Labor Statistics and other vendor sources are useful for trend analysis and for estimating overall market comparisons, but in many cases the gap between the high and low salaries provided is substantial. Many women place themselves at the low end of that scale, while men ask for, and get, wages closer to the middle or top quartile.
Advanced AI and analytics tools can help define a candidate’s unique mix of skills and provide managers a much more granular range of expected salary for the position. With recent legislation in Massachusetts and California preventing companies from asking for past salary, AI techniques can be used to determine the optimal wage for a position based on real-time supply and demand, recent hires, plus current pay rates for those skills in that location. Informing the leeway and discretion that recruiters and managers have with actionable and relevant data can help level the playing field for women receiving job offers.
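To make the idea concrete, here is a toy sketch of narrowing a broad market range into a suggested offer band using a candidate’s skill match and a local demand signal. The function name, the 0.5/0.5 blend, and the 5 percent band are all invented for illustration; a real system would calibrate against recent hires and current pay rates.

```python
def suggested_offer_range(market_low, market_high, skill_match, demand_ratio):
    """Narrow a broad market salary range into a suggested offer band.

    skill_match:  0.0-1.0, how well the candidate's skills fit the role.
    demand_ratio: local openings per available candidate for these skills.
    Illustrative only; a real model would be fit to recent-hire data.
    """
    midpoint = (market_low + market_high) / 2
    # Position within the range driven by skill fit, nudged by demand.
    position = min(1.0, 0.5 * skill_match + 0.5 * min(demand_ratio, 2.0) / 2.0)
    target = market_low + position * (market_high - market_low)
    spread = 0.05 * midpoint  # present a narrow band, not a chasm
    return (round(target - spread), round(target + spread))

# A strong candidate in a high-demand market lands in the upper quartile.
print(suggested_offer_range(60000, 120000, skill_match=0.9, demand_ratio=1.5))
```

The point of the narrow band is exactly the one in the text: a recruiter who is handed a defensible, data-informed target has far less room for the low-ball anchoring that disadvantages women.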
Address Information and Network Asymmetry
In many cases, people find jobs and learn about new opportunities through the strength of their personal and professional networks. These networks tend to be asymmetrical between men and women for a variety of social, cultural, and historical reasons. For example, even though 60 percent of all university graduates are women, business schools still lag behind with only 38 percent enrollment. This means that sharing a job with just your business school group could inadvertently remove a large chunk of the talent pool from even hearing about the role. This effect is more pronounced if you’re looking for experienced hires.
At the same time, managers and recruiters shouldn’t need to spam their entire network each time their company posts a job. Using AI and semantic logic to intelligently communicate with your network based on their qualifications for the job will help both broaden the pool of available candidates and allow candidates with non-traditional backgrounds to stand out. For example, if you’re hiring for UX talent, AI could suggest including fashion or interior design in the education requirements.
For women looking to re-join the workforce, learning the latest buzzwords and terms used in an industry can be awkward and difficult, and using terms and descriptions that have dropped out of usage can make your experience seem dated. Data analysis techniques can show which terms are used frequently for a given role, as well as their trend over time, to help candidates ramp up and have a confident conversation with a manager or recruiter.
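The trend analysis described above can be sketched very simply: count what share of postings for a role mention a term each year. The tiny corpus and the terms here are made up for illustration.

```python
from collections import Counter

# Hypothetical corpus: (year, posting text) pairs for one role.
POSTINGS = [
    (2015, "seeking webmaster with html skills"),
    (2018, "frontend engineer, react and html"),
    (2021, "frontend engineer, react, accessibility"),
    (2021, "react developer, design systems"),
]

def term_trend(term: str) -> dict:
    """Share of postings per year that mention a term, so a returning
    candidate can see which vocabulary is current and which is dated."""
    totals, hits = Counter(), Counter()
    for year, text in POSTINGS:
        totals[year] += 1
        if term in text:
            hits[year] += 1
    return {year: hits[year] / totals[year] for year in sorted(totals)}

print(term_trend("react"))      # rising: current vocabulary
print(term_trend("webmaster"))  # falling: dated vocabulary
```

A rising curve tells the candidate which terms to lead with; a falling one tells them which words on an older resume are silently aging it.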
Build Up Your Candidate’s Confidence
Women negotiating for a raise, negotiating a job offer, or even applying to a job face the hurdles of being the “only” person representing a whole group, or experience the dreaded “imposter syndrome” of not belonging on the team or at the company. This may lead many women not to apply at all to a role they believe is out of reach.
Applications of AI that show a candidate that they match five out of seven key skills for the role may lead them to consider the opportunity more objectively. Knowing their strengths and where they are placed in the overall applicant pool also helps candidates’ confidence during their interview with hiring managers. Staffing firms can use this type of AI to coach candidates, ensure they highlight their strengths, and boost their visibility to help improve their success rates.
Address Posturing and Misrepresentations Early in the Process
The least appealing experience in selecting and scheduling candidates for interviews is when the manager realizes, in the first three minutes of an in-person interview, that a candidate’s accomplishments on paper don’t match their actual expertise. Twenty-seven painful minutes later, the manager is on the phone with procurement ensuring you never submit another candidate, even one who had just cured cancer in Q3.
AI screening systems can help you weed out the candidates who check every box and select every drop-down without a qualm. For example, a recent research study analyzed data from an Organisation for Economic Co-operation and Development math assessment that included some fake skills in a list of actual terms to figure out who was bluffing about their amazing quant background. To everyone’s surprise and bafflement, men won that contest hands down! If your candidate is someone who claims expertise in Declarative Fractions, AI can help you safely hit pause instead of the submit button.
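The screening trick above is easy to sketch: seed the skills checklist with a few invented terms and flag anyone who claims them. The skill names here (including the fake ones) are made up for illustration.

```python
# Real skills a candidate could legitimately claim, mixed with a few
# invented "canary" skills that do not exist. Claiming a canary is a
# strong signal the candidate is checking every box indiscriminately.
REAL_SKILLS = {"linear regression", "bayesian inference", "sql"}
FAKE_SKILLS = {"declarative fractions", "subjunctive scaling"}

def bluff_flags(claimed: set) -> set:
    """Return the fake skills a candidate claimed, if any."""
    return claimed & FAKE_SKILLS

claims = {"sql", "declarative fractions"}
print(bluff_flags(claims))
```

An empty result doesn’t prove honesty, of course; it only means the cheapest form of resume inflation wasn’t detected, which is exactly the right moment to let a human take over.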
Engage Proactively With Stakeholders on Controversies and Questions
There are well-documented instances where rote machine learning for matching yields poor results or reinforces past practices and biases, including penalizing graduates of women’s colleges. Machine learning and unsupervised learning techniques excel at extracting value from columnar data and numeric fields where the definition, context, and limits of the underlying data set are well defined.
In the current state of recruiting data and systems, where technology is evolving rapidly, vendors and employers must take care to look deeply into the pre-processing steps performed when dealing with fully unstructured data and other cases where the context may be unclear.
First, these pre-processing steps should remove personal information, such as name and gender, from the raw data, and eliminate or counterbalance elements that have known biases: MBA degrees perhaps, school names, references to golf or boy scouts, or career gaps that may be positive or negative. Second, “black-box” scoring and ranking models where the algorithm is the final arbiter should be replaced by recruiter-specific, job-driven weighting and ranking of the elements. The more easily users can look behind the curtain and see how changing a particular requirement re-ranks their list, the better the outcomes will be overall. Third, technology vendors and large employers should both offer and push for openness and transparency in their algorithms.
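Two of these steps, redacting bias-prone fields before scoring and letting the recruiter’s own weights drive a transparent ranking, can be sketched as follows. The field names and weights are assumptions for illustration, not a real schema.

```python
# Hypothetical field names with known bias potential; a real system
# would maintain this list per jurisdiction and job family.
BIAS_PRONE_FIELDS = {"name", "gender", "school_name", "photo"}

def redact(profile: dict) -> dict:
    """Strip bias-prone fields from a candidate profile before scoring."""
    return {k: v for k, v in profile.items() if k not in BIAS_PRONE_FIELDS}

def rank(candidates: list, weights: dict) -> list:
    """Rank candidates by a transparent weighted sum. Because the
    recruiter sets the weights, changing one visibly re-orders the
    list: no black-box final arbiter."""
    def score(candidate):
        return sum(w * candidate.get(field, 0) for field, w in weights.items())
    return sorted(candidates, key=score, reverse=True)

pool = [
    {"id": "A", "experience": 0.9, "education": 0.4},
    {"id": "B", "experience": 0.5, "education": 0.9},
]
print(rank(pool, {"experience": 0.7, "education": 0.3}))  # A first
print(rank(pool, {"experience": 0.3, "education": 0.7}))  # B first
```

Flipping the weights flips the order, which is precisely the look-behind-the-curtain behavior the text argues for: the recruiter can see exactly why the list changed.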
There is still a lot for us to learn about the power of AI and ML and how to wield it wisely, but even initial results suggest the possibility of driving real change and better outcomes for all of us.