
Here’s How AI Can Help Find and Fix Hidden Barriers for Underrepresented Candidates

AI and hiring analytics tools can now uncover hidden bias faster than ever before, but their insights are only as reliable as the data they are trained on.

Jun 13, 2025

Candidates fall out of the hiring process in only two ways: they’re either rejected, or they withdraw. If your organization is serious about increasing diversity, ask yourself this question again and again: Where are historically underrepresented candidates dropping out at higher rates than others?

Bias in hiring isn’t always overt. Often, it hides in inconsistent screening, subjective evaluations, or procedural bottlenecks that disproportionately impact marginalized groups. The good news: AI and hiring analytics tools in 2025 can uncover these hidden barriers faster than ever before. The bad news: AI can also reinforce whatever bias exists in the data it was trained on. That directly affects the success of your hiring programs, and more specifically your ability to increase diversity and retention within your organization.

If you’re not actively tracking where candidates disengage, are screened out, or are simply overlooked, you might be losing great talent without realizing it. Let’s break down how to spot these challenges and what to do about them.

Start with Data: Don’t Rely on Gut Feelings

A major reason companies fail to correct hiring inequities is that they don’t track the right data. Too many organizations assume they’re hiring fairly without looking at the numbers. But research shows that companies that use data-driven hiring insights increase diversity by 30% within a year. Here’s how to proceed:

Step #1: Run a Hiring Funnel Report

Start by collecting at least three to four years of hiring data. Short-term snapshots can be misleading, while long-term trends reveal systemic hiring patterns and potential biases. If your company has a self-identification process (voluntary demographic disclosure), use it. If not, now is the time to implement one—especially since many states have updated hiring transparency laws that require demographic tracking for pay equity reporting.

Key metric to track: Slice and dice applicant-to-interview pass-through rates by demographic group. Studies show that historically underrepresented candidates often pass initial screenings but drop off at higher rates during final interviews. If your hiring funnel report confirms this, it’s a red flag that your later-stage decision-making processes need review.
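As a rough illustration of this metric, here is a minimal sketch of a pass-through comparison, assuming you can export applicant records with a self-identified group and an indicator of whether the candidate reached an interview. The field names (`group`, `reached_interview`) are hypothetical; map them to whatever your ATS export actually provides. The comparison also applies the EEOC “four-fifths” heuristic, which flags any group whose rate falls below 80% of the best-performing group’s rate.

```python
# Sketch: compare applicant-to-interview pass-through rates by group.
# Record fields ("group", "reached_interview") are hypothetical;
# map them to your own ATS export's columns.
from collections import defaultdict

def pass_through_rates(records):
    """Return {group: share of applicants who reached an interview}."""
    totals = defaultdict(int)
    passed = defaultdict(int)
    for r in records:
        totals[r["group"]] += 1
        if r["reached_interview"]:
            passed[r["group"]] += 1
    return {g: passed[g] / totals[g] for g in totals}

def adverse_impact_flags(rates, threshold=0.8):
    """EEOC 'four-fifths' heuristic: flag any group whose rate is
    below 80% of the highest group's rate."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Toy data for illustration only.
records = [
    {"group": "A", "reached_interview": True},
    {"group": "A", "reached_interview": True},
    {"group": "A", "reached_interview": False},
    {"group": "B", "reached_interview": True},
    {"group": "B", "reached_interview": False},
    {"group": "B", "reached_interview": False},
    {"group": "B", "reached_interview": False},
]
rates = pass_through_rates(records)   # A: ~0.67, B: 0.25
flags = adverse_impact_flags(rates)   # B is flagged
```

Even a crude comparison like this, run on real export data, turns “we think we’re fair” into a number you can track quarter over quarter.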

Step #2: Segment Data by Each Hiring Stage to Pinpoint Where Bias Creeps Into Hiring Decisions

Tracking hiring trends across each stage of the funnel is crucial. Knowing how many historically underrepresented candidates applied is not enough; what matters is seeing where they fall out.

Manual hiring reviews are slow and inconsistent. AI tools can now analyze hiring patterns in seconds, identifying bias hotspots automatically, and AI-powered applicant tracking systems (ATS) automate this process as well. Platforms like Greenhouse and Lever allow recruiters to filter hiring funnel data by gender, race, disability status, and more, so hiring teams can identify where certain groups drop off disproportionately.

How to do this effectively: Break down hiring by key milestones.

  • Application received: Which candidates are failing to make it past initial screening?
  • Phone interview: Do some groups get fewer callbacks?
  • Final interviews: Are hiring panels making biased selections?
  • Offer stage: Are certain groups withdrawing at higher rates?
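The milestone breakdown above can be sketched as a simple survival count per stage and group. This is an illustrative sketch only: the stage names and record fields are hypothetical stand-ins for whatever your own funnel and ATS export use.

```python
# Sketch: count how many candidates in each group survive to each
# milestone, making per-group drop-off points visible.
# Stage names and fields are illustrative, not from any real ATS.
STAGES = ["applied", "screen", "phone", "final", "offer"]

def funnel_by_group(candidates):
    """candidates: list of {"group": str, "last_stage": str}.
    Returns {group: {stage: count of candidates who reached it}}."""
    out = {}
    for c in candidates:
        counts = out.setdefault(c["group"], {s: 0 for s in STAGES})
        reached = STAGES.index(c["last_stage"])
        for stage in STAGES[: reached + 1]:
            counts[stage] += 1
    return out

# Toy data for illustration only.
candidates = [
    {"group": "A", "last_stage": "offer"},
    {"group": "A", "last_stage": "phone"},
    {"group": "B", "last_stage": "screen"},
    {"group": "B", "last_stage": "final"},
]
funnel = funnel_by_group(candidates)
```

Reading the resulting table row by row shows exactly which milestone each group stalls at, which is the signal the manual review rarely surfaces.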

Step #3: Implement and Partner with AI Hiring Analytics

Several AI-driven hiring platforms—including Visier, Diversio, and SeekOut—help recruiters identify disparities in real time by flagging areas where historically underrepresented candidates are dropping off. AI tools can analyze hiring funnel data and highlight disparities in pass-through rates, flag job-description language that deters applicants from groups where you are underrepresented, predict withdrawal risks by analyzing past candidate behavior, and more.

How to Fix the Interview Process: Standardize for Fairness

Interviews are one of the most common points of bias in hiring. If your data shows historically underrepresented candidates progressing but not getting hired, the interview stage needs attention.

Step #4: Move to Structured Interviews (AI Can Assist)

Unstructured interviews—where hiring managers “go with the flow”—are twice as likely to produce biased hiring decisions.

How to reduce bias in interviews:

  • Ask the same set of structured questions to every candidate
  • Use predefined scoring criteria to evaluate responses
  • Leverage AI tools to monitor interviewer consistency
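A minimal sketch of the first two bullets: a scorecard that enforces the same question set and a fixed rubric for every candidate. The question text and score scale are illustrative assumptions, not a prescribed rubric.

```python
# Sketch: a minimal structured-interview scorecard. Every candidate
# gets the same questions, and each answer is scored on a fixed
# rubric, so panels can't drift into ad-hoc evaluation.
# Questions and the 0-4 scale below are illustrative only.
QUESTIONS = [
    "Describe a project you led end to end.",
    "Walk me through how you handle conflicting priorities.",
]
RUBRIC = {0, 1, 2, 3, 4}  # predefined score scale

def score_candidate(answer_scores):
    """answer_scores: {question: score}. Validates that every standard
    question was asked and every score is on the rubric, then returns
    the candidate's total."""
    if set(answer_scores) != set(QUESTIONS):
        raise ValueError("every candidate must get the same questions")
    for question, score in answer_scores.items():
        if score not in RUBRIC:
            raise ValueError(f"score for {question!r} is off the rubric")
    return sum(answer_scores.values())

total = score_candidate({q: 3 for q in QUESTIONS})
```

Because the validation rejects missing questions and off-scale scores, interviewers cannot quietly substitute “gut feel” for the agreed criteria.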

The best part is that AI can support this work. AI-assisted interview monitoring tools, such as BarRaiser and HireVue, track interview patterns and flag biases in real time. They can detect issues like interrupting certain candidates more often, dismissing non-traditional communication styles, favoring subjective “gut feelings” over scoring, and more.

Step #5: Improve Offer Transparency and Experience to Prevent Offer Rejections

Even when historically underrepresented candidates reach the offer stage, many still decline. If you do not take the time to understand why, you miss an opportunity to refine your DEI programs and to increase diversity and retention in your organization.

How to increase transparency effectively: Clarify ambiguous salary ranges early. Many states now require pay transparency, yet some companies still delay salary discussions.

Final Thoughts: Hiring in 2025 Requires Data-Backed DEI Strategies

Increasing diversity and retention is not just about bringing in more candidates from historically underrepresented backgrounds. It is about removing barriers. When we prioritize removing barriers for our most vulnerable communities, we keep all candidates engaged throughout the hiring process.