Techsploration: What’s Measured in Recruiting Is Not Always What Matters


In one HR job I had, we went through a disruptive technology transition. Of course, pretty much any tech transition is disruptive. Even the best-laid plans don’t work perfectly.
Applicant flow had slowed as we moved from one HR platform to another and we paused job advertising as we made the switch. I had planned for this slowdown, but the change took an extra week due to miscommunication between our IT department and the software vendor. It wasn’t great, but there also isn’t much you can do in the heat of the situation.
We got everything back up to speed pretty quickly and turned back on advertising at a higher rate to catch up. Two of the metrics most important to the organization were cost per hire and time to fill. Guess who didn’t hit quarterly goals because of these measurements?
That’s right — this guy.
Was it my fault that we didn’t hit our metrics? Sure, at least in a way. I could’ve spent an insane amount of money to hit more of our goals. Or I could’ve stayed on budget and missed multiple metrics. Instead, I ended up spending a bit more to get us somewhat caught up.
So, you can call me a little cynical when someone brags about data-driven recruiting in their organization. Recruiting has been running on poor data since the Stone Age.
Across recruiting, we see lots of interesting data points frequently cited as "must-know," including the likes of time to fill and cost per hire.
Many of these metrics are simply noise. They are presented in talent acquisition solutions because they are easy to track and easy to create pretty graphs for. They also give you an illusion of a data-driven process.
For example, time to fill can tell you how long it’s taking to fill a position. Is that useful? By itself, no. It’s a data point that needs an enormous amount of context to understand why that number is what it is.
Let’s say overall time to fill went down but you’re a retailer that just went through a holiday hiring blitz. How did that decreasing metric help you with the three open roles in logistics that have been open for months? Or maybe it went up but you’re killing it on wrapping up some longer-term reqs after a competitor went out of business? Or it went up because your technology was down for 15% of the working days of a quarter. (Yeah, I’m not letting that go.)
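To make that retailer scenario concrete, here's a minimal sketch (with made-up, illustrative numbers) of how a holiday blitz of fast retail fills can drag the overall average down even while the logistics roles get worse. Nothing here is real data; it just shows why the aggregate number needs segmentation before it means anything.

```python
from statistics import mean

# Hypothetical requisitions as (department, days_to_fill) pairs.
# Illustrative numbers only, not real benchmarks.
last_quarter = [("retail", 40)] * 5 + [("logistics", 95)] * 3
this_quarter = [("retail", 18)] * 40 + [("logistics", 120)] * 3  # holiday hiring blitz

def time_to_fill(reqs, dept=None):
    """Average days to fill, optionally filtered to one department."""
    days = [d for (dep, d) in reqs if dept is None or dep == dept]
    return mean(days)

print(round(time_to_fill(last_quarter)))               # 61 days overall
print(round(time_to_fill(this_quarter)))               # 25 days overall -- looks great!
print(round(time_to_fill(this_quarter, "logistics")))  # 120 days -- the real problem
```

The headline metric improved from 61 to 25 days, but only because a flood of quick seasonal fills swamped the average; the segment that's actually hurting moved in the other direction.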
Metrics that can be auto-generated from an ATS don’t give you the full picture. They don’t bring together why the numbers are the way they are, and they don’t tell you what you need to do to fix issues.
You aren’t going to find the answers to your most pressing issues on an ATS reporting tool. That data is, perhaps, the very beginning of knowledge. More often, it can distract from what’s truly driving the green, yellow, and red dashboard metrics you see as you log in.
Here are two metrics that are harder to concretely measure in the context of recruiting but are actually useful in diagnosing issues.
The first is quality of hire. Many organizations spend so much time wringing their hands about this measurement because they equate quality of hire with grabbing a bunch of A players or unicorn candidates. In reality, quality is about delivering on hiring-manager, candidate, and organizational expectations.
For many organizations, quality of hire is a combination of a number of measures. Not all of those measures will matter to every organization, and you won't get pretty graphs from them. But they will tell you a lot about what's happening in your hiring process and where you can improve.
The second is recruiting's contribution to organizational performance. You might not like this one as a recruiter because so much goes into organizational performance. But a great recruiting team is at the heart of almost all great organizations. Again, this is multiple measures combined.
Will these result in happy graphs and infographic-friendly stats? Probably not. Can technology help you figure some of these out? Absolutely.
The metrics cited over and over again as important to recruiters are the ones that can be easily measured. They're also what most software vendors sell as part of their reporting packages. Contextualizing that data to focus on what actually matters requires effort. Ultimately, though, it will help you figure out what's working and what to do next to improve.