
Are We Ready for Self-driving Talent Assessments?

Jun 27, 2017

Technology is radically changing the status quo in industries across the globe. In some cases, exciting new technologies are a no-brainer, driving rapid adoption (think smartphones). In other cases, opinions remain divided due to either technological limitations or general fears about the potential impact of the technology (think autonomous/self-driving cars).

Autonomous vehicles are a technology that will inevitably be part of our future. A recent study by Stanford economist Tony Seba suggests that by 2030 autonomous electric cars will completely replace the cars on the road today, forever changing the face of transportation.

Autonomous cars present a ton of practical, legal, and social issues, making their entry into the mainstream a subject of great debate.

While not as sexy or mainstream as autonomous cars, there are a lot of parallels between the future of transportation and the future of talent assessment. As with many industries these days, technology is threatening to turn talent assessment on its head. And just like with autonomous cars, there is much debate surrounding the road ahead for the incorporation of technology-driven automation into talent assessment.

The talent assessment space has always been confusing to the uninitiated. Unfortunately, things are not getting any easier, as advanced technologies add new layers of confusion by rapidly introducing new concepts to the mix.

The “engine” driving change in both transportation and talent assessment is artificial intelligence. There has been a recent explosion of investment in AI-based technologies in the recruitment and talent-acquisition industries. While talent assessment tends to lag behind trends in the general talent acquisition space, AI is definitely big news in this segment of the industry as well.

Artificial intelligence is broadly defined as intelligence exhibited by machines. In computer science, the field of AI research defines itself as the study of “intelligent agents”: any device that perceives its environment and takes actions that maximize its chance of success at some goal. Colloquially, the term “artificial intelligence” is applied when a machine mimics “cognitive” functions that humans associate with other human minds, such as “learning” and “problem solving.”

AI boils down to using machine-based intelligence to “get things done” in an autonomous manner. Probably the most common form of AI in the realm of talent assessment at the moment is machine learning.  

Machine learning is the process by which algorithms learn from sample input (or “training” data) in a way that allows them to break free from any scripted program and make data-driven predictions or decisions on their own. Machine learning requires training a computer to understand and classify information such that it can teach itself to do a more effective job as more data is collected.   
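To make the idea of “training” concrete, here is a minimal sketch of what learning from sample data looks like in code, using the open-source scikit-learn library. The candidate features, labels, and scores below are entirely hypothetical and are not drawn from any vendor’s product.

```python
# A minimal, hypothetical sketch of machine learning: a model is fit to
# labeled "training" examples and then makes predictions on new data.
# Assumes scikit-learn is installed; all features and labels are made up.
from sklearn.linear_model import LogisticRegression

# Each row describes a candidate with two illustrative predictor scores
# (e.g., a structured-interview score and a work-sample score).
training_features = [
    [3.2, 4.1],
    [2.8, 2.5],
    [4.5, 4.8],
    [1.9, 2.2],
    [4.0, 3.7],
    [2.1, 1.8],
]
# 1 = performed well on the job, 0 = did not (hypothetical outcomes).
training_labels = [1, 0, 1, 0, 1, 0]

# "Training" step: the algorithm infers a decision rule from the examples
# rather than following a hand-written script.
model = LogisticRegression()
model.fit(training_features, training_labels)

# Prediction step: the learned rule is applied to a new, unseen candidate.
new_candidate = [[3.5, 3.9]]
print(model.predict(new_candidate))        # predicted class (0 or 1)
print(model.predict_proba(new_candidate))  # predicted probabilities
```

The key point is that the decision rule is not hand-coded; it is inferred from the examples, and it keeps improving as more data is collected.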

Machine learning has been around for quite some time, but advances in computing power have accelerated its use, allowing it to be embedded into complementary technologies. Machine learning is becoming big business, using its predictive powers to help humans across the globe solve all kinds of problems. But its ideal use cases are more limited in some industries than in others. Predictive decision making is an excellent example. While AI is perfectly suited to handle the decision making needed to drive a car autonomously, it is simply not ready to fly solo when it comes to predicting the future success of job applicants.

Prediction and predictive analytics are the ethos of the talent assessment industry. I/O psychologists have been using assessments (i.e., “predictors”) to support hiring decisions and analyze their business impact for almost 100 years, though the combination of big data and technology has brought unprecedented attention to this work over the last few years. Suddenly the talent assessment industry is in the spotlight, and AI is looking for a starring role. There is no doubt that AI is poised to drive incredible growth in predictive analytics and has already brought positive change. But when it comes to the science of talent assessment, it is critical that we proceed with caution when introducing AI-based technologies. Understanding why requires a closer look at the present situation when it comes to AI and talent assessment.

How AI Is Helping Assessments

Advances in AI for talent acquisition are playing an excellent supporting role in the ability of assessments to add value. Recruiting is currently undergoing an AI revolution, automating the heavy lifting that comes with sourcing and even engaging applicants. While it may seem cold to insert machines into the early dialogue between applicants and employers, it beats the black hole that most applicants traditionally experience.

Automated sourcing technologies act as a force multiplier for assessments. When the funnel is filled with targeted candidates, assessments provide an additional layer of insight that gives employers better odds of making a great hire.

Another positive impact of AI on talent assessment is that it is slowly helping change the way assessment data is collected from candidates. No one likes taking tests, and in a world where candidate experience is king, engaging methods of collecting “predictor” data from applicants go a long way. The long forms with self-report radio buttons that the industry was built on are going the way of the gasoline-powered car. The talent assessment industry is rapidly adopting cool technologies for collecting data from applicants in more humane ways, and in the coming years we can expect these tools to replace typical self-report test questions as the primary means of data capture.

Despite the support of new technologies, the move from automating the routine tasks that support talent assessments to autonomous predictive modeling and decision making is still a huge leap. Here’s why.  

Data Science vs. Psychological Science

I/O psychologists and data scientists have overlapping skill sets and common goals, but they view the data side of hiring decisions from different frames of reference. I/O is a psychological science based on measuring “individual differences” in the traits and attributes people possess. Data science is more empirically driven, with views on measurement shaped by patterns in data that often have no rational basis. A good example can be found in Google’s own data, which reportedly showed that liking curly fries predicts job performance. Of course, Google is not actually acting on this information; in fact, Google has a highly evolved I/O psychology team that works collaboratively with its data scientists. The point is that some rational, theoretical thought must be used to understand the context and value of patterns found in data.

Making Hiring Decisions Is Extremely Complex, and Machines Are Not Ready to Do It Alone

Humans are still better than machines at synthesizing complex information for decision making. The decisions made in the hiring process are a great example of this. Despite stunning advances in cognitive computing, machines simply are not ready to think the way humans do when it comes to this kind of decision making. Psychologists study “individual differences” in humans and use theory and data from this practice to build models for decision making. Machines are simply not yet sophisticated enough to be trained to understand individual differences at a level that can support predictions about future job performance. One good indicator of this is the fact that no one has yet completely automated the scoring of assessment centers: complex evaluations that require scoring competencies across multiple tests and exercises.

AI Can Introduce Bias That Cannot Be Explained

Empirically driven, AI-based approaches to pattern recognition and decision making can introduce bias. Machines can be trained to avoid these biases, but the danger still exists. Because there is no way to fully explain what is happening inside the black box, it is not possible to determine exactly why the bias is occurring. In the context of talent assessment, a court of law may not be very understanding when the mechanics of a selection system that introduces bias cannot be fully explained. Such is the nature of a highly advanced black box that learns in ways we cannot fully explain. Thankfully, many assessment providers that are using AI are doing the right thing and calibrating their systems to remove bias.
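For illustration only, here is one simple way a provider might audit a model’s output for group-level bias: compare selection rates across groups and flag large gaps for review. The decisions, groups, and the 0.80 threshold (a commonly cited rule of thumb from traditional selection practice) are used here purely as a hypothetical sketch, not as a description of any specific vendor’s calibration process.

```python
# Hypothetical sketch: checking whether an algorithm's pass/fail decisions
# differ across two applicant groups, using a simple selection-rate ratio.
def selection_rate(decisions):
    """Share of candidates the model selected (1 = selected, 0 = rejected)."""
    return sum(decisions) / len(decisions)

# Made-up decisions produced by a model for two applicant groups.
group_a_decisions = [1, 1, 0, 1, 1, 0, 1, 1]   # 75% selected
group_b_decisions = [1, 0, 0, 1, 0, 0, 1, 0]   # 37.5% selected

rate_a = selection_rate(group_a_decisions)
rate_b = selection_rate(group_b_decisions)
impact_ratio = min(rate_a, rate_b) / max(rate_a, rate_b)

print(f"Group A rate: {rate_a:.2f}, Group B rate: {rate_b:.2f}")
print(f"Impact ratio: {impact_ratio:.2f}")

# A commonly cited rule of thumb flags ratios below 0.80 for closer review.
# A low ratio signals that the model's decisions warrant investigation, even
# if the "why" inside the black box cannot be fully explained.
if impact_ratio < 0.80:
    print("Potential adverse impact: review the model before relying on it.")
```

A check like this does not explain what the black box is doing, but it does tell a provider whether the outcomes it produces deserve a closer look.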

Legal Precedent Is Not Yet Set

With autonomous cars, defining regulations is among the biggest hurdles to mass adoption, one that will require structural and cultural change management on the grandest of scales.

When it comes to the use of AI in talent assessments, there are simply no precedents, and regulation is far off the radar screen. The standards for the legal compliance of assessment tools are laid out in the Uniform Guidelines on Employee Selection Procedures, which were created in 1978 and certainly do not directly address the use of AI. The Uniform Guidelines do give AI a pass of sorts, because they allow the use of most anything that can predict job performance, as long as empirical evidence can be provided to support the relationship to job performance.
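As a rough sketch of what that empirical evidence often looks like in practice, a criterion-related validity study boils down to showing that assessment scores correlate with later job performance. The numbers below are invented purely for illustration.

```python
# Hypothetical sketch of criterion-related validity evidence: do assessment
# scores (the predictor) correlate with job performance (the criterion)?
# Uses statistics.correlation, available in Python 3.10+; data is made up.
from statistics import correlation

assessment_scores = [62, 71, 55, 80, 68, 74, 59, 85]            # predictor
performance_ratings = [3.1, 3.8, 2.9, 4.4, 3.5, 3.9, 3.0, 4.6]  # criterion

validity = correlation(assessment_scores, performance_ratings)
print(f"Validity coefficient: {validity:.2f}")
```

Whether that kind of evidence is enough when the predictor itself is an opaque, self-updating model is exactly the question the Guidelines never had to answer.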

Still, there are just no real rules of the road for AI-based assessments. Until a precedent is set, this issue is simply too undefined to support any real understanding of the legal risk associated with using autonomous AI to make hiring decisions.

We are just not ready to use machines autonomously to evaluate applicant suitability, just as we are not ready to turn loose autonomous vehicles en masse. Eventually, it is certain that both will evolve to a point of maturity that will require regulation and social change. And that time may be sooner than we think. Machine intelligence is starting to do some really astounding (and frightening) things. For instance, AI-based chatbots have autonomously created their own language in order to meet the objectives they were trained to accomplish.

The good news is that advances in AI technology are most often seen as a collaborative effort between man and machine. This is definitely true in the realm of talent assessment, where the majority of those working with AI are taking an interdisciplinary approach in which I/O psychologists and data scientists are working closely together.  

The scary situations are those in which autonomous, machine-driven approaches to hiring are hailed as incredible feats of technology while the insight that psychological scientists bring to the party is ignored.

While autonomous AI may be a great way to get you to work sometime in the near future (by handling your commute for you), it is far less likely to be the “vehicle” that hires you for the job you are commuting to. Only time will tell.

 
