Hiring Ralph Northam: How Much of a Candidate’s Past Is Relevant?

The story of Ralph Northam — the Governor of Virginia — is familiar to most. News of a picture in his yearbook from 1984 resulted in calls for his resignation. The picture, which may or may not include him, showed two people, one dressed in a KKK costume, the other in blackface.

The reasoning used by those calling for the governor to resign is that the picture demonstrates racism and therefore renders him unfit for the job he holds. Others argue that a picture taken 35 years ago has no relevance today. Mr. Northam is a public official and chief executive of his state, so the standards may be different for someone in such a prominent role. But the incident does raise the question of how much bearing a person’s past should have on their suitability for a job in the present.

The Legal Angle

Assuming that Mr. Northam was “hired” by the state of Virginia, with voters selecting him as the most qualified candidate, could another employer have rejected him had they known about the yearbook picture when the hiring decision was being made? U.S. law places few restrictions on why a person may be denied a job. An employer can reject a candidate for any reason except an illegal one, such as discrimination on the basis of a protected category (gender, race, disability, and so on) or a candidate’s participation in certain legally protected activities. For any other reason, no justification is required unless the decision is subject to a legal challenge.

The same is true in most other countries. In Europe, the GDPR requires only that job applicants be given access to interview notes and any data collected on them, such as assessment results.

Is the Past Prologue?

Does the past set the stage for what a person may do today? Behavior that is not directly job related is, at best, difficult to factor into an evaluation of a candidate’s suitability. Is a picture from 30 years ago just as relevant as a tweet from yesterday, if both contain material deemed objectionable? Even when the behavior was criminal, should a candidate be rejected? Is someone convicted of a DUI 20 years ago completely unsuited for most jobs because they have a felony on their record? In one case, a person was convicted of a felony because he mistakenly diverted overflowing sewage into a river, violating the Clean Water Act in the process.

Research on the relevance of past behavior to predicting current behavior shows that the conventional wisdom may not always be correct. The largest study of its kind analyzed the behavior of 1.3 million ex-offender and non-offender enlistees in the U.S. military. Incredibly, the study found that ex-felons are promoted more quickly and to higher ranks than other enlistees. Those with a felony background showed no difference in attrition rates due to poor performance compared to those without criminal records.


Can AI Help?

Instead of leaving the evaluation of candidates’ behavior to the vagaries of individual decision making, some technology solutions are becoming available that claim to evaluate candidates using data from their personal lives. One example is Predictim, which scans Facebook, Twitter, and Instagram posts to predict a person’s suitability as a babysitter. It assigns risk scores for behaviors such as drug abuse, bullying, and harassment.
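To make the idea of a risk score concrete, here is a minimal sketch of the concept. This is a naive keyword counter written for illustration only; the category names and phrase lists are invented, and real products such as Predictim use proprietary machine-learning models, not anything this simple.

```python
# Toy illustration of per-category "risk scoring" over social media posts.
# The categories and phrases below are hypothetical examples, not a real product's rules.
RISK_CATEGORIES = {
    "bullying": {"loser", "hate you"},
    "drug_abuse": {"getting high"},
}

def risk_scores(posts):
    """Count flagged phrases in each category across a list of posts."""
    scores = {category: 0 for category in RISK_CATEGORIES}
    for post in posts:
        text = post.lower()
        for category, phrases in RISK_CATEGORIES.items():
            scores[category] += sum(phrase in text for phrase in phrases)
    return scores

print(risk_scores(["I hate you, loser", "Nice day at the park"]))
# {'bullying': 2, 'drug_abuse': 0}
```

Even this toy version shows why such scores deserve scrutiny: the output depends entirely on which phrases someone decided to flag, a choice that is invisible to the person being scored.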

But AI-driven products are no panacea. The algorithms are often a black box whose results are not well understood, even by the data scientists who created them. That creates a risk of systemic bias rooted in how the AI was trained. A decision to outsource individual judgment to an AI product should be weighed against the likely risks and against confidence that the developers can explain how it works.

Would You Hire Ralph Northam?

Personal behavior that falls outside clearly definable parameters is not easily weighed. A pattern of behavior, such as recent tweets or social media posts that include racist or misogynistic comments, is likely more relevant than something that happened decades ago with no evidence that the behavior has been repeated. And when looking at a picture or reading a post, is it possible to be certain that it shows the candidate being evaluated? Facial recognition software can be error-prone, so the possibility of a false positive cannot be eliminated.

Given such ambiguity, the outcome for candidates often results from personal biases of the evaluators more than any specific principles or rules. There are no easy answers here.

Raghav Singh, director of analytics at Korn Ferry Futurestep, has developed and launched multiple software products and held leadership positions at several major recruiting technology vendors. His career has included work as a consultant on enterprise HR systems and as a recruiting and HRIT leader at several Fortune 500 companies. Opinions expressed here are his own.
