Many of us have read the book Blink, by Malcolm Gladwell, in which he postulates that chance and “gut feel” may play a bigger role in our lives than we imagine. Another book, older and more rigorously researched, entitled Fooled by Randomness, by Nassim Nicholas Taleb, takes a similar position. These books make me rethink my own belief in our ability to consistently interview and select the best. They bring to mind a recruiter who once worked for me. He thought that the time we spent interviewing and screening candidates for specific jobs was largely wasted. He insisted that any candidate who met a minimum number of basic criteria for a job would potentially be able to perform that job equally well. The only remaining need was to determine how well the candidate fit with the hiring manager and, to a lesser degree, with the organization. He felt that it would be more cost effective to make a lot of hires quickly and then let their on-the-job performance determine who should be kept and who should not. His idea was to give the candidate a set of questions, have a short interview or two, and make a decision.
In the simplest terms, he felt it was better to let randomness play a large role in selection and to have a loose, easy-in/easy-out hiring practice rather than a tighter and more thorough upfront screening process. At the time I was appalled at the thought, and said so. I felt recruiters had a responsibility to ensure quality and to make sure that the very best were being presented to hiring managers. I also thought that with tests and behavioral interviews we could surely choose the best people. Now I am not so sure. When I think back to the middle of the 20th century, most jobs were filled fairly quickly. Few employees held the specific title of recruiter, and those who did were often just clerks who made sure paperwork was properly completed. Most jobs were filled after a brief interview with a hiring manager, who made his decision based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics. Most jobs could be learned quickly, and it was quite easy to see whether a job was being done well or not. It was easy to get rid of poor performers, and plenty got fired right away. However, a lot didn’t.
There were many things wrong with this approach, but the most obvious was that it blatantly discriminated against anyone who did not fit the stereotype of the hiring manager. Greater awareness of discrimination and new legislation drove the growth of the recruiting profession and removed much of the injustice this system perpetuated. But the old system did have one virtue: It was simple and was built on a belief that attitude and performance were what really counted. Many engineers, doctors, and lawyers were trained in what amounted to an apprentice system right up until World War II. Formal skills training only gradually gained acceptance after the war, when thousands of GIs went back to school on the GI Bill.
As we moved into the 1950s and 1960s, these more casual hiring practices were replaced by the development of job requirements: things like minimum levels of education or years of experience before a person would be considered for a position. This was seen as fairer and served as a screen against hundreds of people potentially applying for the same job. The problem with this approach is that the defined requirements were almost never connected to actual performance. They only seemed fairer because they eliminated or reduced screening out because of race or sex. However, we have learned over the past 40 years that people who qualify for jobs based on their education or experience alone are not necessarily good performers. We now know that simply selecting people by generic measures like education and experience doesn’t work very well and discriminates against those with the real skills who lack the required credentials. Job requirements today are changing so fast that we can’t keep up. During the dot-com boom we saw how quickly new skills became needed and how weak our selection systems were. We just didn’t know what competencies or skills we should look for, and we didn’t have time to find out. Managers were, and still are, confused as to what they want in a candidate, and there is a tendency to go back to selection criteria that smack a bit of the past. Referral programs are a bit like family connections, and attitude is now more important than ever in selection. The need for HTML programmers grew exponentially for months in 2000, as did the need for network administrators and other kinds of programmers. Most recruiters didn’t even begin to understand what they were recruiting for, and clearly no meaningful generic educational guidelines could be established because few schools offered the education.
On the other side, there were almost no experienced people available either. Managers were frustrated with recruiters, and vice versa. Unfortunately, this situation will be a characteristic of the emerging century as new technologies replace old ones and entirely new skills are needed. It will be very difficult to use traditional techniques or measures, or even to figure out the precise competencies and skills a job requires. So, what will we do? Three rules seem to be emerging for defining new positions, as well as for redefining more traditional ones.
- Use technology to profile jobs quickly. Several vendors now offer software that allows you or a hiring manager to describe a job, pick out key competencies and skills, and draw a profile of what is needed within a matter of a few hours. Older methods might take months to produce profiles, although those profiles were probably more accurate and complete. The issue is how permanent the jobs and functions you are hiring for will be. For example, the duties of programmers change constantly: programmers have had to evolve from C to C++ to C# in the course of just a few years. How much do you want to invest in perfection? What becomes more important, perhaps, are things like general programming speed and the ability to learn quickly and accept rapid change, rather than the specific language a candidate knows.
- Be competency flexible and teach hiring managers that development is part of recruiting. The 80/20 rule applies more than ever as new jobs and duties emerge and recruiters are forced to find ways to define them and select candidates against them. Managers will be forced to accept that they will not be able to find candidates with 100% of what they want. Managers and HR will learn that development is a core function of the firm in the 21st century. IBM put in place a development-centered program in the 1960s, when it began hiring and developing new college grads because there were no people with the skills it needed. Remember, there were no programmers when the first mainframes were produced, so IBM had to develop them. Many companies have used development as a strategic edge: When you have people with skills and others don’t, you tend to win. Finding and developing current employees who have some, but perhaps not all, of the skills needed for a job will also become more common.
- Have robust performance management systems in place. By hiring people using broad competency descriptions, as I am advocating, you may hire some poor performers. And that’s okay. What is not okay is ignoring that and allowing them to stay in your organization. A good performance management system, based on whether people achieve realistic goals and meet the requirements of their position, is essential to success.
The hallmark of the best 21st-century organizations will be their approach to defining the people they need. Traditional measures of education, experience, attitude, and cultural fit may play a small part, but what will be significantly different is a quick, flexible approach to defining competencies combined with efficient performance management systems. This will result in more fluid and less well-defined jobs, but broader and more multi-skilled employees.