
One-Test Wonders: “Single Assessment” Solutions

Jun 18, 2003

Over the past few years, we’ve spent a lot of time helping many different types of organizations select and deploy a variety of staffing assessment solutions. This has given us a chance to get an in-depth look at the products and services offered by most of the major players in the staffing assessment marketplace. Once you get past the glossy marketing materials, the “staffing solutions” offered by many assessment vendors basically come down to a single assessment tool or test. These tests are usually designed to measure a range of personality characteristics, motives, values, and abilities associated with different aspects of job performance. We refer to these broad, multi-use tests as “single assessment” (SA) solutions. SA vendors usually claim that their one basic assessment tool can predict performance for a very broad range of jobs. They may also offer several kinds of reports for interpreting data collected by their tool to support things such as applicant pre-screening, candidate interviews, and employee development. These reports are often marketed in a way that might lead you to believe that the vendor is offering access to several different assessment solutions. But when you dig deeper, you find that all of the reports are based on data from the same basic tool.

Like most staffing assessment solutions, SA solutions are neither uniformly good nor bad. Their value depends on the unique staffing needs of your organization, as well as the particular nature of the different SA tools. At the same time, because they rely on a common one-size-fits-all methodology, SA solutions tend to share some common features. The following is a brief summary of issues to consider when investigating the use of SA solutions. Most of these issues reflect both a strength and a weakness of the SA approach:

  • Off the shelf. By definition, SAs are an “off-the-shelf” solution. The vendor is selling a tool that has been designed to be used, with little or no customization, for a wide range of jobs. The advantage is that SAs can usually be deployed quickly and at relatively low cost compared to more customized solutions. The disadvantage is that they are likely to be less efficient and effective than solutions configured for a specific job and organization. Much of the assessment content in an SA may focus on candidate characteristics that are less relevant to the particular position you are staffing, thus reducing efficiency.
  • Multi-use. One of the nice things about many SAs is that they are designed to be used in a variety of different ways. For example, many SAs provide reports to support both candidate selection and employee development. This is useful for fully leveraging the value of data collected during the staffing process and for fostering alignment between staffing and development practices. The disadvantage is that many SAs were primarily designed to support one function (e.g. selection), and are not particularly well-suited for other uses. Although they can provide support for other uses, these alternative applications are “secondary” to their primary purpose and it shows.
  • One size fits all. Most SAs are designed to predict relatively “typical” job behaviors so that they can be used with as many jobs as possible. As a result, the more unique the job, the less effective an SA is likely to be at predicting performance in it. This is perhaps one of the main limitations of SA solutions. Although they may predict performance for a lot of jobs to some degree, they may not predict performance in any one job very well. Many vendors attempt to address this by providing some ability to customize the scoring or reports used by their SA. However, there are often serious limitations to these customization efforts, since they typically do not involve changing the actual content of the assessment tool itself. It’s like trying to tailor an off-the-rack suit: when you cannot change the basic material and design of the suit, there are serious limits to what you can accomplish by changing the hemline.
  • Are the old ways the best? Many of the SAs on the market have been around for years and have had relatively few substantial revisions to the basic design of their assessment tools. We have seen SA content that has been in use for over 40 years. On one hand, this content has well-demonstrated validity; in this sense one might argue, “if it ain’t broke, don’t fix it.” On the other hand, there have been substantial advances in the design of staffing assessments over the past 30 years. Just because something isn’t broken does not mean it’s working as well as it might.
  • Over-interpretation. While SAs can provide valuable information for guiding staffing decisions, there is a risk of managers over-interpreting the test results. This risk is increased by the tendency of some vendors to take an almost evangelical stance when talking about the accuracy of their SAs for gaining insight into people’s underlying capabilities. SAs that claim to provide a comprehensive view of every candidate, regardless of the job, can lead hiring managers to disengage from the hiring process and let the assessment tool do the thinking for them. The result is reduced effectiveness, along with diminished felt responsibility and accountability for making good hiring decisions.
  • Accepting the ideology. Several SAs on the market are based on broad, underlying theories of human behavior. Deploying these SAs may include educating hiring managers on these theories as “the way” to think about employee behavior. We have seen cases where the use of an SA has literally changed how hiring managers think and talk about people. Instead of using adjectives to describe employees, they talk about them in terms of test scores. Although it does feel a bit like “Brave New World,” this is not necessarily a bad thing, assuming the theory behind the SA is sound. However, many of the theories that form the foundation of SAs have not been widely tested or accepted by those doing high-quality employee selection research. By using these SAs, you may be asking your company to adopt a way of thinking about people that is more a reflection of one test developer’s beliefs than of well-tested principles of human behavior.
  • Consulting support. Many SA vendors are relatively small companies. As a result, SAs are often sold through independent resellers in different geographic regions. The advantage of local resellers is that they can often provide high levels of personalized support at a fraction of the consulting costs charged by larger assessment companies. The disadvantage is that local resellers are often “test salespeople” who lack the expertise and knowledge that consultants from larger assessment companies usually have. In several cases, we have found ourselves talking to SA resellers who know less about how their test actually works than we do.

Few vendors would actually label their product as an SA solution, even though we would argue that being an SA vendor is neither a good nor a bad thing. When you start looking for SA vendors, though, it becomes readily apparent which assessment companies fall into this category. In our “Rocket-Hire Buyer’s Guide to Web-Based Screening & Staffing Assessment Systems” (http://www.rocket-hire.com/buyersguide/index.html), over 10 of the vendors we included could be placed into the SA category. SAs can be effective if you are interested in getting something up and running quickly with relatively little upfront cost. They are likely to be particularly attractive to smaller organizations with a wide variety of jobs but relatively low hiring volumes. However, use of SAs should be approached with a healthy degree of skepticism. Many SA vendors have a tendency to oversell the value of their tools, and care should be taken to make sure the SA does what it is marketed as doing. Don’t assume that just because an SA produces a report listing a certain job title, it actually predicts performance for that kind of job. As with all assessment solutions, ask the vendor for empirical studies demonstrating the validity of their tools. Also ask them to describe the limitations and risks of their SA. And when listening to their answer, remember the saying, “Something designed to do everything often does nothing well.”
