Seemingly out of nowhere, computers have learned to read human emotion from our faces. We can expect that whatever weaknesses the technology has today, in accuracy or cost, will quickly erode as it advances. Marketing is interested in this technology as a way to learn more about customers; HR is interested in using it to screen candidates.
In a nutshell, the computer does just what you or I would do: it looks at a person’s expression and categorizes it as happy, sad, surprised, fearful, angry, delighted, or some other emotion. It can also track what a person is looking at. For example, if a customer shows delight when you unveil your new product, but the scan reveals they were actually looking at the donuts at the back of the room, then you’ll have to draw your own conclusions.
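The two signals described above can be sketched in a few lines of code. This is purely illustrative, not any vendor's actual system: the emotion labels, scores, and the `interpret_reaction` helper are invented for the example, and a real system would derive the scores from a trained vision model rather than take them as input.

```python
# Illustrative sketch: (1) categorize a facial expression from per-emotion
# scores, (2) combine that read with where the person's gaze landed.
# Labels, scores, and region names are hypothetical.

EMOTIONS = ["happy", "sad", "surprised", "fearful", "angry", "delighted"]

def classify_expression(scores):
    """Pick the highest-scoring emotion label (argmax over per-emotion scores)."""
    return max(zip(EMOTIONS, scores), key=lambda pair: pair[1])[0]

def interpret_reaction(scores, gaze_target, product_region="product"):
    """Pair the emotion read with the gaze target, as in the donut example."""
    emotion = classify_expression(scores)
    if emotion in ("happy", "delighted") and gaze_target != product_region:
        return f"{emotion}, but looking at the {gaze_target}"
    return f"{emotion}, looking at the {gaze_target}"

# A delighted customer whose eyes were on the donuts, not the product:
print(interpret_reaction([0.1, 0.05, 0.1, 0.05, 0.1, 0.6], "donuts"))
```

The point of the sketch is that the emotion read alone is ambiguous; it only becomes interpretable once paired with the attention signal.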
When would HR want automated measures of emotion and attention? In the future, HR might be interested in how people are reacting to a training program, how candidates are reacting to a job offer, or whether workers operating dangerous equipment look tired. At the moment, the “ready now” application is in recruiting.
The start-up Knockri assesses foundational competencies such as empathy, collaboration, and growth mindset based on how a candidate responds to a behavioral question. A candidate’s facial expressions, tone of voice, and an automated analysis of the content of their answer provide clues as to how highly the candidate scores on the targeted competency. This not only adds automation to the screening process; it can actually reduce human bias.
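Conceptually, a multi-signal screen like this fuses several channels into one number. The sketch below is an assumption, not Knockri's actual model: the channel names, weights, and 0–1 scale are invented to show the general idea of a weighted combination.

```python
# Hypothetical sketch of multi-signal scoring (not any vendor's real model):
# three channels -- facial expression, tone of voice, answer content --
# are combined into one competency score via a weighted average.
# The weights below are invented for illustration.

WEIGHTS = {"face": 0.2, "tone": 0.3, "content": 0.5}

def competency_score(signals):
    """Weighted average of per-channel scores, each on a 0-1 scale."""
    return sum(WEIGHTS[channel] * score for channel, score in signals.items())

score = competency_score({"face": 0.7, "tone": 0.8, "content": 0.9})
print(round(score, 2))  # 0.2*0.7 + 0.3*0.8 + 0.5*0.9 = 0.83
```

Weighting content most heavily reflects a common design choice in such systems: what a candidate says is usually treated as more diagnostic than how they look while saying it, which is also where the bias-reduction claim rests.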
There are many other vendors working in this space, such as the Emotion Research Lab based in Spain and the well-known recruiting tech firm HireVue.