The study comprised two experiments with a total of 205 human observers, who were asked to judge the veracity of pain expressions in video clips. Some of the people in the clips were undergoing the cold pressor test, in which a hand is immersed in ice water to measure pain tolerance; others were faking their pained expressions.
Mark G Frank, PhD, professor of communication at the University at Buffalo, said human observers could not discriminate real from faked expressions of pain any better than chance; even after training, they were accurate only 55% of the time. The computer system, by contrast, was accurate 85% of the time.
The researchers employed the Computer Expression Recognition Toolbox (CERT), an end-to-end system for fully automated facial-expression recognition that operates in real time, to compare the accuracy of machine vision with that of human vision. CERT was developed by Marian Bartlett, PhD, research professor, Institute for Neural Computation, University of California, San Diego; Gwen C Littlewort, PhD, co-director of the institute's Machine Perception Laboratory; Frank; and others.
Bartlett noted that the computer system "managed to detect distinctive, dynamic features of facial expressions that people missed. Human observers just aren't very good at telling real from faked expressions of pain".
They found that machine vision could automatically distinguish deceptive from genuine facial signals by extracting spatiotemporal information from facial expressions that humans either cannot or do not extract.
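The intuition behind that finding can be illustrated with a deliberately simplified sketch. This is not the authors' code or CERT's actual pipeline; the data, the dynamics statistic, and the threshold are all invented for illustration. The idea it demonstrates is that the *timing* of facial movements (smooth and deliberate when faked, irregular and reflex-driven when genuine) can carry signal that a static snapshot does not:

```python
# Hypothetical sketch only: classify a per-frame "expression intensity"
# trace as faked or genuine from its frame-to-frame dynamics.
from statistics import stdev

def dynamics_score(trace):
    # Variability of frame-to-frame changes; a perfectly smooth,
    # deliberate trace scores 0, irregular motion scores higher.
    diffs = [abs(b - a) for a, b in zip(trace, trace[1:])]
    return stdev(diffs)

def classify(trace, threshold=0.05):
    # Threshold is invented for this toy example, not from the study.
    return "genuine" if dynamics_score(trace) > threshold else "faked"

# Synthetic traces: a smooth triangle wave stands in for a deliberately
# faked expression; a jittery pseudo-random trace for a genuine one.
faked = [abs((i % 20) - 10) / 10 for i in range(60)]
genuine = [0.5 + 0.4 * ((i * 4) % 11 - 5) / 5 for i in range(60)]
```

Here `classify(faked)` returns `"faked"` and `classify(genuine)` returns `"genuine"`, because the smooth trace has uniform frame-to-frame steps while the jittery one does not. A real system like CERT works on far richer input (facial action unit intensities estimated from video), but the principle of exploiting expression dynamics is the same.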
The study has been published in the journal Current Biology.