Computer scientists at the University at Buffalo (UB; Buffalo, NY, USA) are exploring whether machines can read visual cues that give away deceit.
In a study of 40 videotaped conversations, an automated system that analyzed eye movements correctly identified whether interview subjects were lying or telling the truth 82.5 percent of the time, a better accuracy rate than expert human interrogators typically achieve in lie-detection judgment experiments.
The eye tracking system employed a statistical technique to model how people moved their eyes in two distinct situations: during regular conversation, and while fielding a question designed to prompt a lie.
Subjects whose eye-movement patterns changed between the two scenarios were classified as lying, while those whose patterns stayed consistent were classified as truthful: when the critical question was asked, a strong deviation from a subject's normal eye-movement pattern suggested a lie.
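The decision rule described above can be illustrated with a toy sketch. This is not the authors' published statistical model; it simply assumes a single hypothetical eye-movement feature (e.g., saccade length per frame) and flags deception when the critical-question segment deviates strongly, in baseline standard deviations, from the subject's own conversational baseline:

```python
# Toy sketch (hypothetical feature and threshold, not the published method):
# flag a lie when eye-movement statistics during a critical question
# deviate strongly from the same speaker's conversational baseline.
import statistics

def deviation_score(baseline, critical):
    """How far the critical-question segment's mean sits from the
    baseline mean, measured in baseline standard deviations."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(critical) - mu) / sigma

def is_deceptive(baseline, critical, threshold=2.0):
    """Hypothetical decision rule: a large deviation suggests a lie."""
    return deviation_score(baseline, critical) > threshold

# Illustrative saccade lengths (arbitrary units) per video frame.
baseline  = [3.1, 2.8, 3.0, 3.3, 2.9, 3.2, 3.0, 3.1]
truthful  = [3.0, 3.2, 2.9, 3.1]   # consistent with baseline
deceptive = [5.9, 6.3, 6.1, 5.8]   # strong deviation from baseline

print(is_deceptive(baseline, truthful))   # False
print(is_deceptive(baseline, deceptive))  # True
```

Because each subject serves as their own baseline, a rule of this shape does not need a population-wide model of "deceptive" eye movement, which is consistent with the per-subject comparison the study describes.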
Ifeoma Nwogu, a research assistant professor at UB's Center for Unified Biometrics and Sensors (CUBS), helped develop the system. She noted that the technology is not foolproof: a very small percentage of the subjects studied were excellent liars, maintaining their usual eye-movement patterns even while lying.
The research was presented at the 2011 IEEE International Conference on Automatic Face and Gesture Recognition and published in its proceedings.
-- by Dave Wilson, Senior Editor, Vision Systems Design