Automated Computer System Identifies Liars 82.5 Percent of the Time

Inspired by the work of psychologists who study the human face for clues that someone is telling a high-stakes lie, University at Buffalo computer scientists are exploring whether machines can also read the visual cues that give away deceit.

Results so far are promising: In a study of 40 videotaped conversations, an automated system that analyzed eye movements correctly identified whether interview subjects were lying or telling the truth 82.5 percent of the time.

That’s a better accuracy rate than expert human interrogators typically achieve in lie-detection judgment experiments, said Ifeoma Nwogu, a research assistant professor at UB’s Center for Unified Biometrics and Sensors (CUBS) who helped develop the system. In published results, even experienced interrogators average closer to 65 percent, Nwogu said.

“What we wanted to understand was whether there are signal changes emitted by people when they are lying, and can machines detect them? The answer was yes, and yes,” said Nwogu.

The research was peer-reviewed, published and presented as part of the 2011 IEEE Conference on Automatic Face and Gesture Recognition.

Nwogu’s colleagues on the study included CUBS scientists Nisha Bhaskaran and Venu Govindaraju, and UB communication professor Mark G. Frank, a behavioral scientist whose primary area of research has been facial expressions and deception.

In the past, Frank’s attempts to automate deceit detection have used systems that analyze changes in body heat or examine a slew of involuntary facial expressions.

The automated UB system tracked a different trait -- eye movement. The system employed a statistical technique to model how people moved their eyes in two distinct situations: during regular conversation, and while fielding a question designed to prompt a lie.

People whose pattern of eye movements changed between the first and second scenario were assumed to be lying, while those who maintained consistent eye movement were assumed to be telling the truth. In other words, when the critical question was asked, a strong deviation from normal eye movement patterns suggested a lie.
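The paper's actual model is not reproduced here, but the change-detection principle can be sketched in a few lines: fit a simple statistical model (a one-dimensional Gaussian, an assumption made purely for illustration) to an eye-movement feature measured during ordinary conversation, then score how surprising the same feature is during the critical question. A large surprise score corresponds to the "strong deviation" the system treats as a sign of lying.

```python
import math

def gaussian_nll(x, mean, std):
    """Negative log-likelihood of x under a 1-D Gaussian: higher = more surprising."""
    var = std ** 2
    return 0.5 * math.log(2 * math.pi * var) + (x - mean) ** 2 / (2 * var)

def deviation_score(baseline_samples, critical_value):
    """Score how unusual critical_value is relative to a subject's own baseline."""
    n = len(baseline_samples)
    mean = sum(baseline_samples) / n
    std = (sum((s - mean) ** 2 for s in baseline_samples) / n) ** 0.5 or 1e-6
    return gaussian_nll(critical_value, mean, std)

# Toy numbers: gaze shifts per minute during small talk vs. the critical question.
baseline = [4.1, 3.8, 4.4, 4.0, 3.9]
consistent = deviation_score(baseline, 4.2)  # behavior similar to baseline
deviant = deviation_score(baseline, 9.0)     # strong deviation from baseline
print(deviant > consistent)  # True: the deviant response is far more surprising
```

The key design point, reflected in the study, is that each subject is compared only against their own baseline, so individual differences in normal blink or gaze behavior do not trigger false alarms.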

Previous experiments in which human judges coded facial movements found documentable differences in eye contact at times when subjects told a high-stakes lie.

Nwogu and her fellow computer scientists created an automated system that could verify, and improve upon, the cues human coders use to classify liars and truth tellers. The next step will be to expand the number of subjects studied and to develop automated systems that analyze body language in addition to eye contact.

Nwogu said that while the sample size was small, the findings are exciting.

They suggest that computers may be able to learn enough about a person’s behavior in a short time to assist with a task that challenges even experienced interrogators. The videos used in the study showed people with various skin colors, head poses, lighting and obstructions such as glasses.

This does not mean machines are ready to replace human questioners, however -- only that computers can be a helpful tool in identifying liars, Nwogu said.

She noted that the technology is not foolproof: A very small percentage of subjects studied were excellent liars, maintaining their usual eye movement patterns as they lied. Also, the nature of an interrogation and interrogators’ expertise can influence the effectiveness of the lie-detection method.

The videos used in the study were culled from a set of 132 that Frank recorded during a previous experiment.

In Frank’s original study, 132 interview subjects were given the option to “steal” a check made out to a political party or cause they strongly opposed.

Subjects who took the check but lied about it successfully to a retired law enforcement interrogator received rewards for themselves and a group they supported; subjects caught lying incurred a penalty: they and their group received no money, but the group they despised did. Subjects who did not steal the check faced similar punishment if judged lying, but received a smaller sum for being judged truthful.

The interrogators opened each interview by posing basic, everyday questions. Following this mundane conversation, the interrogators asked about the check. At this critical point, the monetary rewards and penalties increased the stakes of lying, creating an incentive to deceive and do it well.

In their study on automated deceit detection, Nwogu and her colleagues selected 40 videotaped interrogations.

They used the mundane beginning of each to establish what normal, baseline eye movement looked like for each subject, focusing on the rate of blinking and the frequency with which people shifted their direction of gaze.

The scientists then used their automated system to compare each subject’s baseline eye movements with eye movements during the critical section of each interrogation -- the point at which interrogators stopped asking everyday questions and began inquiring about the check.

If the machine detected unusual variations from baseline eye movements at this time, the researchers predicted the subject was lying.
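As a toy end-to-end sketch of this pipeline: the study's actual features, tracker, and decision rule are not published, so the per-frame input format, the 50 percent tolerance, and the helper names below are all illustrative assumptions. The idea is to turn frame-level eye annotations into blink and gaze-shift rates, then flag a subject whose critical-segment rates deviate strongly from their own baseline rates.

```python
def rates(frames, fps=30.0):
    """Blink rate and gaze-shift rate (per second) from per-frame annotations.

    frames: list of (blinking: bool, gaze_direction: str) tuples, as might be
    produced by an eye tracker (hypothetical input format).
    """
    blinks = shifts = 0
    prev_blink, prev_gaze = False, None
    for blinking, gaze in frames:
        if blinking and not prev_blink:
            blinks += 1                      # count blink onsets, not frames
        if prev_gaze is not None and gaze != prev_gaze:
            shifts += 1                      # count changes of gaze direction
        prev_blink, prev_gaze = blinking, gaze
    seconds = len(frames) / fps
    return blinks / seconds, shifts / seconds

def predict_lying(baseline_frames, critical_frames, tolerance=0.5):
    """Predict a lie if either rate changes by more than `tolerance` (relative)."""
    b_blink, b_shift = rates(baseline_frames)
    c_blink, c_shift = rates(critical_frames)

    def rel_change(b, c):
        return abs(c - b) / max(b, 1e-6)

    return (rel_change(b_blink, c_blink) > tolerance
            or rel_change(b_shift, c_shift) > tolerance)
```

A subject whose blink and gaze-shift rates stay near their small-talk baseline during the check questions would be classified as truthful; a sharp rise or drop in either rate would be classified as a lie, mirroring the study's deviation-from-baseline logic.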
