Automated Computer System Identifies Liars 82.5 Percent of the Time

Inspired by the work of psychologists who study the human face for clues that someone is telling a high-stakes lie, University at Buffalo computer scientists are exploring whether machines can also read the visual cues that give away deceit.

Results so far are promising: In a study of 40 videotaped conversations, an automated system that analyzed eye movements correctly identified whether interview subjects were lying or telling the truth 82.5 percent of the time.

That’s a better accuracy rate than expert human interrogators typically achieve in lie-detection judgment experiments, said Ifeoma Nwogu, a research assistant professor at UB’s Center for Unified Biometrics and Sensors (CUBS) who helped develop the system. In published results, even experienced interrogators average closer to 65 percent, Nwogu said.

“What we wanted to understand was whether there are signal changes emitted by people when they are lying, and can machines detect them? The answer was yes, and yes,” said Nwogu.

The research was peer-reviewed, published and presented as part of the 2011 IEEE Conference on Automatic Face and Gesture Recognition.

Nwogu’s colleagues on the study included CUBS scientists Nisha Bhaskaran and Venu Govindaraju, and UB communication professor Mark G. Frank, a behavioral scientist whose primary area of research has been facial expressions and deception.

In the past, Frank’s attempts to automate deceit detection have used systems that analyze changes in body heat or examine a slew of involuntary facial expressions.

The automated UB system tracked a different trait -- eye movement. The system employed a statistical technique to model how people moved their eyes in two distinct situations: during regular conversation, and while fielding a question designed to prompt a lie.

People whose pattern of eye movements changed between the first and second scenario were assumed to be lying, while those who maintained consistent eye movement were assumed to be telling the truth. In other words, when the critical question was asked, a strong deviation from normal eye movement patterns suggested a lie.
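The decision rule described above — flag a subject when eye-movement statistics during the critical question deviate sharply from that subject's own baseline — can be sketched roughly as follows. This is a minimal illustration, not the study's actual statistical model; the feature values, the z-score test, and the threshold are all assumptions:

```python
import statistics

def classify(baseline_feats, critical_feats, threshold=2.0):
    """Flag a likely lie when eye-movement features measured during the
    critical question deviate strongly from the subject's own baseline.

    baseline_feats, critical_feats: per-window feature values (e.g. blinks
    per minute). The z-score test and threshold are illustrative choices.
    """
    mean = statistics.mean(baseline_feats)
    # Sample stdev of the baseline gives the scale of "normal" variation;
    # guard against a zero stdev when the baseline is perfectly constant.
    stdev = statistics.stdev(baseline_feats) or 1e-9
    z = abs(statistics.mean(critical_feats) - mean) / stdev
    return "lying" if z > threshold else "truthful"
```

Because each subject is compared only with their own baseline, the rule side-steps individual differences in resting blink rate or gaze behavior — the same property the article credits to the UB system.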

Previous experiments in which human judges coded facial movements found documentable differences in eye contact at times when subjects told a high-stakes lie.

Nwogu and her fellow computer scientists created an automated system that could verify, and improve upon, the information human coders use to classify liars and truth tellers. The next step will be to expand the number of subjects studied and develop automated systems that analyze body language in addition to eye contact.

Nwogu said that while the sample size was small, the findings are exciting.

They suggest that computers may be able to learn enough about a person’s behavior in a short time to assist with a task that challenges even experienced interrogators. The videos used in the study showed people with various skin colors, head poses, lighting and obstructions such as glasses.

This does not mean machines are ready to replace human questioners, however -- only that computers can be a helpful tool in identifying liars, Nwogu said.

She noted that the technology is not foolproof: A very small percentage of subjects studied were excellent liars, maintaining their usual eye movement patterns as they lied. Also, the nature of an interrogation and interrogators’ expertise can influence the effectiveness of the lie-detection method.

The videos used in the study were culled from a set of 132 that Frank recorded during a previous experiment.

In Frank’s original study, 132 interview subjects were given the option to “steal” a check made out to a political party or cause they strongly opposed.

Subjects who took the check but lied about it successfully to a retired law enforcement interrogator received rewards for themselves and a group they supported; subjects caught lying incurred a penalty: they and their group received no money, while the group they despised did. Subjects who did not steal the check faced a similar penalty if judged to be lying, but received a smaller sum for being judged truthful.

The interrogators opened each interview by posing basic, everyday questions. Following this mundane conversation, the interrogators asked about the check. At this critical point, the monetary rewards and penalties increased the stakes of lying, creating an incentive to deceive and do it well.

In their study on automated deceit detection, Nwogu and her colleagues selected 40 videotaped interrogations.

They used the mundane beginning of each to establish what normal, baseline eye movement looked like for each subject, focusing on the rate of blinking and the frequency with which people shifted their direction of gaze.

The scientists then used their automated system to compare each subject’s baseline eye movements with eye movements during the critical section of each interrogation -- the point at which interrogators stopped asking everyday questions and began inquiring about the check.

If the machine detected unusual variations from baseline eye movements at this time, the researchers predicted the subject was lying.
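The two baseline features the article names — blink rate and the frequency of gaze-direction shifts — could be extracted from a per-frame eye-tracking log along these lines. The per-frame label format here is an assumption for illustration; the article does not specify the study's actual feature pipeline:

```python
def eye_features(frames, fps=30.0):
    """Compute blinks per second and gaze shifts per second from a
    per-frame sequence of gaze labels.

    frames: sequence of labels such as "left", "right", "center", or
    "blink", one per video frame -- an assumed input format.
    """
    blinks = 0
    shifts = 0
    prev = None
    for label in frames:
        if label == "blink":
            # Count a blink only at its onset, not once per closed frame.
            if prev != "blink":
                blinks += 1
        elif prev is not None and prev != "blink" and label != prev:
            # A change in gaze direction between consecutive open-eye
            # frames counts as one gaze shift.
            shifts += 1
        prev = label
    seconds = len(frames) / fps
    return {"blink_rate": blinks / seconds, "shift_rate": shifts / seconds}
```

Running this once over a subject's mundane opening conversation yields the baseline rates; running it again over the critical segment yields the values compared against that baseline.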
