Majority of Facial Recognition Systems Are Less Accurate For People of Color, Federal Study Finds

Native Americans had the highest rates of false positives, while African-American women were most likely to be misidentified in a law enforcement database.

A new study, released by the National Institute of Standards and Technology (NIST) on Thursday, finds that a majority of commercial facial recognition systems are less accurate when identifying people of color, particularly African-Americans, Native Americans and Asians.

The federal agency tested 189 facial recognition algorithms from 99 developers, including systems from Microsoft and the Chinese artificial intelligence company Megvii. Systems from Amazon, Apple, Facebook and Google were not tested because none of those companies submitted algorithms to the study, according to The New York Times.

Algorithms developed in the U.S. showed high rates of false positives for African-American, Asian and Native American faces relative to images of white people, with Native Americans having the highest false-positive rates.

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” Patrick Grother, a NIST computer scientist and the report’s primary author, said in a statement. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”

Notably, the study found that algorithms developed in Asia did not demonstrate the same “dramatic” difference in false positives between Asian and Caucasian faces. Grother said that although the study does not explore the causes behind the false positives, the issue could be that American algorithms are using data sets with primarily Caucasian faces to train their facial recognition systems, making it difficult for those algorithms to accurately identify people of color.

“These results are an encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data,” Grother said.

On an FBI database of 1.6 million domestic mugshots, the report found higher rates of false positives for African-American women. The accuracy problem is of particular concern to civil liberties groups, who argue that facial recognition algorithms, still in their infancy, could lead law enforcement to false accusations, wrongful arrests and potential imprisonment.

“One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse,” Jay Stanley, a policy analyst at the American Civil Liberties Union, said in a statement. “Government agencies including the F.B.I., Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”

The study was published as towns and states across the country consider issuing moratoriums on government use of facial recognition. California will implement a three-year moratorium starting in 2020, and towns in Massachusetts have banned law enforcement use of the systems.

Meanwhile, U.S. Customs and Border Protection was pressured to drop plans to expand mandatory facial recognition scans to Americans entering and exiting the country. The practice is already standard for foreign travelers coming into and leaving the U.S.
