
Majority of Facial Recognition Systems Are Less Accurate For People of Color, Federal Study Finds

Native Americans had the highest rates of false positives, while African-American women were most likely to be misidentified in a law enforcement database.

A new study, released by the National Institute of Standards and Technology (NIST) on Thursday, finds that a majority of commercial facial recognition systems are less accurate when identifying people of color, particularly African-Americans, Native Americans and Asians.

The federal agency conducted tests on 189 facial recognition algorithms from 99 developers, including systems from Microsoft and the Chinese artificial intelligence company Megvii, among others. Systems from Amazon, Apple, Facebook and Google were not tested because none of those companies submitted algorithms to the study, according to The New York Times.

Algorithms developed in the U.S. showed higher rates of false positives for people of color than for images of white people, with Native Americans having the highest rates of false positives.
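The "false positives" NIST measured are false matches: cases where a system declares that images of two different people show the same person. Purely as an illustration, and not reflecting NIST's actual test protocol or data, the sketch below shows how a per-group false match rate could be computed at a fixed decision threshold; the scores, threshold and group labels are invented.

```python
from collections import defaultdict

# Hypothetical impostor comparisons (pairs of images of *different* people),
# each with a similarity score and a demographic group label. Invented data.
impostor_comparisons = [
    (0.12, "Group A"), (0.47, "Group A"), (0.08, "Group A"),
    (0.55, "Group B"), (0.61, "Group B"), (0.19, "Group B"),
]

THRESHOLD = 0.50  # scores at or above this are declared a "match"

def false_match_rates(comparisons, threshold):
    """Return the fraction of impostor pairs wrongly declared a match, per group."""
    totals = defaultdict(int)
    false_matches = defaultdict(int)
    for score, group in comparisons:
        totals[group] += 1
        if score >= threshold:  # different people, but the algorithm says "same"
            false_matches[group] += 1
    return {group: false_matches[group] / totals[group] for group in totals}

print(false_match_rates(impostor_comparisons, THRESHOLD))
```

A large gap in these rates between groups at the same threshold is the kind of "demographic differential" the report describes.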

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” Patrick Grother, a NIST computer scientist and the report’s primary author, said in a statement. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”

Notably, the study found that algorithms developed in Asia did not demonstrate the same “dramatic” difference in false positives between Asian and Caucasian faces. Grother said that although the study does not explore the causes behind the false positives, the issue could be that American algorithms are using data sets with primarily Caucasian faces to train their facial recognition systems, making it difficult for those algorithms to accurately identify people of color.

“These results are an encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data,” Grother said.
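One way a developer could check for the imbalance Grother describes is simply to tally the demographic composition of a training set. The snippet below is a hypothetical illustration, not anything drawn from the NIST report; the manifest entries and labels are invented.

```python
from collections import Counter

# Hypothetical training manifest of (image_path, demographic_label) entries.
training_manifest = [
    ("img_0001.jpg", "Caucasian"),
    ("img_0002.jpg", "Caucasian"),
    ("img_0003.jpg", "African-American"),
    ("img_0004.jpg", "Caucasian"),
    ("img_0005.jpg", "Asian"),
]

counts = Counter(label for _, label in training_manifest)
total = sum(counts.values())
for label, n in counts.most_common():
    print(f"{label}: {n / total:.0%} of training images")
```

A breakdown dominated by one group is consistent with the training-data explanation offered above, though the report itself does not test that hypothesis.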

On an FBI database of 1.6 million domestic mugshots, the report found higher rates of false positives for African-American women. These accuracy issues particularly concern civil liberties groups, who argue that facial recognition algorithms, still in their infancy, could lead to false accusations, arrests and potential imprisonment.

“One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse,” Jay Stanley, a policy analyst at the American Civil Liberties Union, said in a statement. “Government agencies including the F.B.I., Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”

The study was published as towns and states across the country consider issuing moratoriums on government use of facial recognition. California will implement a three-year moratorium starting in 2020, and towns in Massachusetts have banned law enforcement use of the systems.

Meanwhile, U.S. Customs and Border Protection was pressured to drop plans to expand mandatory facial recognition scans to Americans entering and exiting the country. The practice is already standard for foreign travelers coming into and leaving the U.S.

About the Author

Haley Samsel is an Associate Content Editor for the Infrastructure Solutions Group at 1105 Media.
