Majority of Facial Recognition Systems Are Less Accurate For People of Color, Federal Study Finds

Native Americans had the highest rates of false positives, while African-American women were most likely to be misidentified in a law enforcement database.

A new study, released by the National Institute of Standards and Technology (NIST) on Thursday, finds that a majority of commercial facial recognition systems are less accurate when identifying people of color, particularly African-Americans, Native Americans and Asians.

The federal agency tested 189 facial recognition algorithms from 99 developers, including systems from Microsoft and the Chinese artificial intelligence company Megvii, among others. Systems from Amazon, Apple, Facebook and Google were not tested because none of those companies submitted algorithms to the study, according to The New York Times.

Algorithms developed in the U.S. showed higher rates of false positives for Asian, African-American and Native American faces than for images of white people, with Native Americans having the highest false positive rates.

“While it is usually incorrect to make statements across algorithms, we found empirical evidence for the existence of demographic differentials in the majority of the face recognition algorithms we studied,” Patrick Grother, a NIST computer scientist and the report’s primary author, said in a statement. “While we do not explore what might cause these differentials, this data will be valuable to policymakers, developers and end users in thinking about the limitations and appropriate use of these algorithms.”
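
To make the idea of a false positive rate concrete, here is a minimal, hypothetical Python sketch of how a false match rate could be tallied per demographic group. It is an illustration only, not NIST's actual methodology; the function names, threshold, group labels and similarity scores are invented for the example.

from collections import defaultdict

def false_match_rate(impostor_scores, threshold):
    # Fraction of impostor comparisons (photos of two different people)
    # whose similarity score clears the match threshold -- the false
    # positive rate at that threshold.
    if not impostor_scores:
        return 0.0
    return sum(1 for s in impostor_scores if s >= threshold) / len(impostor_scores)

def fmr_by_group(comparisons, threshold):
    # Bucket impostor scores by demographic label, then compute the
    # false match rate per group so differentials can be compared directly.
    scores = defaultdict(list)
    for group, score in comparisons:
        scores[group].append(score)
    return {g: false_match_rate(s, threshold) for g, s in scores.items()}

# Hypothetical impostor comparisons: (demographic group, similarity score).
comparisons = [
    ("group_a", 0.41), ("group_a", 0.78), ("group_a", 0.30),
    ("group_b", 0.83), ("group_b", 0.86), ("group_b", 0.55),
]
print(fmr_by_group(comparisons, threshold=0.80))
# A higher rate for one group at the same threshold is a demographic differential.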

Notably, the study found that algorithms developed in Asia did not demonstrate the same “dramatic” difference in false positives between Asian and Caucasian faces. Grother said that although the study does not explore the causes of the false positives, one possibility is that American developers train their facial recognition systems on data sets composed primarily of Caucasian faces, making it harder for those algorithms to accurately identify people of color.

“These results are an encouraging sign that more diverse training data may produce more equitable outcomes, should it be possible for developers to use such data,” Grother said.

In an FBI database of 1.6 million domestic mugshots, the report found higher rates of false positives for African-American women. That accuracy gap in law enforcement settings particularly concerns civil liberties groups, who argue that facial recognition algorithms, still in their infancy, could lead to false accusations, arrests and potential imprisonment.

“One false match can lead to missed flights, lengthy interrogations, watch list placements, tense police encounters, false arrests or worse,” Jay Stanley, a policy analyst at the American Civil Liberties Union, said in a statement. “Government agencies including the F.B.I., Customs and Border Protection and local law enforcement must immediately halt the deployment of this dystopian technology.”

The study was published as towns and states across the country consider imposing moratoriums on government use of facial recognition. California will implement a three-year moratorium starting in 2020, and several Massachusetts towns have banned law enforcement use of the systems.

Meanwhile, U.S. Customs and Border Protection was pressured to drop plans to expand mandatory facial recognition scans to Americans entering and exiting the country. The practice is already standard for foreign travelers coming into and leaving the U.S.

About the Author

Haley Samsel is an Associate Content Editor for the Infrastructure Solutions Group at 1105 Media.
