Board Approves Detroit Police's Limited Use of Facial Recognition Software

Detroit police will continue to use the software to identify violent crime suspects but are prohibited from using it on live or recorded video.

Following months of controversy over the Detroit police department’s use of facial recognition software on residents, the Detroit Board of Police Commissioners approved a revised policy on Thursday that allows the department to continue using the technology with restrictions.

The policy, submitted by the police department to the advisory board, limits the use of facial recognition to still images of people suspected of violent crimes or home invasion. In addition, police cannot use the software on live or recorded video, or to determine a person’s immigration status.

Lisa Carter, the chairwoman of the board, said that the proposal incorporated at least 23 changes suggested by residents and critics of the department. Those changes include a prohibition on sharing photos in the facial recognition database with private companies and stronger penalties for officers found to have misused the technology.


“I believe the prohibitions contained in the revised directive address many of the concerns raised by the public,” Carter said, according to the Detroit Free Press.

"The revised directive is not a complete ban on the use of facial recognition,” she added. “The revised directive gives clear direction and lines of authority to the department as to when and how such technology can and cannot be used."

Critics say that any use of facial recognition technology by police is unacceptable, citing studies showing that the software is less accurate at identifying people of color and women. The public did not have the chance to address the board before the vote, further angering attendees who opposed the policy.

Willie Burton, a member of the Board of Police Commissioners who voted against the policy, said that Detroit should be the last city to implement facial recognition. About 80 percent of Detroit’s residents are black, a higher percentage than any other large city in the U.S., according to NBC News.

“I feel like the technology itself is techno-racism,” Burton said. “With this technology, everyone looks alike.”

Police Chief James Craig applauded the board for approving the policy and noted that the department has committed to not making arrests based solely on a facial recognition match. Officers must have other evidence connecting the suspect to the crime in addition to the match.

“Just like a few nights ago, we had a robbery. It would’ve been a whodunnit, and had it not been for the use of that technology, an armed robbery, we would not have identified that suspect,” Craig told the Free Press. “We cannot arrest solely on the identification from facial recognition. We can’t do it, and we’re not going to do it.”

He said that the main purpose of the software is to support victims and their families by identifying criminals as quickly as possible. But civil rights groups fear that the technology could eventually be used for public surveillance, which the revised policy prohibits, and could have a disparate impact on people of color, particularly black residents.

"At its core, facial recognition is a way to do mass profiling, which is the last thing a majority black city needs," Amanda Alexander, Detroit Justice Center executive director, said in a statement. "Rather than investing millions of dollars in facial recognition technology that instills fear and targets communities of color, we should be investing in services and resources so that people can prosper."

The department had been using the software for about two years before submitting a policy to the oversight board. In 2017, Detroit police added facial recognition to the city’s Project Green Light system, a public-private initiative that encouraged businesses to purchase video surveillance cameras in an effort to deter crime.

After the vote, Craig said that the debate focuses too little on “violent, predatory suspects” who shoot and kill people.

“We use the technology constitutionally, effectively, and it aids us in identifying that violent suspect,” Craig said, according to Michigan Radio.

Amanda Hill, a 28-year-old member of Green Light Black Futures, an organization that aims to end the Green Light system in the city, said that facial recognition does not make her community safer.

“Facial recognition technology does not keep us safe,” Hill told the Free Press. “Healthy communities require investment — not over-policing, not surveillance.”
