British Court Rules in Favor of Police in Unprecedented Facial Recognition Case
Deciding on one of the first cases ever to consider the legality of live facial recognition, a British court found that police use of the tool did not violate privacy laws or human rights.
- By Haley Samsel
- Sep 05, 2019
In a case with potentially wide-ranging ramifications for security companies in Europe and across the globe, a British court ruled against a man who challenged police use of automated, or live, facial recognition technology. The High Court said it was the first time any court in the world had considered the use of the technology.
Ed Bridges sued South Wales Police earlier this year, arguing that his human rights were violated when he was recorded without permission while Christmas shopping and later while attending a political rally. Bridges and the civil rights group Liberty, which sued on his behalf, said police use of the tool was also a breach of data protection and equality laws.
However, the High Court dismissed the suit on Tuesday, stating that South Wales Police met the requirements of Britain's Human Rights Act and that there were "sufficient legal controls" to prevent the department's abuse of the technology, including its policy of deleting data unless it pertained to a person identified from the watch list.
Bridges and his lawyers vowed to appeal the decision, stating that the judgment does not reflect the “very serious threat” that facial recognition poses to society. Through appeals, Bridges could take the case all the way to Britain’s Supreme Court.
“This sinister technology undermines our privacy and I will continue to fight against its unlawful use to ensure our rights are protected and we are free from disproportionate government surveillance,” Bridges said in a statement.
Police officials praised the decision, noting that the technology has allowed them to fight crime efficiently despite budget cuts. Alun Michael, the police and crime commissioner for South Wales, said that keeping communities safe has become “increasingly difficult” because government funds for police have been cut by a third in recent years.
“That has made it essential to use innovation and embrace technology like Facial Recognition if we are to have any hope of maintaining police numbers in our local communities across South Wales,” Michael said in a statement.
He added that the court appeared to recognize how his police department has prioritized balancing the protection of privacy rights with keeping the public safe. Matt Jukes, who oversees South Wales as chief constable of the police force, called the camera system and policies surrounding its use “innovative work.”
While he welcomes the decision, Jukes said he knows it is not the end of the “wider debate” around the use of facial recognition in public life.
“There is, and should be, a political and public debate about wider questions of privacy and security,” Jukes said. “It would be wrong in principle for the police to set the bounds of our use of new technology for ourselves.”
The Information Commissioner’s Office, which serves as the top privacy and data rights watchdog in Britain, said it is also reviewing the judgment carefully.
In a statement, a spokesperson said the office welcomes the court’s finding that live facial recognition systems require compliance with existing data protection laws due to their processing of sensitive personal data. The ICO recently finished its own investigation of police pilot programs of the technology and will release recommendations and guidance to police departments soon, the spokesperson said.
“This new and intrusive technology has the potential, if used without the right privacy safeguards, to undermine rather than enhance confidence in the police,” the ICO said.