
UK court ruling finds police use of facial recognition technology a violation of human rights


The UK Court of Appeal has ruled that South Wales Police's use of facial recognition technology was "unlawful", in a landmark case being celebrated by human rights campaigners this week.

Judges ruled on Tuesday that South Wales Police had breached privacy rights, data protection laws and equality laws, following a legal challenge brought by the civil rights group Liberty.

South Wales Police adopted automated facial recognition technology on a trial basis in 2017, deploying a system called AFR Locate at several dozen major events such as football matches. The system scans faces against watchlists of known individuals to identify people who are wanted, have open warrants against them, or are in some other way "persons of interest".

The Court of Appeal judgment stated that South Wales Police had never sought to verify that the software being used "does not have an unacceptable bias on grounds of race or sex".

In 2019, Cardiff resident Ed Bridges filed a lawsuit against the police, alleging that having his face scanned in 2017 and 2018 was a violation of his legal rights; his case was backed by Liberty. Though Bridges lost the suit in 2019, that verdict was overturned by Tuesday's ruling.

"Too much discretion is currently left to individual police officers," the court ruled. "It is not clear who can be placed on the watchlist, nor is it clear that there are any criteria for determining where AFR can be deployed."

In 2018, South Wales Police released data admitting that approximately 2,300 of the nearly 2,500 matches the software made at an event in 2017 — roughly 92 percent — were false positives.

The High Court found that up to 500,000 people may have been scanned by South Wales Police as of May 2019.

The UK government's Surveillance Camera Commissioner's Office, tasked with encouraging compliance with a surveillance camera code of practice, responded: "This case has been very much focused on the specific deployments of AFR by South Wales Police, but the outcome is much more far-reaching with regard to the use of this technology by policing more generally." This includes London's Metropolitan Police, which has recently started using similar facial recognition technology.

“It is time for the government to recognise the serious dangers of this intrusive technology. Facial recognition is a threat to our freedom – it has no place on our streets,” said Megan Goulding, a lawyer at Liberty.

The victory comes against a similar backdrop in places such as the United States, where nationwide civil rights protest movements in support of Black communities and against police brutality continue to rock the country.

The ACLU in June filed a formal complaint against police in Detroit after they arrested an innocent man based on a false positive match from a facial ID system. Detroit's police chief admitted that the system misidentifies suspects 96 percent of the time.

South Wales Police said that it would not be appealing the court's decision, but it disputes calls to halt facial recognition technology altogether. The court's judgment does not "fundamentally undermine the use of facial recognition technology", but it does require that changes be made to the system and policies in place.

See more from The Guardian here, the BBC here and the Morning Star here.

We need your support

Sri Lanka is one of the most dangerous places in the world to be a journalist. Tamil journalists are particularly at threat, with at least 41 media workers known to have been killed by the Sri Lankan state or its paramilitaries during and after the armed conflict.

Despite the risks, our team on the ground remains committed to providing detailed and accurate reporting of developments in the Tamil homeland, across the island and around the world, as well as providing expert analysis and insight from the Tamil point of view.

We need your support in keeping our journalism going. Support our work today.