Call for UK’s biggest police force to stop using facial recognition technology

Fourteen campaign groups have written an open letter to new Metropolitan Police Commissioner Sir Mark Rowley calling on him to end the use of facial recognition technology.

Organisations including Big Brother Watch, Liberty and Black Lives Matter UK contacted the new force leader on his first day in the job to demand an end to the “privacy-eroding, inaccurate and wasteful technology”.

They claim that 87% of the alerts generated by the force’s systems are misidentifications, among them a 14-year-old black schoolboy in school uniform and a French exchange student who had only been in the country for a few days.

Sir Mark Rowley begins his new job as Metropolitan Police Commissioner on Monday (Victoria Jones/PA)

The force calculates its false alert rate differently, as a proportion of all people estimated to have walked past the cameras, and says the figure is dramatically lower: between 0% and 0.08%.
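The two figures are not contradictory so much as computed against different denominators. A minimal sketch, using hypothetical deployment numbers chosen only to land near the reported percentages (they are not the Met's published data), illustrates how the same count of false alerts can be quoted as roughly 87% of alerts or as under 0.1% of people scanned:

```python
# Hypothetical figures for illustration only; not the Met's published data.
false_alerts = 7         # alerts that turned out to be misidentifications
true_alerts = 1          # alerts that correctly matched a watchlist entry
people_scanned = 10_000  # estimate of everyone who walked past the cameras

total_alerts = false_alerts + true_alerts

# Campaigners' measure: what share of the alerts were wrong?
alert_error_rate = false_alerts / total_alerts          # 7 / 8 = 87.5%

# The force's measure: what share of all passers-by triggered a false alert?
population_error_rate = false_alerts / people_scanned   # 7 / 10,000 = 0.07%

print(f"Misidentifications as a share of alerts:    {alert_error_rate:.1%}")
print(f"False alerts as a share of people scanned:  {population_error_rate:.2%}")
```

Both calculations use the same numerator; the dispute is over which denominator gives a fair picture of the technology's performance.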

The campaigners’ letter says: “During observations at deployments, Big Brother Watch has witnessed multiple false positive matches, which have led to innocent individuals being forced to prove their identity to police officers.

“If the use of this technology becomes more widespread, these incidents will become commonplace, resulting in further injustices and increased public mistrust of the Met.”

The groups maintain that the technology used by the Met is less accurate for women, and they say that it is being used more frequently in areas with a higher number of people from ethnic minorities.

Silkie Carlo, director of Big Brother Watch, said: “Public trust in the police has collapsed in the capital and is being further damaged by the Met’s repeated use of Orwellian facial recognition technology which is both useless and highly invasive.

“These Minority Report style cameras have done absolutely nothing to reduce high rates of violent crime but risk putting our police on a par with those in surveillance states like China and Russia. They have no place in a democracy.

“Millions of Londoners’ faces have been scanned by facial recognition cameras without their consent, and without many parliamentarians’ awareness.

“If the new Commissioner is serious about fighting crime effectively while addressing discrimination in policing, he cannot endorse the use of a technology with an 87% failure rate, that pointlessly drains police resources, and is well known to have issues with racist and sexist misidentifications, many of which we’ve witnessed.

“It is critical that Sir Mark Rowley urgently stops police use of live facial recognition surveillance and takes a smarter approach towards technology in policing.”

Martha Spurrier, director of Liberty, said: “We all have the right to go about our lives without being surveilled by the police.

“But the Metropolitan Police’s use of live facial recognition is violating our rights and threatening our liberties.”

The force said that its use of the technology has been focused on tackling violent and serious crime.

A spokeswoman said: “Live Facial Recognition (LFR) is a technology that has been helping the Met to locate dangerous individuals and those who pose a serious risk to our communities.

“The Met has primarily focused the use of LFR on the most serious crimes: locating people wanted for violent offences, including knife and gun crime, or those with outstanding warrants who are proving hard to find.

“Operational deployments of LFR technology have been in support of longer term violence reduction initiatives and have resulted in a number of arrests for serious offences including conspiracy to supply Class A drugs, assault on emergency service workers, possession with intent to supply Class A & B drugs, grievous bodily harm and being unlawfully at large having escaped from prison.

“False alert rates across our operational deployments are between 0% and 0.08%.”

She said work is ongoing to test whether the algorithm used by the force is less accurate for certain groups of people.