Anti-knife activist brings legal challenge to Met after he's detained in facial recognition failure

Shaun Thompson -Credit: (Image: Big Brother Watch)

An anti-knife crime activist has launched a legal challenge against the Metropolitan Police after he was threatened with arrest due to a mistake by live facial recognition technology. Shaun Thompson, who volunteers with the Street Fathers youth outreach group, described the system as 'stop and search on steroids' following his 20-minute detention at London Bridge station in February.

The 38-year-old, who has applied for a judicial review of the use of the technology, was returning from a volunteer shift in Croydon, where the outreach group patrols to take knives off the street, when he was wrongly flagged as a suspect on the Met's facial recognition database. Mr Thompson says he was given a nudge on the shoulder after walking past a white van, then held by police for almost half an hour.

Mr Thompson says he was only let go after handing over a copy of his passport. Reflecting on the incident, he called the technology 'flawed' and said it felt like he was being treated as 'guilty until proven innocent'. BBC Newsnight reports the mistake may have been due to a family resemblance, but the Metropolitan Police declined to comment.


"They were telling me I was a wanted man, trying to get my fingerprints and trying to scare me with arrest, even though I knew and they knew the computer had got it wrong. Instead of working to get knives off the streets like I do, they were wasting their time with technology when they knew it had made a mistake," said Mr Thompson.

"I was angry that I had been stopped by a flawed technology that misidentified me as someone else and was treated as though I was guilty. I'm bringing this legal challenge because I don't want this to happen to other people. Facial recognition is like stop and search on steroids and doesn't make communities any safer. It needs to be stopped."

The technology has been used in London since 2016, and the Metropolitan Police was due to increase deployments by 600 per cent from 2023 to 2024, according to civil liberties group Big Brother Watch. The Met has previously defended the technology by highlighting its ability to identify violent and predatory offenders. It says an alert simply flags a person officers may want to speak to, and does not automatically lead to an arrest.

The Met has also reassured Londoners who are not on a watchlist that their biometric image will be created and deleted within one second. The Met's latest deployment, in Hounslow on May 21, resulted in six arrests, including that of a registered sex offender.

How accurate is the technology?

Police used the technology at the King's coronation in May last year -Credit:Andy Soloman/UCG/Universal Images Group via Getty Images

In documents released in May 2023, the Met reported a false positive identification rate of 0.017 per cent, which equates to one in 6,000 people misidentified by the technology when run with a watchlist (a database of suspect photos) of 10,000 people. With a smaller watchlist of 1,000 people, the false positive rate was lower still, at one in 60,000 people.

But, using a different methodology to the Met, Big Brother Watch calculated a much higher rate by setting the number of false positive identifications against the total number of positive identifications, true and false combined.

Based on data available in May last year, the campaign group arrived at a figure of around 85 per cent, based on 150 false matches out of 175 total matches since 2016. That data included 95 false positives during a single deployment at Notting Hill Carnival in 2017, which appears to be an outlier compared with other deployments before and since.

The Met told BBC Newsnight that around one in every 33,000 people who walk past its cameras is misidentified. But the broadcaster reports the error rate is much higher among those actually flagged: so far this year, one in 40 alerts has been a false positive.
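The competing figures above differ mainly in their denominators: the Met divides false matches by everyone scanned, while Big Brother Watch divides them by total matches. As a rough, illustrative sketch of that arithmetic, using only the numbers quoted in this article:

```python
# Arithmetic behind the competing accuracy figures quoted above.
# All inputs are the numbers reported in this article; the
# calculations are illustrative only.

# Met methodology: false positives as a share of everyone scanned.
met_rate = 0.017 / 100                     # 0.017 per cent
people_per_false_match = 1 / met_rate
print(f"Met: roughly 1 in {people_per_false_match:,.0f} people scanned")

# Big Brother Watch methodology: false positives as a share of all matches.
false_matches = 150
total_matches = 175
bbw_rate = false_matches / total_matches
print(f"BBW: {bbw_rate:.1%} of matches were false")   # ~85.7 per cent
```

The same underlying system can therefore be described as wrong once in several thousand scans, or wrong in the large majority of its matches, depending on which denominator is chosen.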

A live facial recognition van being used by Metropolitan Police officers in London -Credit:Big Brother Watch

When questioned about the varying interpretations of accuracy levels by former Labour MP for Brent, Dawn Butler, during a parliamentary committee hearing in May 2023, the Met's Director of Intelligence Services, Lindsey Chiswick, said she did not recognise the much higher figures, adding that there had been multiple positive alerts and zero false alerts during the Met's last six deployments.

At the same hearing, into the governance of AI, Dr Tony Mansfield, Principal Research Scientist at the National Physical Laboratory, said his team had found the system to be 'very accurate', but that if run at low thresholds, which are not used by the Met Police, 'the system starts showing a bias against Black males and females combined'. Dr Mansfield added that the system could be improved.

'Inaccurate and dangerously out of control'

Live facial recognition technology is more likely to make errors when used incorrectly -Credit:PA

Silkie Carlo, director of Big Brother Watch, said the legal challenge was a 'landmark step towards protecting the public's privacy' against the 'Orwellian' software. Ms Carlo went on to claim facial recognition in the UK is 'inaccurate and dangerously out of control' and accused the Government of spying on the population in a 'cavalier and chilling way'.

"Shaun's story is proof that facial recognition surveillance poses a real threat to the public's rights and should be urgently banned. It's vital we roll back facial recognition and defend people's rights with this groundbreaking legal action," she added.

Big Brother Watch is crowdfunding for the legal action against the Metropolitan Police and has so far raised nearly £5,000. A Met spokesperson said they are aware of the application for judicial review, but added 'it would be inappropriate to comment further'.

Big Brother Watch has published its methodology for calculating false positives on its website.
