Facial recognition tech: watchdog calls for code to regulate police use

The information commissioner has expressed concern over the lack of a formal legal framework for the use of facial recognition cameras by the police.

A barrister for the information commissioner, Elizabeth Denham, told a court that the current guidelines around automated facial recognition (AFR) technology were “ad hoc” and that a clear code was needed.

In a landmark case, Ed Bridges, an office worker from Cardiff, claims South Wales police violated his privacy and data protection rights by using AFR on him when he went to buy a sandwich during his lunch break and when he attended a peaceful anti-arms demonstration.

The technology maps faces in a crowd and then compares them with a watchlist of images, which can include suspects, missing people or persons of interest to the police. The cameras have been used to scan faces in large crowds in public places such as streets, shopping centres, football matches and music events including the Notting Hill carnival.
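The matching step can be illustrated with a short, hypothetical sketch. The Python below is not the system used by South Wales police; the embedding size, threshold, function names and example data are assumptions purely for illustration, showing in principle how a face detected in a crowd might be compared against a watchlist of reference images.

```python
import numpy as np

# Illustrative sketch only: a toy watchlist comparison using cosine similarity
# between face embeddings. The embedding size and threshold are assumed values,
# not parameters of any real AFR deployment.

EMBEDDING_DIM = 128      # assumed length of a face embedding vector
MATCH_THRESHOLD = 0.8    # assumed similarity cut-off for flagging a match


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_against_watchlist(probe, watchlist):
    """Compare a probe embedding against a dict of {identity: embedding}.

    Returns (identity, score) for the best match above the threshold,
    or (None, best_score) if nothing crosses it.
    """
    best_id, best_score = None, -1.0
    for identity, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_id, best_score = identity, score
    if best_score >= MATCH_THRESHOLD:
        return best_id, best_score
    return None, best_score


# Example: random vectors stand in for embeddings produced by a face model.
rng = np.random.default_rng(0)
watchlist = {
    "person_A": rng.normal(size=EMBEDDING_DIM),
    "person_B": rng.normal(size=EMBEDDING_DIM),
}
probe = watchlist["person_A"] + 0.05 * rng.normal(size=EMBEDDING_DIM)
print(match_against_watchlist(probe, watchlist))
```

In a sketch like this, anyone whose face produces a score below the threshold would not be flagged, which is the basis of the police argument that innocent passers-by are not affected; the legal dispute concerns whether capturing and processing their biometric data at all requires a clearer legal framework.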

On the final day of the hearing, Gerry Facenna QC, for the information commissioner, said there was a lack of “clarity and certainty” over how police watchlists were drawn up. Facenna said: “There ought to be a proper and clear code that has been consulted on.”

He added: “If you live in a police state where everyone is monitored all the time, no doubt crime will fall.” He said the state needed to grapple with a “balancing exercise”.

Facenna said a legal framework should address the nature of a watchlist and the circumstances in which the technology could be deployed. “Can you roll it out at every sports match? Does it need to be intelligence led? What can you do with the footage?”

Facenna said there were also questions around what training AFR operators should have, how to ensure the technology was not hacked and whether people could refuse to be scanned.

In his closing speech, Dan Squires QC, for Bridges, said AFR gave police “extraordinary power”. Squires said: “If you have someone’s biometric data and you have a series of CCTV cameras, you are able to log someone’s movements around the city or potentially around the country if AFR is rolled out.”

South Wales police argued during the hearing at the Cardiff civil justice and family centre that the cameras prevented crime, protected the public and did not breach the privacy of innocent people whose images were captured.

The case has been adjourned and two judges will give their ruling on a date yet to be fixed.

Speaking after the hearing, the deputy chief constable Richard Lewis said: “This process has allowed the court to scrutinise decisions made by South Wales police in relation to facial recognition technology. We welcomed the judicial review and now await the court’s ruling on the lawfulness and proportionality of our decision making and approach during the trial of the technology.

“The force has always been very cognisant of concerns surrounding privacy and understands that we, as the police, must be accountable and subject to the highest levels of scrutiny to ensure that we work within the law.”