Facial recognition software used by Scotland Yard wrong in four out of five cases, damning report warns
Four out of five people identified as suspects by controversial facial recognition technology used at Scotland Yard were innocent, a damning new report has found.
The report, commissioned by the Met Police as part of a trial, raises concerns that the use of live facial recognition (LFR) breaches human rights law.
Across the six trials that were evaluated, the technology made 42 matches, only eight of which the researchers can verify as correct.
The report's authors from the University of Essex, Professor Peter Fussey and Dr Daragh Murray, have called for all live trials of LFR to be halted until the concerns are addressed.
But Deputy Assistant Commissioner Duncan Ball said the Met was "extremely disappointed with the negative and unbalanced tone" of the research and insisted the pilot had been successful.
The Neoface system uses special cameras to scan the structure of faces in a crowd, creating a digital image that is compared against a watch list made up of pictures of people who have been taken into police custody.
If a match is found, officers at the scene where the cameras are set up are alerted.
According to the Met's website, the force has used the technology several times since 2016, including at Notting Hill Carnival that year and the following year.
Use of facial recognition is currently under judicial review in Wales following the technology's first ever legal challenge, brought against South Wales Police by Liberty.
Hannah Couchman, policy and campaigns officer for the civil rights group, renewed its call for a ban on the technology after the research was published.
"This damning assessment of the Met's trial of facial recognition technology only strengthens Liberty's call for an immediate end to all police use of this deeply invasive tech in public spaces," she said.
"It would display an astonishing and deeply troubling disregard for our rights if the Met now ignored this independent report and continued to deploy this dangerous and discriminatory technology.
"We will continue to fight against police use of facial recognition, which has no place on our streets."
The report's authors were granted access to the final six of 10 trials run by the Metropolitan Police between June 2018 and February 2019.
The research also highlighted concerns over the criteria for the watch list: its information was often out of date, leading police to stop people whose cases had already been resolved.
It also found numerous operational failures and raised a number of concerns regarding "consent, public legitimacy and trust".
Dr Murray said: "This report raises significant concerns regarding the human rights law compliance of the trials.
"The legal basis for the trials was unclear and is unlikely to satisfy the 'in accordance with the law' test established by human rights law.
"It does not appear that an effective effort was made to identify human rights harms or to establish the necessity of LFR.
"Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police's systems from the outset, and was not an integral part of the process."
Deputy Assistant Commissioner Ball said: "This is new technology, and we're testing it within a policing context.
"The Met's approach has developed throughout the pilot period, and the deployments have been successful in identifying wanted offenders.
"We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer."
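The Neoface algorithm itself is proprietary and not described in the report, but the general pattern the article outlines — reducing each face to a numeric representation, comparing it against a watch list, and alerting only when the comparison is close enough — can be sketched in a few lines. The sketch below is purely illustrative: the feature vectors, the `match_against_watchlist` function, and the distance threshold are all assumptions standing in for whatever the real system does. It also checks the report's headline arithmetic, that eight correct matches out of 42 alerts works out at roughly one in five.

```python
# Illustrative sketch only: assumes faces are reduced to numeric feature
# vectors ("embeddings") and that an alert is raised when the distance to a
# watch-list entry falls below a threshold. This is a common pattern in
# face-recognition pipelines, not a description of Neoface itself.
import math


def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def match_against_watchlist(face, watchlist, threshold=0.6):
    """Return the ID of the closest watch-list entry within the threshold,
    or None if nothing is close enough (i.e. no alert is raised)."""
    best_id, best_dist = None, float("inf")
    for person_id, reference in watchlist.items():
        d = euclidean(face, reference)
        if d < best_dist:
            best_id, best_dist = person_id, d
    return best_id if best_dist < threshold else None


# The report's headline figure: 8 verifiably correct matches out of 42 alerts.
precision = 8 / 42
print(f"precision: {precision:.0%}")  # about 19%, i.e. roughly 4 in 5 wrong
```

The threshold is the crux in practice: set it loose and the system floods officers with false alerts of exactly the kind the report documents; set it tight and it misses the people it was deployed to find.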
Body-worn cameras that could use facial recognition software to scan crowds for criminals are being developed for possible future use by police forces.
British firm Wireless CCTV is working on the technology, which would allow users to scan crowds up to 30ft from the wearer, with footage cross-referenced in real time against a database of suspects.
It could potentially be used by transport companies to spot known fare dodgers or supermarkets to recognise repeat shoplifters.
The wifi-connected cameras would be an upgrade to body-worn ones presented by the firm at the Infosecurity Europe technology show at ExCeL last week.
Their device, already on the market with a front-facing screen to show what is being recorded, can capture footage at 1080p HD with video and audio beamed through the cloud to a controller.
Their live-streaming function is a big advance from the clip-on cameras worn by the Met Police and London Underground staff, where footage needs to be downloaded after the event.
The firm believes the technology could help “gold” commanders to issue real-time instructions to officers on the ground during major incidents such as terror attacks.
David Short, head of WCCTV’s body-worn division, said the ability to stream live footage could also be useful for police during public order incidents, such as football hooliganism.
Some ambulance services have adopted the cameras, and three leading supermarket chains have also bought the wearable technology.
On the planned upgrade with facial recognition software, Mr Short said: “We’re in talks with some facial recognition software companies who have started carrying out trials on our live-streaming with a view that their algorithms will be able to process that [footage] and immediately give a ‘hit’ based on somebody who is held within the database.”
The planned addition of controversial facial recognition technology to the cameras has led to calls from civil liberties campaigners for tougher laws governing body-cam use.
Silkie Carlo, director of Big Brother Watch, said: “Many police forces have their own ‘guidance’ but there is not any actual firm national regulation.
“Adding facial recognition to body-worn cameras would be a leap towards a Minority Report-style society, quite unfitting for a democracy.”
Mr Short said there was a code of practice governing the body-cams’ use and claimed that the companies using the front-facing screen versions had reported a 70 per cent reduction in the number of violent incidents.