Australian federal police tested controversial facial recognition search engine, FOI documents reveal

Photograph: Getty Images

The Australian federal police have tested a controversial facial recognition search engine, possibly for operational use.

Documents obtained by Guardian Australia under freedom of information laws show hundreds of connections between AFP devices and the website PimEyes between 1 January and 4 August this year.

There were also connections to a similar site, FaceCheck.ID, though multiple network connection requests can occur during a single website visit.


Both platforms allow users to upload a photo of someone and then find similar images across the internet. PimEyes’ chief executive, Giorgi Gobronidze, recently told the New York Times the site has a database of nearly three billion faces and enables about 118,000 searches per day.

The AFP originally told Guardian Australia that “a small number of members” had accessed the sites as part of professional development training and to assess their possible use in “the law enforcement or criminal environment”.

It said neither platform had been endorsed for use within the AFP.

Under questioning from Greens senator David Shoebridge in Senate estimates on Monday night, the AFP said it is now aware the platforms may have been tested for operational purposes on about 10 occasions, which are now being investigated: nine searches on PimEyes and one on FaceCheck.ID.

The AFP is “reviewing all use of the platform, and making sure that does not occur again”, the AFP’s chief operating officer, Charlotte Tressler, told estimates.

It appears AFP leadership was not aware of these uses until Guardian Australia’s FOI request.

“Pimeyes.com is a particularly dangerous facial recognition tool … and has been repeatedly criticised for enabling unlawful surveillance and stalking,” Shoebridge said, noting the tool is based in the former Soviet Republic of Georgia.

“This keeps happening with the AFP, whether it’s Clearview, PimEyes or FaceCheck.”

In 2021, the Australian privacy commissioner Angelene Falk found the AFP had failed to “comply with its privacy obligations” in using the Clearview AI facial recognition platform – a similar face search engine – and had not completed a privacy impact assessment before trialling the tool.


It was revealed in 2020 that staff had used free trials of Clearview, provided by the company. Documents released at the time showed staff from the Australian Centre to Counter Child Exploitation conducted searches for five “persons of interest”, among other uses.

While PimEyes, which is subscription-based, stipulates its technology should be used only to search for one’s own face, there is no mechanism to enforce this. Trials of the tool have found images in disconcerting locations, including news articles, wedding photography pages and reviews.

PimEyes claims to go beyond typical reverse image search by using facial recognition technology. “In the results we display not only similar photos to the one you have uploaded to the search bar but also pictures in which you appear on a different background, with other people, or even with a different haircut,” the site states.

The PimEyes site was also accessed between 1 January and 26 July 2023 by Australian Border Force devices, according to documents obtained by Guardian Australia, but the home affairs department told estimates it believed the visits, by around four staff, were not made in any official capacity.

Radi Kovacevic, the acting group manager for technology and major capability at home affairs, said it did not appear anyone’s image had been uploaded to PimEyes. Access to the site is now prohibited.