Apple’s plan to scan for child abuse images ‘tears at heart of privacy’

Technology like that proposed by Apple to search iPhones for child sexual abuse images would open the door to mass surveillance and be vulnerable to exploitation, world-leading security and cryptography experts have said.

Client-side scanning (CSS) gives access to data on users’ devices, including stored data, which “brings surveillance to a new level”, according to analysis from academics at the Harvard Kennedy School, Massachusetts Institute of Technology (MIT) and the University of Cambridge, among others.

They write that the technology, which introduces background software on to users’ devices, “tears at the heart of privacy of individual citizens”, but is also fallible: it could be evaded by those it is meant to target, and it could be misused.

In Bugs in Our Pockets: The Risks of Client-Side Scanning, a 46-page analysis of CSS published on the open-access repository arXiv on Friday, the authors say: “In reality, CSS is bulk intercept, albeit automated and distributed … CSS makes law-abiding citizens more vulnerable with their personal devices searchable on an industrial scale.

“Plainly put, it is a dangerous technology. Even if deployed initially to scan for child sex-abuse material, content that is clearly illegal, there would be enormous pressure to expand its scope. We would then be hard-pressed to find any way to resist its expansion or to control abuse of the system.”

Apple’s plans, unveiled this year, involve a technique called “perceptual hashing” to compare photos with known images of child abuse when users upload them to the cloud. If enough matches were detected, the company would manually review the images before flagging the user’s account to law enforcement.
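In broad terms, perceptual hashing reduces an image to a compact fingerprint that stays similar when the picture is resized or lightly edited, so fingerprints can be compared against a database of known material. The sketch below is a minimal illustration of the idea using a toy “average hash” in Python; it is not Apple’s NeuralHash, and the database of known hashes and the matching threshold shown here are hypothetical placeholders.

```python
# Minimal sketch of perceptual-hash matching (toy "average hash", not Apple's NeuralHash).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to an 8x8 greyscale grid and set one bit per pixel,
    depending on whether it is brighter than the grid's mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

# Hypothetical database of fingerprints of known images (placeholder value only).
KNOWN_HASHES = {0x0F0F0F0F0F0F0F0F}

def matches_known(path: str, max_distance: int = 5) -> bool:
    """Flag a photo if its fingerprint is within a small Hamming distance
    of any fingerprint in the database."""
    h = average_hash(path)
    return any(hamming(h, known) <= max_distance for known in KNOWN_HASHES)
```

Because matching is done on fingerprints within a distance threshold rather than on exact file contents, the approach tolerates small edits to a photo, which is also what makes it open to the collision and evasion attacks described below.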

Apple paused implementation after a backlash from privacy campaigners last month, but not before researchers had managed to construct vastly different images that produced the same fingerprint, and so would appear identical to Apple’s scanning system, creating false positives.

Others managed to do the reverse: change the mathematical output of an image without changing how it looks at all, thereby creating false negatives.
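To illustrate that second kind of attack, the toy sketch below (a hypothetical continuation of the average-hash example above, not Apple’s system) nudges the brightness of a single 8x8 block of a synthetic image, typically by around one grey level out of 255, which is enough to flip a bit of the toy fingerprint while leaving the picture visually unchanged.

```python
# Toy illustration of a false-negative attack on a simple "average hash"
# (hypothetical example; real attacks on NeuralHash are more involved).
import numpy as np

def toy_hash(img: np.ndarray) -> str:
    """64-bit average hash of a 64x64 greyscale array, as a bit string."""
    blocks = img.reshape(8, 8, 8, 8).mean(axis=(1, 3))  # mean of each 8x8 block
    return "".join("1" if b > img.mean() else "0" for b in blocks.flatten())

rng = np.random.default_rng(42)
photo = rng.uniform(0, 255, size=(64, 64))  # stand-in for a real greyscale photo

# Find the block whose mean sits closest to the global mean (the bit threshold)
# and push it just across, allowing for the small shift the edit itself
# causes in the global mean.
blocks = photo.reshape(8, 8, 8, 8).mean(axis=(1, 3))
gap = blocks - photo.mean()
r, c = np.unravel_index(np.abs(gap).argmin(), gap.shape)

evaded = photo.copy()
evaded[r*8:(r+1)*8, c*8:(c+1)*8] -= np.sign(gap[r, c]) * (abs(gap[r, c]) + 0.5)

print(toy_hash(photo) == toy_hash(evaded))  # False: same-looking image, different fingerprint
```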

The report’s authors say people may also try to disable scanners or avoid using devices such as iPhones with CSS. They added: “The software provider, the infrastructure operator, and the targeting curator must all be trusted. If any of them – or their key employees – misbehave, or are corrupted, hacked or coerced, the security of the system may fail.”

While CSS may be mooted as a way to target only specific content, the report warns: “Come the next terrorist scare, a little push will be all that is needed to curtail or remove the current protections.”

It points out that Apple appears to have yielded to state pressure before, such as by moving the iCloud data of its Chinese users to datacentres under the control of a Chinese state-owned company, and removing the jailed Russian opposition leader Alexei Navalny’s tactical voting app from its Russian app store.

Ross Anderson, one of the report’s co-authors and a professor of security engineering at the University of Cambridge, said: “It’s a very small step from there [targeting child sexual abuse material] to having various governments saying ‘here’s a list of other images that we’d like to put on the naughty pictures list for iPhones in our country’.”

Approached for comment, Apple referred the Guardian to a statement that said: “Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”