
Apple Will Scan iPhones For Child Sex Abuse Images. Privacy Experts Are Worried

"Governments will demand [this technology] from everyone."

by Lizzy Francis

Apple has announced a new system that will scan every iPhone for images of child sexual abuse.

The new system is designed to detect images of child sexual abuse on customers’ devices in the United States before they are uploaded to iCloud. If an image is detected, a human reviewer will confirm that it depicts child sexual abuse and report the finding to law enforcement and/or the National Center for Missing and Exploited Children. The person’s iCloud account will also be disabled, per Apple, NPR, and the BBC.

Apple’s new technology, called “NeuralHash,” draws on a database of known child sexual abuse material compiled by the US National Center for Missing and Exploited Children and other child safety groups. Those known images are converted into “hashes,” numerical codes that act as fingerprints. Photos on a user’s device are hashed the same way, and NeuralHash can match them against the known hashes even if the images have been edited.
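To make the idea concrete, here is a minimal sketch in Python of a generic perceptual hash and hash-matching step. It is not Apple’s NeuralHash, which is a proprietary neural-network-based hash; the file name, threshold, and hash values below are placeholders chosen purely for illustration.

    # Illustrative only: a toy "average hash," not Apple's NeuralHash.
    from PIL import Image

    def average_hash(path: str, size: int = 8) -> int:
        # Shrink to a small grayscale thumbnail, then record one bit per pixel:
        # 1 if the pixel is brighter than the average, 0 otherwise.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a: int, b: int) -> int:
        # Number of bits on which two hashes disagree; small edits change few bits.
        return bin(a ^ b).count("1")

    # Hypothetical check against a database of known hashes (placeholder values).
    known_hashes = {0x3C3C7E7E7E3C1800}
    photo_hash = average_hash("photo.jpg")
    if any(hamming_distance(photo_hash, h) <= 5 for h in known_hashes):
        print("possible match, would go to human review")

Because matching happens on hashes rather than on the photos themselves, small edits such as cropping or recompression still produce a nearby fingerprint, which is what allows edited copies of known images to be flagged.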

Apple will also “scan users’ encrypted messages for sexually explicit content as a child safety measure.”

These moves, which Apple says are meant to tamp down on the proliferation of child pornography and sexual images of children, have been criticized by privacy experts because they give governments and private entities a window into what people do on their phones.

Speaking to the BBC, security expert Matthew Green of Johns Hopkins University expressed concern. “Regardless of what Apple’s long-term plans are, they’ve sent a very clear signal,” he said. “In their (very influential) opinion, it is safe to build systems that scan users’ phones for prohibited content… Whether they turn out to be right or wrong on that point hardly matters. This will break the dam — governments will demand it from everyone.”

Apple says that the technology offers privacy in that it only learns about a user’s photos “if they have a collection of known child sexual abuse images” in their iCloud.

Still, privacy experts worry that the technology could be used to suppress any speech, information sharing, or image sharing that governments, in particular authoritarian ones, want to squash.

Prohibited content, then, might not mean only child pornography. It could also mean political speech, and the same system could be used by authoritarian governments to crush dissent or serve as a mass surveillance tool.

On Twitter, Green also noted that because Apple operates the only remaining encrypted messaging service in China, “when Apple develops a technology that’s capable of scanning encrypted content, you can’t just say, ‘well, I wonder what the Chinese government would do with that technology. It isn’t theoretical.’”

In a series of tweets, Edward Snowden shared a similar sentiment, noting that not only will Apple continuously update every phone to compare photos and cloud storage against a blacklist, but it will also tell “your parents if you view a nude in iMessage.”
