Apple has announced plans to scan iPhones for images of child abuse, raising immediate concerns about user privacy and surveillance.
Will Apple’s iPhone become iSpy?
Apple says its system is automated, doesn’t scan the actual images themselves, uses some form of hash-matching system to identify known instances of child sexual abuse material (CSAM), and says it has multiple safeguards in place to protect privacy.
Privacy advocates warn that, having created such a system, Apple is on a rocky road toward an inevitable extension of on-device content scanning and reporting that could be – and likely will be – abused by some nations.
What the Apple system does
There are three main elements to the system, which will lurk inside iOS 15, iPadOS 15 and macOS Monterey when they ship later this year.
Scanning your images
Apple’s systems scan all images stored in iCloud Photos to see if they match the CSAM database held by the National Center for Missing and Exploited Children (NCMEC).
Images are scanned on the device using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further turns this database into an unreadable set of hashes that are stored securely on the user’s device.
When images are stored in iCloud Photos, a matching process takes place. If an account crosses a threshold number of known CSAM matches, Apple is alerted; the flagged data is then reviewed manually, the account is disabled, and NCMEC is notified.
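To make that concrete, here’s a minimal sketch of the kind of on-device hash matching being described, with SHA-256 standing in for NeuralHash; the database entries, function names, and threshold value are my own illustrations, not Apple’s implementation.

```python
import hashlib

# Illustrative stand-in for the hashed CSAM database shipped to the device.
# Apple describes a blinded NeuralHash set; here it is just a set of SHA-256 digests.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

MATCH_THRESHOLD = 30  # hypothetical value, purely for illustration


def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash such as NeuralHash; a real cryptographic
    # hash like this would NOT match resized or re-encoded copies.
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(photo_library: list) -> int:
    # Count how many photos in the library match the known-hash database.
    return sum(1 for photo in photo_library if image_hash(photo) in KNOWN_HASHES)


def should_flag_for_review(photo_library: list) -> bool:
    # Only once an account crosses the threshold would human review be triggered.
    return count_matches(photo_library) >= MATCH_THRESHOLD


library = [b"holiday.jpg bytes", b"known-image-1", b"cat.jpg bytes"]
print(count_matches(library))           # 1
print(should_flag_for_review(library))  # False: below the threshold
```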
The system is not perfect, however. The company says there is a less than one-in-one-trillion chance per year of incorrectly flagging a given account. With more than a billion users, that works out to roughly a 1-in-1,000 chance of someone being misidentified in any given year. Users who believe they have been wrongly flagged can appeal.
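Taking Apple’s own figures at face value, the arithmetic behind that estimate is simple enough:

```python
# Back-of-the-envelope check of Apple's stated figures.
accounts = 1_000_000_000              # "over a billion users"
p_false_flag = 1 / 1_000_000_000_000  # "one in one trillion" per account, per year

expected_wrong_flags_per_year = accounts * p_false_flag
print(expected_wrong_flags_per_year)  # 0.001 -> roughly a 1-in-1,000 chance each year
```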
Scanning your Messages
Apple’s system uses on-device machine learning to scan images in Messages sent to or received by minors for sexually explicit material, alerting parents if such an image is identified. Parents can enable or disable the feature, and any such content received by a child will be blurred.
If a child attempts to send sexually explicit content, they will be warned and their parents can be notified. Apple says it never gains access to the images, which are scanned on the device.
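Apple hasn’t published how the Messages classifier works, but the decision flow it describes looks roughly like the sketch below; the data structure, score, and threshold are placeholders of my own, not Apple’s API.

```python
from dataclasses import dataclass


@dataclass
class IncomingImage:
    recipient_is_child: bool        # the feature only applies to child accounts
    parental_alerts_enabled: bool   # parents can turn notifications on or off
    explicitness_score: float       # 0.0-1.0, produced by an on-device classifier (placeholder)


def handle_incoming(image: IncomingImage, threshold: float = 0.9) -> dict:
    # Decide what happens to an image arriving in Messages, per Apple's description.
    # The score and threshold here are illustrative; Apple has not published its model.
    if not image.recipient_is_child:
        # Adult accounts are untouched: nothing is flagged or reported.
        return {"blur": False, "warn": False, "notify_parent": False}

    flagged = image.explicitness_score >= threshold
    return {
        "blur": flagged,                                          # image shown blurred
        "warn": flagged,                                          # child sees a warning first
        "notify_parent": flagged and image.parental_alerts_enabled,
    }


# Example: an explicit image sent to a child whose parents enabled alerts.
print(handle_incoming(IncomingImage(True, True, 0.97)))
# {'blur': True, 'warn': True, 'notify_parent': True}
```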
Watching what you search for
The third part consists of Siri and Search updates. Apple says it will now provide parents and children with expanded information and assistance if they encounter an unsafe situation. Siri and Search will also intervene when people make what are considered CSAM-related search queries, explaining that interest in this topic is problematic.
Apple says the program is “ambitious” and that its efforts will “grow and evolve over time.”
A little technical data
The company has published an extensive technical white paper that explains a little more about its system. In the paper, it goes to some lengths to reassure users that it learns nothing about images that don’t match the database.
Apple’s technology, called NeuralHash, analyzes known CSAM images and converts them into a unique number specific to each image. Only another image that appears nearly identical will produce the same number; for example, copies of the same image that differ in size or transcoded quality will still have the same NeuralHash value.
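NeuralHash itself isn’t public, but a far simpler perceptual hash illustrates the property Apple describes: unlike a cryptographic hash, it survives a resize of the same picture. The toy “average hash” below is my own illustration, not Apple’s algorithm.

```python
# A toy "average hash": not NeuralHash, but it shows why a perceptual hash stays
# stable when an image is merely resized, while a cryptographic hash would not.

def average_hash(pixels: list, size: int = 8) -> int:
    # Downsample a grayscale image to size x size, then set one bit per pixel
    # depending on whether it is brighter than the mean.
    h, w = len(pixels), len(pixels[0])
    small = [
        [pixels[y * h // size][x * w // size] for x in range(size)]  # nearest-neighbour downsample
        for y in range(size)
    ]
    flat = [v for row in small for v in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for v in flat:
        bits = (bits << 1) | (1 if v > mean else 0)
    return bits


def upscale(pixels: list, factor: int) -> list:
    # Crude nearest-neighbour upscale, standing in for "the same picture at a different size".
    return [
        [pixels[y // factor][x // factor] for x in range(len(pixels[0]) * factor)]
        for y in range(len(pixels) * factor)
    ]


# A small synthetic grayscale "image".
original = [[(x * 7 + y * 13) % 256 for x in range(64)] for y in range(64)]
resized = upscale(original, 2)  # same picture, twice the resolution

print(hex(average_hash(original)))
print(hex(average_hash(resized)))   # identical hash despite the different size
```

Run as written, both print statements produce the same value, even though the second image has four times as many pixels.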
When images are added to iCloud Photos, they are compared against that database to identify matches.
If a match is found, a cryptographic safety voucher is created, which, as I understand it, will also allow Apple’s reviewers to decrypt and access the offending image if the content threshold is reached and action is required.
“Apple can learn relevant image information only after the account has more than the CSAM match threshold number, and even then, only for matching images,” the paper concludes.
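Apple’s technical summary frames this gating in terms of threshold secret sharing: each matching image contributes a share of a decryption key, and only once enough shares accumulate can the voucher contents be unlocked. The toy Shamir-style sketch below shows the general technique; the threshold, field, and structure are illustrative, not Apple’s actual voucher format.

```python
import random

# Toy Shamir secret sharing over a prime field: the rough mechanism behind
# "Apple can only decrypt once the match threshold is crossed". This is an
# illustration of threshold secret sharing in general, not Apple's construction.

PRIME = 2**127 - 1  # a Mersenne prime, large enough for a demo secret
THRESHOLD = 3       # hypothetical: shares needed before the secret is recoverable


def make_share(secret: int, coeffs: list, x: int) -> tuple:
    # Evaluate the random polynomial (constant term = secret) at point x.
    y = 0
    for c in reversed(coeffs):
        y = (y * x + c) % PRIME
    y = (y * x + secret) % PRIME
    return (x, y)


def reconstruct(shares: list) -> int:
    # Lagrange interpolation at x = 0 recovers the constant term (the secret).
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret


# Each "voucher" carries one share of the key that unlocks the flagged content.
key = random.randrange(PRIME)
coeffs = [random.randrange(PRIME) for _ in range(THRESHOLD - 1)]
vouchers = [make_share(key, coeffs, x) for x in range(1, 6)]

print(reconstruct(vouchers[:THRESHOLD]) == key)      # True: threshold reached
print(reconstruct(vouchers[:THRESHOLD - 1]) == key)  # almost certainly False: key stays hidden
```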
Apple isn’t unique, but on-device analysis may be
Apple is not alone in being required to share CSAM images with the authorities. By law, any US company that finds such material on its servers must report it to law enforcement. Facebook, Microsoft, and Google already have technologies that scan for such material shared via email or messaging platforms.
The difference between those systems and this one is that the analysis takes place on the device, not on the company’s servers.
Apple has always claimed its messaging platforms are end-to-end encrypted, but this becomes something of a semantic claim if the contents of a person’s device are scanned before encryption even takes place.
Child protection, of course, is something most rational people support. But what privacy advocates are concerned about is that some governments may now try to force Apple to look for other material on people’s devices.
Governments that outlaw homosexuality might demand such content also be monitored, for example. What happens if a teenager in a nation that prohibits non-binary sexual activity asks Siri for help coming out? And what about ambient listening devices, such as HomePods? It isn’t clear whether the search-related components of this system will be deployed there, but conceivably they could be.
And it’s not yet clear how Apple will be able to protect against such mission creep.
Privacy advocates are very worried
Most privacy advocates feel there is significant potential for mission creep inherent in this plan, which does nothing to maintain faith in Apple’s commitment to user privacy.
How can any user feel that their privacy is protected if the device itself is spying on them, and they have no control over how?
The Electronic Frontier Foundation (EFF) warns this plan is effectively creating a security backdoor.
“All it would take to widen the narrow backdoor that Apple is building is an expansion of the machine learning parameters to look for additional types of content, or a tweak of the configuration flags to scan, not just children’s, but anyone’s accounts. That’s not a slippery slope; that’s a fully built system just waiting for external pressure to make the slightest change.”
“When Apple develops a technology that’s capable of scanning encrypted content, you can’t just say, ‘Well, I wonder what the Chinese government would do with that technology.’ It isn’t theoretical,” warned Johns Hopkins professor Matthew Green.
Alternative argument
There are other arguments, though. One of the most compelling is that servers at ISPs and email providers are already scanned for such content, and that Apple has built a system that minimizes human involvement and only flags a problem if it identifies multiple matches between the CSAM database and content on the device.
There is no doubt that children are at risk.
Of the nearly 26,500 runaways reported to NCMEC in 2020, one in six were likely victims of child sex trafficking. The organization’s CyberTipline (to which I imagine Apple’s system will be connected in this case) received more than 21.7 million reports relating to some form of CSAM in 2020.
John Clark, president and CEO of NCMEC, said: “With so many people using Apple products, these new safety measures have lifesaving potential for children who are being enticed online and whose horrific images are being circulated in CSAM. At the National Center for Missing & Exploited Children, we know this crime can only be combated if we are steadfast in our dedication to protecting children. We can only do this because technology partners, like Apple, step up and make their dedication known.”
Others say that by creating a system to protect children from such horrific crimes, Apple is removing an argument that some might use to justify device backdoors in a broader sense.
Most of us agree that children should be protected, and by doing so Apple has eroded an argument some repressive governments might use to force the issue. Now it must resist any mission creep on the part of such governments.
That last challenge is its biggest problem, given that Apple, when pushed, will always follow the laws of the governments of the countries in which it does business.
“No matter how well-intentioned, Apple is rolling out mass surveillance to the entire world with this,” warned privacy advocate Edward Snowden. If they can scan for CSAM today, “they can scan for anything tomorrow.”
Please follow me on Twitter, or join me at the AppleHolic bar & grill and Apple Discussions group on MeWe.