In a surprise announcement Friday, Apple said it would need more time to improve its controversial child safety tool before introducing it.
More feedback sought
The company says it plans to gather more feedback and improve the system, which has three main components: scans of iCloud photos for CSAM (child sexual abuse material), on-device scans of Messages to protect children, and search suggestions designed to protect children.
Since Apple announced the tool, it has faced a barrage of criticism from concerned individuals and rights groups around the world. The argument the company seems to have the most trouble countering is the potential for a repressive government to force Apple to monitor for more than just CSAM.
Who is watching over the guards?
Edward Snowden, accused of leaking US intelligence and now a privacy advocate, warned on Twitter: “Make no mistake: if they can scan for child pornography today, they can scan for anything tomorrow.”
Critics say the tool can be exploited or extended to support censorship of ideas or threaten free thought. Apple’s response — that it wouldn’t extend the system — was seen as a little naive.
“We have faced demands to build and implement government-mandated changes that degrade user privacy before, and have resolutely rejected those demands. We will continue to reject them in the future. Let’s be clear: this technology is limited to detecting CSAM stored in iCloud, and we will not agree to any government request to expand it,” the company said.
“All it would take to widen the narrow backdoor Apple is building is an expansion of the machine learning parameters to look for additional types of content,” the Electronic Frontier Foundation responded.
Apple listens to its users (in a good way)
In a statement about the delay, distributed widely to the media (on the Friday before a US holiday, when bad news is sometimes released), Apple said:
“Based on feedback from customers, advocacy groups, researchers and others, we’ve decided to take additional time over the coming months to gather feedback and make improvements before releasing this critically important child safety feature.”
This is a step the company had to take. In mid-August, more than 90 NGOs contacted the company in an open letter asking it to reconsider. Signatories included Liberty, Big Brother Watch, the ACLU, the Center for Democracy & Technology, the Center for Free Expression, the EFF, ISOC, Privacy International, and many more.
The devil in the details
The organizations warned of several weaknesses in the company’s proposal. One that cuts particularly deep: the system itself could be abused by abusive adults.
“LGBTQ+ youths on family accounts with unsympathetic parents are particularly at risk,” they wrote. “As a result of this change, iMessages will no longer provide confidentiality and privacy to those users.”
Concerns that Apple’s proposed system could be extended also remain. Sharon Bradford Franklin, co-director of CDT’s Security & Surveillance Project, warned that governments “would demand that Apple scan and block images of human rights abuses, political protests, and other content that must be protected as freedom of expression, which forms the backbone of a free and democratic society.”
Apple’s defenders say the company is trying to maintain overall privacy for user data while building a system that can surface only illegal content. They also point to the various failsafes the company has built into the system.
Those arguments didn’t land, and Apple executives must have received the same social media feedback I’ve seen, which reflected deep distrust of the proposal.
What happens next?
Apple’s statement does not say. But given that the company has spent the weeks since the announcement meeting with the media and concerned groups across its markets to discuss the issue, it seems logical that the second iteration of the child protection tool could address some of the concerns.