Apple cautions against Australian plan for cloud scanning
Tech giant warns that scanning for child abuse material would compromise user privacy and safety
Apple has cautioned against an Australian proposal that would require tech firms to scan cloud and messaging services for child abuse material, warning that it would undermine privacy and security safeguards and could result in global mass surveillance.
The proposal, outlined in two mandatory standards for child safety released by the eSafety commissioner, Julie Inman Grant, suggests that providers should detect and eliminate child abuse material and pro-terror material “where technically feasible,” in addition to disrupting and deterring new material of this nature.
The regulator emphasized in a related discussion paper that it does not support creating vulnerabilities or backdoors to compromise privacy and security on end-to-end encrypted services.
In its response to the proposals, shared with Guardian Australia, Apple argued that the draft standards offer no explicit assurance that end-to-end encryption will be protected.
“eSafety asserts that the same protections for end-to-end encryption in the codes apply to the standards, but this is not backed by any language to that effect,” the submission stated.
“We suggest that eSafety should adopt a clear and consistent approach by expressly endorsing end-to-end encryption to avoid any uncertainty, confusion, or potential inconsistency across codes and standards.”
The company also raised concerns that the definition of “technically feasible” was too narrow, focusing solely on the cost of developing a new system rather than on “whether a particular product design change is in the best interests of securing its users.”
Privacy advocates and the encrypted messaging company Signal have echoed the concerns raised by the Cupertino-based company. Signal has indicated that it will legally challenge the standards if compelled to weaken encryption.
Apple further cautioned that mandating technology to scan cloud services for known child abuse material would jeopardize the privacy and safety of all users.
“Scanning for specific content creates the potential for widespread surveillance of communications and storage systems containing data related to the most confidential matters of numerous Australians,” Apple stated.
It added that history demonstrates such capabilities will inevitably broaden to encompass other content types (such as images, videos, text, or audio) and content categories.
Apple cautioned that these surveillance tools could be repurposed to search for additional content, including an individual’s political, religious, health, sexual, or reproductive activities.
“Mass surveillance tools have far-reaching adverse effects on freedom of opinion and expression and, consequently, democracy as a whole.”
The company also indicated that scanning individuals’ files and messages could allow law enforcement to bypass legal procedures. Forcing tech firms to implement such measures would, it stated, “have significant global ramifications.”
Apple expressed concerns that countries without the strong legal protections available to Australians would exploit and build upon these measures.
Erik Neuenschwander, Apple’s director of user privacy and child safety, stated that tech companies should enhance protections and reduce vulnerabilities. He highlighted that the absence of safeguards for encryption and the limited definition of technical feasibility could introduce weaknesses into systems.
Neuenschwander emphasized that scanning user data was a “broad-reaching mandate” that would require companies to retain access to all data in a readable format for various purposes.
He said this could encompass a range of scenarios, from the company’s own processing to law enforcement requests to attackers gaining unauthorized access to systems and obtaining that data unlawfully, and that this was part of Apple’s apprehension regarding the lack of backing for encryption.
The company highlighted its numerous parental control features as part of its efforts to enhance child safety.
Apple has not gone as far as threatening to withdraw iMessage or iCloud from Australia, as it did in the UK over a similar online safety proposal that was ultimately shelved last year.
Inman Grant informed Senate estimates last week that the 50 submissions received during the consultation on the proposal raised “a lot of technical issues [and] a lot of good feedback.”
“We will integrate what we can and what we believe enhances clarity,” she stated.
More submissions were expected to be released this month, with the finalized standards likely to be presented to parliament for approval by May.