Apple says it will reject government requests to use child sexual abuse image scanning for surveillance

The tech giant claims it has always "steadfastly refused" government demands that could erode user privacy

Apple has responded to the most pressing criticisms of its plan, announced late last week, to scan US iPhone photo libraries for known images of child sexual abuse material (CSAM).

The tech giant maintained that the technology would not scan users' iCloud uploads for anything other than CSAM, and that it would reject government requests to "add non-CSAM images to the hash list".

In a FAQ response document, Apple said that its “CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC [National Center for Missing & Exploited Children] and other child safety groups”.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the tech giant stated.

The response comes following concerns that the technology could be exploited by governments to expand surveillance, with critics pointing to examples of tech companies complying with authoritarian governments; Microsoft, Google, and Qualcomm, for example, have all acceded to demands from the Chinese state for user information.


Apple also has a history of working with government agencies in the US. In the first half of 2019, the tech giant received a record 3,619 requests from the US government seeking user account information to support law enforcement investigations. Reports show Apple complied with 90% of these requests.

However, by mid-2020, public sector cooperation had become less lucrative and faced greater public scrutiny, leading tech giants such as IBM and Amazon to cut ties, at least temporarily, with US law enforcement.

On Monday, Apple maintained that the hash technology used to identify CSAM images makes it impossible to carry out "targeted attacks against only specific individuals" in order to frame someone. The tech giant added that every flagged account would undergo "human review before making a report to NCMEC".

“In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC,” it stated.
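For readers unfamiliar with how hash-list matching works, the following is a minimal, purely illustrative Python sketch. It is not Apple's implementation: Apple's system reportedly uses a perceptual hash (NeuralHash) and on-device cryptographic matching, whereas this toy uses SHA-256, and the hash list, threshold value, and function names here are all hypothetical. It shows only the general shape of the approach the article describes: photo hashes compared against a fixed list of known-CSAM hashes, with a match threshold gating any human review.

```python
# Illustrative sketch only: a toy version of hash-list matching with a
# human-review threshold. hashlib.sha256 stands in for a perceptual hash,
# and KNOWN_CSAM_HASHES / REVIEW_THRESHOLD are hypothetical values.
import hashlib

# Hypothetical database of hashes supplied by child-safety organisations.
KNOWN_CSAM_HASHES: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

# Hypothetical number of matches required before any human review occurs.
REVIEW_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; a real system would tolerate
    resizing, cropping, and re-encoding of the image."""
    return hashlib.sha256(image_bytes).hexdigest()


def needs_human_review(images: list[bytes]) -> bool:
    """Return True only if enough photos match the known-hash list to
    cross the review threshold; isolated matches flag nothing."""
    matches = sum(1 for img in images if image_hash(img) in KNOWN_CSAM_HASHES)
    return matches >= REVIEW_THRESHOLD
```

Even in this toy form, the property Apple's statements rely on is visible: the system can only flag content whose hash already appears in the list, so repurposing it for broader surveillance would require changing the list itself, which is the step Apple says it will refuse.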

