
Apple says it will reject government requests to use child sexual abuse image scanning for surveillance

The tech giant claims it has always "steadfastly refused" to work with agency requests that potentially erode privacy

Apple has responded to the most pressing criticisms surrounding its decision to scan US iPhone photo libraries for known images of child sexual abuse material (CSAM), which was announced late last week.

The tech giant maintained that the technology would not scan users' iCloud uploads for anything other than CSAM, and that it would reject government requests to "add non-CSAM images to the hash list".

In a FAQ response document, Apple said that its “CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC [National Center for Missing & Exploited Children] and other child safety groups”.

“We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it,” the tech giant stated.

The response comes following concerns that the technology could be exploited by governments to increase surveillance, with critics pointing to examples of tech companies working with authoritarian governments, such as Microsoft, Google, and Qualcomm, all of which have accepted demands from the Chinese state for user information.


Apple also has a history of working with US government agencies. In the first half of 2019, the tech giant received a record 3,619 requests from the US government seeking user account information to support law enforcement investigations; reports show Apple complied with 90% of these requests.

However, by mid-2020, public sector cooperation had become less lucrative and faced greater public scrutiny, leading tech giants such as IBM and Amazon to cut ties, at least temporarily, with US law enforcement.

On Monday, Apple maintained that, because of the hash technology used to identify CSAM images, it would be impossible to carry out “targeted attacks against only specific individuals” in order to frame someone. The tech giant also added that it would conduct a “human review before making a report to NCMEC”.

“In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC,” it stated.
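Stripped of its cryptography, the process Apple describes amounts to a set-membership check against a curated list of known-CSAM hashes, gated by a match threshold and a human review step before any report is filed. A minimal illustrative sketch, using hypothetical hash values and a hypothetical threshold (Apple's actual system uses NeuralHash with on-device private set intersection and an undisclosed threshold, none of which is reproduced here):

```python
# Toy illustration of threshold-gated hash-list matching.
# KNOWN_HASH_LIST and MATCH_THRESHOLD are hypothetical stand-ins,
# not values from Apple's system.
KNOWN_HASH_LIST = {"hash_a", "hash_b", "hash_c"}
MATCH_THRESHOLD = 3  # hypothetical: escalate only after N matches


def count_matches(upload_hashes, known_hashes):
    """Count how many uploaded photo hashes appear in the known list."""
    return sum(1 for h in upload_hashes if h in known_hashes)


def should_escalate_to_review(upload_hashes):
    """Accounts reach human review only past the threshold; photos that
    do not match the known list contribute nothing and trigger no report."""
    return count_matches(upload_hashes, KNOWN_HASH_LIST) >= MATCH_THRESHOLD
```

In this simplified model, an account with non-matching photos never crosses the threshold, which mirrors Apple's claim that unmatched flags lead to no disabled account and no NCMEC report.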

