Apple shifts stance on CSAM scanning following widespread criticism

The tech giant will now only flag images that have been supplied by clearinghouses in multiple countries

Apple has provided further details about its child sexual abuse material (CSAM) scanning technology in its fourth follow-up briefing since the initial announcement ten days ago.

The tech giant will now only flag images that have been supplied by clearinghouses in multiple countries, rather than solely by the US National Center for Missing and Exploited Children (NCMEC), as originally announced.

In a change of stance, Apple has also publicly defined the threshold number of identified CSAM images at which law enforcement could be alerted. The tech giant announced that it will take 30 matches for the system to trigger a human review which, if the matches prove legitimate, will lead to authorities being notified of the presence of CSAM in a person’s iCloud library.

“We expect to choose an initial match threshold of 30 images,” Apple said in a Security Threat Model Review published late last week.

“Since this initial threshold contains a drastic safety margin reflecting a worst-case assumption about real-world performance, we may change the threshold after continued empirical evaluation of NeuralHash false positive rates – but the match threshold will never be lower than what is required to produce a one-in-one trillion false positive rate for any given account.”
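To make the threshold mechanism concrete, the minimal sketch below shows how a match count can gate human review. The names here (count_matches, should_trigger_human_review, REVIEW_THRESHOLD) are hypothetical, and the plain counter is an illustrative simplification: Apple's published design matches NeuralHash fingerprints using private set intersection and threshold secret sharing, so the server learns nothing about an account's matches until the threshold is crossed.

REVIEW_THRESHOLD = 30  # Apple's stated initial match threshold

def count_matches(image_hashes: set[str], csam_hash_list: set[str]) -> int:
    # Count how many of an account's image fingerprints appear in the
    # known-CSAM hash list supplied by the clearinghouses.
    return len(image_hashes & csam_hash_list)

def should_trigger_human_review(image_hashes: set[str], csam_hash_list: set[str]) -> bool:
    # Only accounts at or above the threshold are surfaced for review;
    # in Apple's actual design, sub-threshold matches stay cryptographically hidden.
    return count_matches(image_hashes, csam_hash_list) >= REVIEW_THRESHOLD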

Since Apple's initial announcement on 6 August, the plan has garnered substantial criticism from customers, privacy advocates, and even Apple's own employees.

Last week, it was reported that the tech giant’s internal Slack channel had been flooded with more than 800 complaints about the technology, with many employees arguing that the move would sabotage Apple’s reputation for respecting privacy. Others defended the technology, which ultimately aims to protect minors and lead to the arrest of child sexual abuse offenders.

Privacy advocates have criticised the tech giant for rolling out technology that could be abused by authoritarian states to silence political opponents, journalists, and human rights campaigners. Apple responded by maintaining that the technology would not scan users’ iCloud uploads for anything other than CSAM, adding that it would reject governmental requests to "add non-CSAM images to the hash list".
