Apple delays controversial CSAM detection feature


Apple has delayed its controversial image scanning feature following negative feedback.

The company updated its briefing page on the technology, explaining it was delaying the feature based on the response it received from customers and privacy advocates.

"Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company said.

The feature, announced in August, would have allowed Apple to scan photos uploaded to iCloud. Photos would be hashed on users' devices and compared against a database of hashes of known Child Sexual Abuse Material (CSAM) images.

This database was originally supposed to come from the National Center for Missing and Exploited Children (NCMEC), but Apple later explained it would only flag images whose hashes also appeared in the databases of clearing houses across multiple countries.

Apple also said it would only trigger a human review if it found 30 CSAM matches using this method.
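
To illustrate the general idea of hash matching with a review threshold, the sketch below is a deliberately simplified, hypothetical example: a plain byte-level hash stands in for Apple's perceptual NeuralHash, the hash set and helper functions are invented for illustration, and the cryptographic protections in Apple's actual design (such as private set intersection) are omitted entirely.

# Simplified, hypothetical sketch of threshold-based hash matching.
# This is NOT Apple's protocol: real systems use perceptual hashing and
# cryptographic blinding; here a byte-level SHA-256 hash is a stand-in.

import hashlib
from pathlib import Path

KNOWN_HASHES: set[str] = set()   # would be populated from a vetted hash database
REVIEW_THRESHOLD = 30            # matches required before any human review

def image_hash(path: Path) -> str:
    """Stand-in for a perceptual hash; hashes raw file bytes for illustration."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def count_matches(photo_paths: list[Path]) -> int:
    """Count how many of the given photos match the known-hash set."""
    return sum(1 for p in photo_paths if image_hash(p) in KNOWN_HASHES)

def needs_human_review(photo_paths: list[Path]) -> bool:
    """Flag for review only once the match count reaches the threshold."""
    return count_matches(photo_paths) >= REVIEW_THRESHOLD

The threshold is the key design choice the article references: under Apple's stated plan, no individual match would surface for human review until an account crossed the 30-match mark.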

The company also announced a second feature that would scan images sent to children in its iMessage app to detect nudity and notify parents.

Both plans provoked concern from privacy groups around the world, which warned that Apple's scanning technology could be repurposed to search for other kinds of imagery, opening users up to government surveillance. The company vowed not to accede to government requests for expanded searches.


This week, the Electronic Frontier Foundation delivered a petition protesting the technology, which was slated to be included in the next version of iOS and initially restricted to US users.

The organization argued that the technology undermines the end-to-end encryption Apple has touted in its operating systems.

"Apple’s surveillance plans don’t account for abusive parents, much less authoritarian governments that will push to expand it. Don’t let Apple betray its users," it added.

Danny Bradbury
