
Apple abandons CSAM detection tool for iCloud Photos after privacy backlash

After announcing plans to scan iCloud photos for child sexual abuse material, Apple reversed course following criticism from privacy advocates and security experts who warned the system could be abused by governments. Child safety groups accused Apple of prioritizing privacy over child protection.

Scoring Impact

Topic          Direction    Relevance    Contribution
Child Safety   - against    primary      -1.00
User Privacy   + toward     secondary    +0.50

Overall incident score = -0.166

Score = avg(topic contributions) × significance (high ×1.5) × confidence (0.59) × agency (reactive ×0.75)
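Worked through, that is avg(-1.00, +0.50) = -0.25, then -0.25 × 1.5 × 0.59 × 0.75 ≈ -0.166, matching the overall score above. A minimal sketch of the calculation in Python follows; the function and argument names are illustrative, not part of the scoring system, and relevance appears to be already folded into the contribution magnitudes shown in the table:

```python
# Hypothetical sketch of the scoring formula shown above.
# Names are illustrative; multiplier values come from the formula line.

def incident_score(contributions, significance, confidence, agency):
    """Average the per-topic contributions, then apply the multipliers."""
    avg = sum(contributions) / len(contributions)
    return avg * significance * confidence * agency

score = incident_score(
    contributions=[-1.00, +0.50],  # Child Safety (primary), User Privacy (secondary)
    significance=1.5,              # "high" significance multiplier
    confidence=0.59,
    agency=0.75,                   # "reactive" agency multiplier
)
print(f"{score:.3f}")  # -0.166
```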

Evidence (1 signal)

Confirms: product_decision, Dec 7, 2022 (verified)

