A 27-year-old survivor of child sexual abuse has filed a lawsuit against Apple, claiming the tech giant failed to protect victims by abandoning promised tools meant to detect illegal abuse images on iCloud.
The lawsuit, filed in U.S. District Court in Northern California, centers on Apple's 2022 decision to withdraw its planned Child Sexual Abuse Material (CSAM) detection system after criticism from privacy advocates. The tool would have scanned photos uploaded to iCloud to identify illegal abuse content.
The plaintiff, who is proceeding anonymously for her protection, regularly receives law enforcement notices when images of her abuse are discovered, including instances in which the material was stored on iCloud through a MacBook in Vermont.
The legal action seeks to represent approximately 2,680 victims and argues that Apple's products were defective because they failed to deliver the promised protections. With federal law setting minimum damages of $150,000 per victim and allowing for triple damages, Apple could face liability exceeding $1.2 billion if found responsible.
Through her attorney Margaret Mabie, the plaintiff expressed that "Apple broke its promise to protect victims like me when it eliminated the CSAM-scanning feature from iCloud, allowing for extensive sharing of these horrific materials."
Apple spokesperson Fred Sainz responded by stating that "Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk." The company maintains it focuses on combating these crimes while preserving user privacy, pointing to its existing nudity detection feature aimed at protecting younger users.
The case highlights the ongoing challenge tech companies face in balancing user privacy with child safety measures. The outcome could influence how digital platforms approach content moderation and user protection in the future.