A 27-year-old woman, a victim of sexual abuse as a child, has just filed a lawsuit against Apple. The tech giant is accused of breaking its promise to protect victims by abandoning its child pornography detection tool, allowing illegal content to proliferate on its servers. The company, already shaken by several legal battles this year (the Spotify affair and the hefty fine over its App Store practices), now finds itself in an entirely different legal dimension.
iCloud, a refuge for illegal images
The story begins in the plaintiff's infancy, when she was abused by a family member who photographed the assaults and shared the images online. The man was arrested after offering these images in a chat room, then sentenced to prison. The victim's ordeal did not end there, however: for over a decade, she has received near-daily notifications informing her that her photos have been discovered on new devices, notably via iCloud.
In 2021, a new report informed her that her images had been found on the MacBook of a man in Vermont, also stored on iCloud. This discovery came shortly after Apple announced, and later abandoned, NeuralHash, its system for detecting child sexual abuse images. The system was supposed to scan photos stored on iCloud and compare them against a database of images referenced by the National Center for Missing & Exploited Children (NCMEC), an American organization that fights child abduction and sexual exploitation by assisting families, working with law enforcement and raising public awareness.
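Apple never published a reference implementation of NeuralHash, which relied on a perceptual neural hash rather than an exact cryptographic one, so the following Python sketch only illustrates the general principle: compute a fingerprint for each photo and compare it against a list of known fingerprints such as the one NCMEC maintains. The directory name and hash value below are placeholders, not real data.

```python
import hashlib
from pathlib import Path

# Placeholder set of fingerprints of known abusive images.
# In practice, NCMEC distributes such a reference list to partner companies;
# the value below is an arbitrary example, not a real hash.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's bytes (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_library(photo_dir: Path) -> list[Path]:
    """Return the photos whose fingerprint appears in the reference database."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if file_hash(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # Hypothetical local folder standing in for a synced photo library.
    for hit in scan_library(Path("photos")):
        print(f"Match found: {hit}")
```

Note that an exact hash only matches byte-identical files; the appeal of a perceptual hash like NeuralHash's was precisely that it could flag re-encoded or slightly altered copies, which is also what made researchers worry about false positives and misuse.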
However, cybersecurity specialists soon raised the specter of this technology being exploited for state surveillance. Apple therefore suspended the deployment of NeuralHash, deeming it impossible to reconcile systematic content scanning with the absolute preservation of user privacy.
Thousands of alleged victims
Marsh Law, a firm with 17 years of experience defending victims of sexual abuse, is leading the charge. Firm partner Margaret E. Mabie has documented more than 80 cases involving Apple products, including that of a San Francisco-area man who stored over 2,000 illegal images and videos on iCloud.
The figures do not favor the Cupertino company: while Google and Facebook each report more than a million cases to NCMEC, Apple declared only 267. This disparity had already alarmed Eric Friedman, the Apple executive responsible for fraud protection, who admitted in 2020: "We are the largest distribution platform for child pornography content." A shocking statement suggesting that, at Apple, data protection takes precedence over safety.
The suit, filed in federal court in Northern California, could represent 2,680 victims. With minimum damages of $150,000 per victim, potentially tripled, the bill could exceed $1.2 billion if Apple is found liable. Apple spokesperson Fred Sainz defended the company's position as follows: "We strongly condemn content involving child sexual abuse and are committed to fighting predators who prey on children. We are constantly developing new solutions to combat these crimes, while respecting the privacy and security of all our users."
For the plaintiff, who lives in the northeastern United States, this legal action carries considerable personal risk, since media coverage of the trial could trigger a resurgence in the sharing of her images. Nevertheless, she persists, believing that Apple must take responsibility after giving victims false hope by introducing and then abandoning NeuralHash. If the allegations are proven, this case could force Apple to fundamentally rethink how it balances privacy protection against the fight against illegal content.
- A woman is suing Apple for abandoning its NeuralHash tool, allowing sexual abuse images to proliferate on iCloud.
- Apple has reported far less illegal content than Google or Facebook, drawing criticism over its priorities between privacy and security.
- The lawsuit could represent 2,680 victims and cost Apple more than $1.2 billion in damages.