Apple Sued for Knowingly Hosting Child Sexual Abuse Material on Its Products, Failing to Protect Survivors

Landmark lawsuit brought on behalf of thousands of survivors whose child sexual abuse images and videos (CSAM) were traded on Apple platforms

CUPERTINO, Calif., Dec. 8, 2024 /PRNewswire/ — This weekend, a class action lawsuit was filed against Apple on behalf of thousands of survivors of child sexual abuse, alleging that the company knowingly allowed images and videos documenting their abuse to be stored on iCloud and on its defectively designed products. The lawsuit alleges that Apple has known about this content for years but has refused to act to detect or remove it, despite developing advanced technology to do so.

The plaintiffs are being represented by Marsh Law Firm. Additionally, Heat Initiative is providing some support for this lawsuit as part of the organization’s broader Ignite program, which provides legal and advocacy support for victims of child sexual abuse through referrals, research, and funding to empower them to use their voices and hold technology companies accountable.

The images and videos of the plaintiffs’ childhood sexual abuse, which have been stored thousands of times, would have been identified and removed had Apple implemented its 2021 “CSAM Detection” technology. However, Apple terminated the program after its announcement. Other leading technology providers have been proactively detecting and reporting illegal child sex abuse images and videos for more than a decade. Apple’s belated efforts, and subsequent cancellation, leave it among the very few major platforms that do not engage in proactive detection and removal.

The full complaint, as well as a fact sheet and other supporting materials, can be found HERE.

“The knowledge that images of my abuse are still out there is a never-ending nightmare – Apple could have stopped this, but has chosen not to act,” said Jane Doe, a plaintiff in the lawsuit. “Apple has the technology to stop this from continuing, yet they knowingly turn a blind eye. This isn’t just about my story – it’s about standing up for every survivor who deserves safety and dignity. Apple has a responsibility to protect us, and I’m here to demand that they fulfill it.”

“Today, thousands of brave survivors are coming forward to demand accountability from one of the most successful technology companies on the planet. Apple has not only rejected helping these victims, it has advertised the fact that it does not detect child sex abuse material on its platform or devices, thereby exponentially increasing the ongoing harm caused to these victims,” said Margaret E. Mabie, Partner at Marsh Law Firm, representing the plaintiffs. “Our clients have endured unimaginable abuse, and yet Apple’s top executives continue to ignore their pleas, fully aware that this illegal contraband remains on their platform. By abandoning their state-of-the-art detection program without offering an alternative, Apple has chosen to prioritize its own corporate agenda over the lives and dignity of survivors. This lawsuit is a call for justice and a demand for Apple to finally take responsibility and protect these victims.”

“Apple wants people to think they are the ‘responsible’ tech company, and this lawsuit demonstrates clearly that, on this issue, they are not,” said Sarah Gardner, Founder and CEO of the Heat Initiative, an organization dedicated to encouraging leading technology companies to combat child sex abuse on their platforms. “The plaintiffs and countless other survivors of child sexual abuse are forced to relive the worst moments imaginable because Apple refuses to implement common sense practices that are standard across the tech industry. They will argue that this is a privacy issue, but they are failing to acknowledge the privacy and basic humanity of the children being raped and sexually assaulted in the videos and images Apple stores on iCloud.”

In August 2021, Apple announced it would implement a new “CSAM Detection” feature, which would have identified known child sexual abuse material in iCloud using NeuralHash, a hashing technology Apple developed. After the announcement, however, Apple executives reversed course and ultimately killed the program. Meanwhile, in 2023, five major tech companies collectively reported more than 32 million pieces of child sexual abuse material on their platforms; Apple reported only 267.
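At a high level, detection of known abuse material of the kind described above works by computing a compact fingerprint (hash) of each file and checking it against a database of fingerprints of previously identified illegal content. The sketch below is a generic illustration of that matching pattern only; it is not NeuralHash or any Apple code, and the perceptual_hash function and known_hashes set are hypothetical placeholders.

```python
# Generic sketch of hash-based matching against a set of known fingerprints.
# Not NeuralHash or any Apple implementation; `perceptual_hash` and
# `known_hashes` are hypothetical placeholders for illustration only.
import hashlib
from typing import Iterable, List, Set


def perceptual_hash(file_bytes: bytes) -> str:
    """Hypothetical stand-in for a perceptual hash.

    A real system would use a perceptual hash so that visually identical
    images map to the same value; an exact cryptographic hash is used here
    purely to keep the sketch self-contained.
    """
    return hashlib.sha256(file_bytes).hexdigest()


def find_known_matches(files: Iterable[bytes], known_hashes: Set[str]) -> List[str]:
    """Return the fingerprints of any files that appear in the known-hash set."""
    return [h for h in (perceptual_hash(f) for f in files) if h in known_hashes]
```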

On behalf of the plaintiffs, the lawsuit seeks injunctive relief requiring Apple to implement basic child safety measures. The claim that Apple was negligent and failed to fulfill its duty of care, resulting in harm to the plaintiffs, rests on two main factors:

1. Apple’s failure to implement any plan to detect illegal child sexual abuse images and videos, despite its full knowledge that this content exists on its platform, violates its duty of care.

2. If the “CSAM Detection” feature had been implemented, the plaintiffs’ illegal child sexual abuse images would have been detected and removed. Instead, they remain on iCloud and on devices today, continuing to cause the plaintiffs harm.

Marsh Law Firm focuses its legal practice exclusively on representing survivors of sexual abuse and online exploitation. They are a survivor-focused, trauma-informed, and justice-oriented law firm that advocates for clients both in and out of the courtroom to secure justice and hold perpetrators and the institutions that enable abuse accountable.

Heat Initiative is a collective effort of concerned child safety experts and advocates encouraging leading technology companies to combat child sexual abuse on their platforms. Heat Initiative sees a future where children’s safety is at the forefront of any existing and future technological developments. The Heat Initiative’s Ignite program catalyzes impact litigation to hold technology companies accountable to their duty to prevent and address the sexual exploitation of children on their platforms.

Contact: press@heatinitiative.org

View original content: https://www.prnewswire.com/news-releases/apple-sued-for-knowingly-hosting-child-sexual-abuse-material-on-its-products-failing-to-protect-survivors-302325571.html

SOURCE Heat Initiative
