West Virginia is suing Apple, alleging negligence over CSAM
The office of the West Virginia Attorney General announced Thursday that it has filed a lawsuit against Apple, alleging that the company "knowingly" allowed its iCloud platform "to be used as a vehicle for distributing and storing child sexual abuse material." The state alleges this went on for years without action from the tech giant "under the guise of user privacy."
In the lawsuit, the state repeatedly cites a text from Apple executive Eric Friedman, in which he calls iCloud "the greatest platform for distributing child porn" in a conversation with another Apple executive. These messages were first uncovered by The Verge in 2021 within discovery documents for the Epic Games v. Apple trial. In the conversation, Friedman says that while some other platforms prioritize safety over privacy, Apple's priorities "are the inverse."
The state further alleges that detection technology to help root out and report CSAM exists, but that Apple chooses not to implement it. Apple indeed considered scanning iCloud Photos for CSAM in 2021, but abandoned these plans after pushback stemming from privacy concerns.
In 2024, a group of more than 2,500 victims of child sexual abuse sued Apple on nearly identical claims, alleging that the company's failure to implement these features led to their harm as images of them circulated through its servers. At the time, Apple told Engadget, "Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."
The case in West Virginia marks the first time a government body has brought such an action against the iPhone maker. The state says it is seeking damages as well as injunctive relief that would compel Apple to implement effective CSAM detection measures. We have reached out to Apple for comment on the suit and will update this story if we hear back.