Apple Sued by West Virginia for Allegedly Allowing CSAM Distribution Through iCloud
West Virginia’s Attorney General JB McCuskey today announced a lawsuit against Apple, accusing the company of knowingly allowing iCloud to be used to distribute and store child sexual abuse material (CSAM). McCuskey says that Apple has opted to “do nothing about it” for years.
“Preserving the privacy of child predators is absolutely inexcusable. And more importantly, it violates West Virginia law. Since Apple has so far refused to police themselves and do the morally right thing, I am filing this lawsuit to demand Apple follow the law, report these images, and stop re-victimizing children by allowing these images to be stored and shared,” Attorney General JB McCuskey said.
According to the lawsuit [PDF], Apple has internally described itself as the “greatest platform for distributing child porn,” yet it submits far fewer CSAM reports than peers like Google and Meta.
Back in 2021, Apple announced new child safety features, including a system that would detect known CSAM in images stored in iCloud Photos. After backlash from customers, digital rights groups, child safety advocates, and security researchers, Apple decided to abandon its plans for CSAM detection in iCloud Photos.
“Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all,” Apple said when announcing that it would not implement the feature.
Apple later explained that creating a tool for scanning private iCloud data would “create new threat vectors for data thieves to find and exploit.”
West Virginia’s Attorney General says that Apple has shirked its responsibility to protect children under the guise of user privacy, and that Apple’s decision not to deploy detection technology is a choice, not passive oversight. The lawsuit argues that because Apple has end-to-end control over its hardware, software, and cloud infrastructure, it cannot claim to be an “unknowing, passive conduit of CSAM.”
The lawsuit is seeking punitive damages and injunctive relief requiring Apple to implement effective CSAM detection measures.
Apple was also sued in 2024 over its decision to abandon CSAM detection. That lawsuit, filed on behalf of a potential group of 2,680 victims, said that Apple’s failure to implement CSAM monitoring tools has caused ongoing harm to victims, and it seeks $1.2 billion in damages.