Apple sued over abandoning CSAM detection for iCloud

techcrunch.com

Apple is being sued over its decision not to implement a previously announced system that would have scanned iCloud photos for child sexual abuse material (CSAM).

1 comment
  • Yes, CSAM is bad.

    However, false positives from scans can also destroy lives. While I wouldn't cry about Apple losing millions in false-positive-related lawsuits, it's simply not a good thing in this case.