Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM).
However, false positives from such scans can also destroy lives. While I wouldn't shed a tear over Apple losing millions in false-positive-related lawsuits, that outcome simply isn't a good thing in this case.