Several months ago Beehaw received a report about CSAM (i.e., Child Sexual Abuse Material). As an admin, I had to investigate it in order to verify the report and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.
The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.
In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.
If the other admins want to give their opinions about this, then I am all ears.
I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.
This post seems highly reactive to me. I'm sorry to hear that you were exposed to such disturbing material, but I fail to see a true connection between that happening and using Lemmy as the platform. I absolutely agree that nobody should have to experience what you did, but I disagree with the proposed platform change.
I don't know of any software platform where that would not happen.
Even with a text-only platform people can still post URLs to unsafe content.
I think OP is referring to some kind of automated scanner, but I'm not sure there are publicly available ones. I guess using them would come at a cost, either computational or monetary. And even so, there can be false positives, so you would probably still have to check reports manually anyway.
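For what it's worth, the scanners that do exist (Microsoft's PhotoDNA, Cloudflare's CSAM Scanning Tool) work by hash matching: each upload is fingerprinted and compared against a database of known material maintained by clearinghouses like NCMEC. Here is a rough sketch of the idea only, using plain SHA-256 and a made-up hash list purely for illustration; real systems use perceptual hashes so that resized or re-encoded copies still match, and access to the actual hash databases is restricted:

```python
import hashlib

# Placeholder set standing in for a clearinghouse-provided hash list.
# (The single entry here is just the SHA-256 of empty input.)
KNOWN_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_known_match(file_bytes: bytes) -> bool:
    """Return True if the uploaded file's hash appears in the known-hash set."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

# An instance could run a check like this on every image upload and block or
# quarantine matches before the file is ever stored or shown to a moderator.
```

Even with something like this in place, anything that isn't already in the hash database would still reach moderators through user reports, which is why I don't think any platform choice fully solves the problem OP describes.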