There have been a ton of CSAM and CP arrests in the US lately, especially among cops and teachers, along with at least one female teacher seducing boys as young as 12. I cannot understand the attraction to kids. Even teens. Do these people think they are having a relationship, or that it is somehow okay to take away another human being's innocence? Is it about sex, power, or WTH is it? Should AI-generated CSAM and CP be treated the same as material involving a real person, since it promotes the same issues? I am a grandfather, and I am worried about how far this will go now that the new AI can put anyone's face into a porno movie too.
It seems to me that a whole new set of worldwide guidelines and laws needs to be put into effect ASAP.
How difficult would it be for AI photo apps to filter out certain words, so that someone cannot make anyone appear naked?
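From what I understand, the basic word filter itself is the easy part; the hard part is that requests can be rephrased endlessly. As a rough illustration only (the term list and function name here are made up, not any real app's code), a naive prompt denylist could look something like this:

```python
# Naive prompt denylist - a rough sketch only, not how any real service actually works.

BLOCKED_TERMS = {"naked", "nude", "undress", "undressed"}  # illustrative, far from complete

def is_prompt_allowed(prompt: str) -> bool:
    """Reject the prompt if any blocked term appears as a word in it."""
    words = prompt.lower().split()
    return not any(term in words for term in BLOCKED_TERMS)

print(is_prompt_allowed("a portrait of my grandfather fishing"))  # True
print(is_prompt_allowed("make this photo of her naked"))          # False
```

The catch is that a list like this is trivially dodged with misspellings, synonyms, or other languages, which is why serious services generally pair prompt checks with trained classifiers that also screen the generated images.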
AI-generated CSAM should absolutely be treated the same. The model has been trained on images of real human children. I'm not sure where the attraction comes from; I would imagine power. I'd need to check peer-reviewed work from those in the field, but I honestly can't stomach it.
It used to be nothing for parents to take pictures of their kids playing in the bath. Parents have been convicted and lost their children for it, though.
I am not an expert in any field relating to any of this by any means, but we can all agree that CSAM is unequivocally reprehensible. Thus, many people will have severe issues with anything that normalizes it even remotely. That would be my knee-jerk response, anyway.
Well, maybe we shouldn't base our decisions on knee-jerk responses.
Imo, if nobody's being hurt, then it's none of our business. If it helps these people deal with their urges without actually hurting anyone, then I think that's unquestionably a good thing.
If it is in fact helping them, yes. It would be ideal to study how it affects their self-control before going that direction, though, as some argue it would do the opposite.
I don't have enough information to have an opinion, and I do agree with you that knee-jerk reactions are not ideal. But choosing to allow it (at a time when AI-generated media is starting to be regulated) is also a decision.
It almost certainly "helps" as many of these people as it encourages. Hedonic adaptation is a phenomenon common to all humans: a person indulging heavily in something that makes them feel good needs more and more extreme versions of it to maintain the same baseline of satisfaction. Any harmful compulsion, when indulged, will fall victim to this effect.
Providing virtual explicit images of children might mollify some, but it will have an inflaming effect on just as many others, who will seek out increasingly realistic or visceral imagery, up to and including looking for real photos and/or exploiting real children. That in turn ensures a market for child exploitation.
Yes, but it's wrong for very different reasons and severities. Murder vs murder porn, if you will. Both are bad and gross, but different, and that matters.
But that's irrelevant to my question, which no one actually answered.
I am curious about people's take on the difference between human creativity from memory vs AI "creativity" from training. The porn aspect is only relevant in that it's an edge case that makes the debate meaningful.
Under current rulings you can't copyright purely AI-generated art, but you can copyright art that's based on a person's combined experiences. That seems arbitrary to me, and I'm trying to understand it better.
If an artist is drawing naked children and it isn't for the sake of a book or something of a similar nature, there is a problem. This is also a disingenuous comparison: an artist hasn't been trained on hundreds to millions of children's images and then fine-tuned. There's a lot of illegal content these models come across, and they are then, hopefully, tuned by human hands to remove it. So try another example.