A Florida man is facing 20 counts of obscenity for allegedly creating and distributing AI-generated child pornography, highlighting the danger and ubiquity of generative AI being used for nefarious reasons.
Phillip Michael McCorkle was arrested last week while working at a movie theater in Vero Beach, Florida, according to TV station CBS 12 News. A crew from the station captured the arrest, which made for dramatic footage as law enforcement led McCorkle, still in his work uniform, away from the theater in handcuffs.
There is a difference between something immediately identifiable as a drawing and something almost photorealistic. If a generated image is indistinguishable from a real photo, it should be treated the same.
I don't advocate for either, but it should NOT be treated the same. One doesn't involve a real child being involved and traumatized. I'd rather a necrophiliac make AI-generated pics instead of... you know.
Not a great comparison, because unlike with violent games or movies, you can't say there is no danger to anyone in allowing these images to be created or distributed. If they are indistinguishable from the real thing, it becomes impossible to identify actual human victims.
There's also a strong argument that the availability of imagery like this only encourages behavioral escalation in people who suffer from the affliction of being a sick fucking pervert pedophile. It's not methadone for them, as some would argue. It's just fueling their addiction, not replacing it.
Well, the image generator had to be trained on something first in order to spit out child porn. While it may be that the training set was solely drawn/rendered images, we don't know that, and even if the output were in that style, it might very well be photorealistic images generated from real child porn and run through a filter.
Wild corn dogs are an outright plague where I live. When I was younger, me and my buddies would lay snares to catch corn dogs. When we caught one, we'd roast it over a fire to make popcorn. Corn dog cutlets served with popcorn from the same corn dog is a popular meal, especially among the less fortunate, even though some of the affluent consider it equivalent to eating rat meat. When me pa got me first rifle when I turned 14, I spent a few days just shooting corn dogs.
Yes, exactly. People who excuse this with "well, it was trained on all public images" are just admitting you're right, and that there is a level of harm here, since real materials are used. Even if they weren't being used, or if it were just a cartoon, the morality is still shaky because of the role porn plays in advertising. We already have laws about advertising because it's so effective, including around cigarettes and prescriptions. Most porn, ESPECIALLY FREE PORN, is an ad to get you to buy other services. CP is not excluded from this rule - no one gets a free lunch, so to speak. These materials are made and hosted for a reason.
The role that CP plays in most countries is difficult. It is used for blackmail. It is also used to generate money for countries (intelligence groups around the world host illegal porn ostensibly "to catch a predator," but then why is it morally okay for them to distribute these images but no one else?). And it's used as advertising for actual human trafficking organizations. And similar organizations exist for snuff and gore btw. And ofc animals. And any combination of those 3. Or did you all forget about those monkey torture videos, or the orangutan who was being sex trafficked? Or Daisy's Destruction and Peter Scully?
So it's important to not allow these advertisers to combine their most famous monkey torture video with enough AI that they can say it's AI generated, but it's really just an ad for their monkey torture productions. And even if NONE of the footage was from illegal or similar events and was 100% thought of by AI - it can still be used as an ad for these groups if they host it. Cartoons can be ads ofc.
How was the model trained? Probably using existing CSAM images. Those children are victims. Making derivative images of “imaginary” children doesn’t negate the exploitation of children all the way down.
So no, you are making false equivalence with your video game metaphors.
A generative AI model doesn't require the exact thing it creates in its datasets. It most likely just combined regular nudity with a picture of a child.
That's not really a nuanced take on what is going on. A bunch of images of children are studied so that the AI can learn how to draw children in general. The more children in the dataset, the less any one of them influences or resembles the output.
Ironically, you might have to train an AI specifically on CSAM in order for it to identify the kinds of images it should not produce.
That's a whole other thing than the AI model being trained on CSAM. I'm currently neutral on this topic, so I'd recommend replying to the main thread.
It's not CSAM in the training dataset, it's just pictures of children/people that are already publicly available. That puts this on the copyright side of AI issues rather than the illegal-training-material side.
It’s images of children used to make CSAM. No amount of mental gymnastics can change that, nor the fact that those children’s consent was not obtained.
Why are you trying so hard to rationalize the creation of CSAM? Do you actually believe there is a context in which CSAM is OK? Are you that sick and perverted?
Because it really sounds like that’s what you’re trying to say, using copyright law as an excuse.
It's every time with you people, you can't have a discussion without accusing someone of being a pedo. If that's your go-to that says a lot about how weak your argument is or what your motivations are.
You're just projecting your unwillingness to ever take a stance that doesn't personally benefit you.
Some people can think about things objectively and draw a conclusion that makes sense to them without personal benefit being a primary determinant of said conclusion.
I'm not neutral about child porn, I'm very much against it, so stop trying to put words in my mouth. I'm saying this kind of use of AI could fall into the very same category as loli imagery, since it is not real child sexual abuse material.
I just hope that the models aren't trained on CSAM, which would make generating stuff they can fap to ""ethically reasonable,"" since no children would be involved. And I hope that those who have those tendencies can be helped in a way that doesn't involve chemical castration or incarceration.
Fantasising about sexual contact with children indicates that this person might groom children for real, because they have a sexual interest in doing so. As someone who was sexually assaulted as a child, it's really not something that needs to happen.
Seems like fantasizing about shooting people or carjacking or such would then indicate that person might do that activity for real too. There are a lot of carjackings nowadays, and you know GTA is real popular. mmmm. /s But seriously, I'm not sure your first statement has merit, especially when you look at where to draw the line: anime, manga, oil paintings, books, thoughts in one's head.
If you're asking whether anime, manga, oil paintings, and books glorifying the sexualization of children should also be banned, well, yes.
This is not comparable to glorifying violence, because real children are victimized in order to create some of these images, and the fact that it's impossible to tell makes it even more imperative that all such imagery is banned, because the existence of fakes makes it even harder to identify real victims.
It's like you know there's an armed bomb on a street, but somebody else filled the street with fake bombs, because they get off on it or whatever. Maybe you'd say making fake bombs shouldn't be illegal because they can't harm anyone. But now suddenly they have made the job of law enforcement exponentially more difficult.
Sucks to be law enforcement then. I'm not giving up my rights to make their jobs easier. I hate hate HATE the trend towards loss of privacy and the "if you didn't do anything wrong then you have nothing to hide" mindset. Fuck that.
If you want to keep people who fantasise about sexually exploiting children around your family, be my guest. My family tried that, and I was raped. I didn't like that, and I have drawn my own conclusions.
Yeah, and the same goes if you want to keep people who fantasize about murdering folks around. You can't say one thing is a problem without saying the other is. I'm sorry you were raped, but I doubt it would have been stopped by banning Lolita.
You can call it a strawman, but whether the evil is killing folks or raping folks, the effect should be the same when discussing the non-actual versus the actual. You can say this thing is a special case, but when it comes to freedom of speech - which covers anything not based in actual events: writing, speaking, thinking, art - special circumstances become a real slippery slope (which can itself be called a fallacy, though like all "fallacies" that depends a lot on what backs them up and how they are presented).