A hacked database from AI companion site Muah.ai exposes people's particular kinks and fantasies they've asked their bot to engage in. It also shows many of them are trying to use the platform to generate child abuse material.
I actually don't think this is shocking or something that needs to be "investigated." Other than the sketchy website that doesn't secure users' data, that is.
Actual child abuse / grooming happens on social media, chat services, and in local churches. Not in a one-on-one between a user and an LLM.
See how they hate pedophiles and not child rapists.
The crowd wants to feel its power by condemning (and lynching if possible) someone.
I'd rather investigate those calling for "investigation" and further violation of the privacy of people who, for all we know, have committed no crime.
That's about freedom of speech and yelling "fire" in a crowded theater and Thousand Hills Radio, you know the argument.
Why tf are there so many people okay with people roleplaying child sexual abuse AT ALL??? Real or fake, KEEP AN EYE ON ALL OF THEM.
I don't care if it's a real child or a fucking bot, that shit is disgusting, and AI is how some pedos are able to generate CP of children without having to actually get their hands on children.
The fact someone will look at this and go "Yea but what about the REAL child rapists huh??" is astounding. Mfcker, if a grown-ass adult is willing to make a bot that is prompted to act like a horny toddler, then what exactly is stopping them from looking at real children that way?
Keep in mind, I'm not talking about lolicon, fuck that. I'm talking about people generating images of realistic or semi-realistic children to use as profiles for sex bots. I'm talking about AI.
I'VE ACTUALLY SEEN PEOPLE DO THIS; someone actually did this with my character recently. They took the likeness of my character and began generating porn with it using prompts like "Getting fcked on the playground, wet psy, little girl, 6 year old, 2 children on playground, slut..."
Digital or not, this shit still affects people; it affects people like me. These assholes deserve to be investigated for even attempting this kinda shit on the clear net.
And before you ask, the character that belonged to me looks really young because I look really young. I have severe ADHD, which makes me mentally stunted or childish, and that gets reflected in my OCs and fursonas. This person took a persona, an extension of me PERSONALLY, lowered her age on purpose, and made porn of her. That fuckin' hurts, dude. Especially after speaking about how close these characters are to me. I'm aware it could be a troll, but honest to god, the prompt they used was demonstrably specific and detailed.

Some loser online drawing Kanna's feet hurts me way less than someone using AI to generate faux CP and then roleplay with those same bots or prompts. What hurts me more is that there are no restrictions on some AIs to stop people from generating images like this. I don't wanna see shit like this become commonplace or "fine" to do. Keep tabs on individuals like this, 'cause they VERY WELL could be using the likeness/faces of REAL children for AI CP, and that's just as bad.
See, imo this is the exact kind of thinking that makes pedophilia dangerous. Most people would agree that being attracted to children is a mental illness. Most people would agree that mental illnesses should be treated by a knowledgeable professional. But pedophilia is so stigmatized that someone even admitting they have a problem, one I very much doubt most of them want to have, has people calling for them to be drawn and quartered, regardless of whether they've ever actually hurt anyone.
Do I like that there's art and writing of people having fantasies about children? No, of course not. But making it impossible for people to have a safe outlet, to even talk about it with a medical professional for fear of imprisonment, death threats, or worse, makes it so these people can't even get the help they need. It's like teaching abstinence-only sex ed. You're trying to get people to stop having fantasies by burying them, but it only exacerbates the issue.
Edit: lol got your downvote less than five minutes in and the whole comment edited to just say "pedophiles bad." I guess I, as you like to put it, "hit a nerve."
As a fiction writer, I do research on espionage, sabotage and even methods of assassination all the time. But I'm not going to make a salted bomb and nuke Jerusalem even though that is an entirely viable evil plot that may even create a net negative death toll.
True poisoners, as Agatha Christie notes, use thallium, not arsenic or cyanide (though ricin is good if you can get it). Thallium assassins are also self-regulating, like demolitions experts, killing off those insufficiently careful when handling the stuff.
Modern police are lazy until enough of a stink is made to find a culprit for a specific incident, which is why modern assassins targeting VIPs will find a self-radicalized desperado and point them toward the target. This is the sort of thing the FBI is looking for in the investigations of Crooks and Routh. They are likely just blue suicides (or green suicides in this case), but finding an operative pointing them toward Trump would indicate an actual plot. But even if those tracks are found, they would be unlikely to lead to a specific identity.
I know about this not just to write fiction, but also to understand how things happen, how our fall into one-party autocracy and societal collapse plays out. And yes, it means I do a lot of web searches that might excite an onlooking behavioral research agent. Sadly, they'd find I'm yet another boring false positive, though if the nation does succumb to autocracy, I'd certainly write for the resistance. The FBI may not care so much about that.
We often look up creepy things just to see if we can, and we do that far more often than because we're eager to build a bomb or fast-track our inheritance.
I missed the original comment and this discussion now makes no sense. Why would you edit the content of your comment when you don't care about the points or the outrage?
Wait… so you mean to tell me that predatory simps are using AI incorrectly? Man… If only someone could have called this years ago, something could have been done to minimize it!
Who knew that unchecked growth could lead to negative results?!
A bit off topic... But from my understanding, the US currently doesn't have a single federal agency that is responsible for AI regulation... However, there is an agency for child abuse protection: the National Center on Child Abuse and Neglect within the Department of Health and Human Services.
If AI girlfriends generating CSAM is how we get AI regulation in the US, I'd be equally surprised and appalled
Ain't that what the tools are there for? I mean, I don't like CP and I don't want to engage in any way with people who like it. But I use those LLMs to describe fantasies that I wouldn't even talk about with other humans.
As long as they don't do it on real humans, nobody is hurt.
The problem with AI-generated CP is that if it's legal, it opens a new line of defense for actual CP. You would need to prove the content is not AI-generated to convict real abusers. This is why it can't be made legal; it needs to be prosecuted like real CP to make sure actual abusers can be convicted.
This is an incredibly touchy and complicated topic, so I will try not to go much further into it.
But prosecuting what is essentially a work of fiction seems bad.
This isn't even a topic new to AI.
CP has been widely represented in both written and graphical media. And the consensus in most free countries is not to prosecute those, as they are works of fiction.
I cannot see why AI-written CP fiction is different from human-written CP fiction.
I suppose "AI big bad" justifies it for some. But for me, if we are going to begin prosecuting works of fiction, there should be a logical explanation for why some will be prosecuted and others will not. Especially when the one being prosecuted is just regurgitating the human-written stories about CP that are not being prosecuted nowadays.
I essentially think that a work of fiction should never be prosecuted to begin with, no matter the topic. And I also think that an AI writing about CP is no worse than an actual human doing the same thing.
As long as they don't do it on real humans, nobody is hurt.
Living out the fantasy of having sex with children (or other harmful sexual practices and fantasies) with AI or the like can strengthen the wish to actually do it in RL. It can weaken the resolve to abstain. If you constantly have fantasies where, for example, "the child AI wanted it too," it can desensitize you and make it harder and harder to push that thought aside when in a tempting situation. Instead of replacing the real thing with a fantasy, you are preparing for the real thing. Some pedophiles already interpret children's behavior as sexual when it isn't at all, but the AI might be told to act that way and strengthen those beliefs.
This is still something that has to be studied more to be fully understood. Of course, this is difficult because of the stigma. There might be differences between people who are only attracted to children and ones who are attracted to both adults and children, and there is just not enough data yet. But even in the communities where pedophiles who do not act on their fantasies discuss coping strategies, this is heavily debated and controversial.
As a man who was descending into a dark place when I was a teen, I can say this with confidence:
This kind of content, like CP or r*pe-y stuff, even if clearly not real and only a fantasy, feeds these desires and makes them grow. In time, if you continue to foster it, they will bleed into real life, and then it becomes a real problem. That's why this kind of stuff is scary.
Thankfully, I was able to spot this pattern before it became a problem. This is a dangerous slippery slope.
Wrong. Allowing these desires to fester, or, as you suggest, actively seeking out fulfillment for them, is not good for anyone. It's not good for the pedophiles, because it will increase the need to fulfill their illegal desires, and it won't help kids, obviously, because it emboldens pedophiles.
Have you ever experienced something you like and said to yourself, "definitely not doing more next time"?
The problem with that argument is that you can translate it to games, movies, books, and basically everything. What if a person isn't satisfied by killing people in PC games? What if they take it into real life?
That argument is only valid for people who can't differentiate between reality and fiction. And usually those people need medical help.
Paywall.
That site frankly does not even look legit, and looking at the plethora of other AI sites, I don't know who would use this one. It's not even displaying correctly and has like 0 information on anything. If I were to stumble upon that site I'd think it was shady as hell.
You are almost definitely getting downvoted because it sounds like you're saying 404media is not legit. I realize that you're not, but I'll admit I interpreted it incorrectly at first.
It is an independent publication for critical tech journalism, founded by ex-Motherboard journalists.
Only some of their stuff is paywalled; some only requires a free account.
Agreed on their site, but their journalism is usually pretty good.
404media is one of the best independent journalistic sites available. They started one year ago, so they aren't especially well known, but they are a four-person org that came from places like Vice.com and wanted to get out of the corporate bullshit. You should check out their non-paywalled articles.
This is a weird one, because while fantasy is fantasy, and doesn't necessarily indicate an intention to act on anything, these people were dumb enough to share these specific fantasies with some random AI porn site. That's got to be an indicator of poor impulse control, right?
That alone should probably warrant immediate FBI background checks, or checks by whichever agencies have jurisdiction over these types of criminal investigations in each user's locality.
Of course, I'm saying this without actually having read any of the chats. So it's possible my opinion would change from "this should be investigated" to summary executions and burning the bodies for good measure... but no way I'm reading those fucking chats.
No, I am saying that sharing fantasies about underage children with a shady and poorly designed AI porn site shows a serious lack of judgement and impulse control.
For that reason, yeah, they probably deserve a quick review of their life to make sure that's the only poor choice they've made in regard to that particular fantasy.
And they weren't just reading; they were prompting the LLM to generate these specific fantasies. They didn't just come across a fucked-up website and read a few forum posts.
Is that an indicator of poor impulse control? Really? Finding some shady back-of-the-internet AI site to put some weird fantasy prompts into to get themselves off? Seems pretty calculated to me. They can't put it somewhere legitimate where content is moderated and policed. Seems like pretty sound logic to me.
Don't get me wrong, these people are sick. If that's what they are into, then there's something wrong, but instead of targeting real kids like so many people actually do, you know, like Hollywood, celebrities, musicians, the Catholic Church, etc., they are entering prompts and reading stories. Sounds like impulse control to me.