Deepfake Porn Is Out of Control

New research shows the number of deepfake videos is skyrocketing—and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.
This is a sad article to read. I'm not a woman, nor am I a young adult growing up with all this technology that can be leveraged against me. Could you imagine being a junior high or high school student and having an anonymous classmate create deepfake porn of you from your yearbook photo? And the children in your class gossiping about you, sharing the fake video or photo online with their friends, while you endure that harassment? The damage that excessive pornography does to psychological development is already well documented; now imagine the consumers of this content sitting in class right next to the victim. That harassment can get so much worse.
I can't even begin to fathom what kind of psychological damage this will cause to the youth. I feel for women everywhere - this is a terrible thing people are doing with this technology. I can't imagine raising a daughter in this environment and trying to help her navigate this problem when some asshole creates deepfake porn of her. My niece is currently getting bullied in school - what if her bullies use these tools against her? This just makes my blood boil.
It's bad enough that, since social media rose and captured the attention spans of kids and teenagers, there has been a well-documented correlation with youth suicide rates, which have been climbing since 2009. https://www.health.com/youth-suicide-rate-increase-cdc-report-7551663 . Now there's a nonconsensual AI-generated porn era to navigate on top of it.
These are dangerous times. This opens people up to attack, and regulation that increases the friction of accessing these tools is one of the most important next steps. Granted, outright bans never work (the persistent will always get their hands on it), but we need controls in place to limit access. Then we can remediate the root causes of these problems (e.g., proper systemic education, including a modernized sexual education curriculum in schools that addresses things like consent).
EDIT:
Wanted to also add, after I posted this, that a prevalent argument I hear parroted by people is this:
People are gonna do this AI generation anyway. It'll get to the point that you won't be able to tell what's real or not, so women can just deny it. You can't prove it's real anyway, so why bother?
This is another way of saying "boys will be boys" and ignoring the problem. The problem is harassment and violence against women.
After some testing, it might be that the parent commenter just deleted their comment, which nuked all the child comments. I can't remember if this is what Reddit does. I think it just says "Deleted by creator" but keeps the children. Could certainly be wrong, though.
Yup, it appears our entire comment chain got nuked. So it's now confirmed: if you delete the parent, all the children get removed as well.
For anyone reading this, the context is that we tested it: I replied to OP's previous comment, OP responded to me, and then I deleted my comment to see if their reply got deleted too.
This is another way of saying "boys will be boys" and ignoring the problem.
I don't think that's at all similar. "Boys will be boys" is "we know it's bad, but we can't stop them."
The argument is... is it really bad? After all, isn't it the "scandal" that really causes the damage? It's not like any harm is directly done to the person, someone could've already done this to me, and well, I wouldn't be any the wiser. It's when they start sharing it and society reacts as if it's real and there's something scandalous that there's a problem.
If we stop considering it scandalous... The problem kind of goes away... It's not much different than AI photoshopping a hat on someone that they may or may not approve of.
This opens persons up for attack, and regulation to increase friction to access these tools is one of the next most important steps to take.
I've never researched these tools or used them... But I'd wager that's going to be next to impossible. If you think the war on drugs was bad... A war on a particular genre of software would be so much worse.
Like a lot of things... I think this is a question of how do we adapt to new technology not how do we stop it. If I actually believed this was stoppable, I might agree with you... But it actually seems more dangerous if we try and make the tools hard to obtain vs just giving people plausible deniability.
You mentioned bullying, and I'm definitely empathetic to that. I don't know that this would really make things worse vs. the "I heard Katie ..." rumor crap that's been going on for decades. Feminism has argued for taking the power away by removing the taboo around women having sex lives, and that seems equally relevant here.
Either way, it really seems like a lot more research is needed.
"Just stop considering it scandalous" is a severe lack of imagination. Even if/when the stigma of "having a sex life" is gone, the great majority of people consider their sex life to be private. Video floating around that looks like you having sex is a very different thing to hearsay rumors.
Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying. This isn't a problem society can shrug away by saying sex should be less stigmatized.
Keep in mind that the exact same techniques could be used to sabotage adult relationships, marriages, careers, just as easily as teenage bullying.
And a "video" should ruin those things why?
Literally everything you listed is because society is making a big stink of things that don't matter.
Why should your job care ... even if it's real?
If somebody didn't cheat, and there's no reason to believe they did other than a... suspiciously careless video of someone who looks like them... why in the world should that end their relationship?
Not to mention, AI isn't going to get the details right. It will get the gist right, but anyone who has actually seen you naked will presumably be able to spot details that are off.
Also in terms of privacy, your privacy wasn't violated. Someone made a caricature of you.
Video floating around that looks like you having sex is a very different thing to hearsay rumors.
It's really not. The only reason it seems different is that video has been trustworthy for the past century, and now it's not.
I hope you folks downvoting me have some magic ace up your sleeve, but I see no way past this other than through it. Just like when the atom bomb was invented, this technology exists now and we have to deal with it. Unlike the atom bomb, it's just a bunch of computer code, and at some point pretty much any idiot will be able to get their hands on convincing versions of it. Also unlike the atomic bomb, it can't actually kill you.