Deepfake Porn Is Out of Control
New research shows the number of deepfake videos is skyrocketing—and the world's biggest search engines are funneling clicks to dozens of sites dedicated to the nonconsensual fakes.
So, I think it's worth identifying why nonconsensual pornography is considered a bad thing, and why that is different from what you're saying here.
Deepfake pornography is incredibly invasive for the people targeted (who are not just famous people). Your image is out there doing things you would be horrified to have on camera. It can destroy people's health and cause huge problems, especially for people who are already being harassed by others.
It's not pearl-clutching. It's a genuinely damaging technology that benefits no one.
By default, creating and publishing "deepfake porn" of a real person constitutes defamation against that person, as it carries the false statement "this person posed for this picture" which is likely to cause that person harm. Often, the intention is to cause harm.
As such, we don't need new laws here. Existing laws against defamation just need to be applied.
The existence of deepfake tech should help everyone get to the point where nobody gives a shit. It should still be punished criminally, but the most punishing aspect should be how stupid it is to get involved with it when literally anyone can do it and it's basically impossible to prove it's the real deal. What is the value any longer?
Look, folks, it's the Simone Biles of mental gymnastics. You have some serious growing up to do if that's your argument. Just because it's potentially fake doesn't make it any less of an invasion of privacy. So your argument is that no one should give a shit about privacy, not even their own?