Artificial intelligence deepfakes are easier to make than ever — and our sense of reality and democracy are at risk
In 2023, more deepfake abuse videos were shared than in every other year in history combined, according to an analysis by independent researcher Genevieve Oh. What used to take skillful, tech-savvy experts hours to Photoshop can now be whipped up at a moment’s notice with the help of an app. Some deepfake websites even offer tutorials on how to create AI pornography.
What happens if we don’t get this under control? It will further blur the lines between what’s real and what’s not — as politics become more and more polarized. What will happen when voters can’t separate truth from lies? And what are the stakes? As we get closer to the presidential election, democracy itself could be at risk. And, as Ocasio-Cortez points out in our conversation, it’s about much more than imaginary images.
“It’s so important to me that people understand that this is not just a form of interpersonal violence, it’s not just about the harm that’s done to the victim,” she says about nonconsensual deepfake porn. She puts down her spoon and leans forward. “Because this technology threatens to do it at scale — this is about class subjugation. It’s a subjugation of entire people. And then when you do intersect that with abortion, when you do intersect that with debates over bodily autonomy, when you are able to actively subjugate all women in society on a scale of millions, at once digitally, it’s a direct connection [with] taking their rights away.”
Unfortunately, I don't know that there's much to be done at this point. Even if every form of deepfakery were outlawed in the U.S., people would just do it through another country that allows it, hiding their tracks with a VPN.
The only way to even come close to truly combating this would be an international treaty. And even then, I think it's highly unlikely that every nation would sign on, so people would just route through Belarus or something.
Even detection tools won't do the trick, because, just as with malware, it will be a never-ending arms race between the detectors and deepfake generators built to evade them.
At best, we can stop the wound from gushing so much.
Yeah, we should have taken it slow with these tools and understood them first. The cat is out of the bag now; I can only imagine the extent of the political manipulation that will eventually happen.
If someone completely independently generates and distributes pornography that happens to look too much like a real person, and someone else downloads and keeps that image, should the downloader be prosecuted? That's what it's going to come down to, I think. A law that requires intent would be too difficult to prove, and a law that doesn't would be a serious overreach.
It's easier to write the law for CSAM because you have to be pretty fucked in the head to want to look at that in the first place. Making possession of it illegal isn't interfering with normal human activity.
You can't treat it that way, because this is something a complicit media is willing to share, and you can't stop them from sharing it without wading into major First Amendment violation territory.