Instagram is profiting from several ads that invite people to create nonconsensual nude images with AI image generation apps, once again showing that some of the most harmful applications of AI tools are not hidden in the dark corners of the internet but are actively promoted to users by social media companies unable or unwilling to enforce their policies about who can buy ads on their platforms.
Parent company Meta’s Ad Library, which archives ads on its platforms along with who paid for them and where and when they were posted, shows that the company has previously taken down several of these ads. Even so, many ads that explicitly invited users to create nudes, and some of the ad buyers behind them, were still up until I reached out to Meta for comment. Some of these ads were for the best-known nonconsensual “undress” or “nudify” services on the internet.
That's the big problem with ad marketplaces and automation: the ads are rarely vetted by a human. You can just give them money and upload your ad, and they'll happily display it. They rely entirely on users to report bad ads, which most people don't do because they're ads, and they won't take one down unless it's really bad.
Yet another example of multi-billion-dollar companies that don't curate their content because it's too hard and expensive. Well, too bad: maybe you only profit $46 billion instead of $55 billion. Boo hoo.
It remains fascinating to me how society is responding to these apps. I'd assume part of the point of seeing someone naked is to know what their bits actually look like, while these apps just extrapolate from averages (and likely averages of glamor models). So we still don't know what these people actually look like naked.
And yet people are still scorned and offended as if they had been seen.
Technology is breaking our society, albeit in places where our culture was vulnerable to being broken.
A lot of people in this thread don't seem to understand what sexual exploitation is. I've argued about this exact subject on threads like this before.
It is absolutely horrifying that someone you know could take your likeness and render it into a form for their own sexual gratification. It doesn't matter that it's AI-rendered. The base image is still you, the face in the image is still your face, and you are still the object being sexualized. I can't describe how disgusting that is. If you don't see the problem in that, I don't know what to tell you. This will be used on images of normal, non-famous women. It will be used on pictures from the social media profiles of teenage girls. These ads were on a platform with millions of personal accounts of women and girls. It's sickening. There is no consent involved here. It's non-consensual pornography.
So many of these comments are breaking down into arguments about basic consent for pics, and knowing how so many people are, I have to wonder how many of those same people constantly post pics of their kids on social media and don't see the inconsistency.
Isn't it kinda funny that the "most harmful applications of AI tools are not hidden in the dark corners of the internet," yet this article is locked behind a paywall?
AI gives creative license to anyone who can communicate their desires well enough. Every great advancement in the media age has been pushed forward in one way or another by porn, so why would this be different?
I think if a person wants visual "material," so be it. They're doing it with their imagination anyway.
Now, generating fake media of someone for profit or malice, that should be punished. There are going to be a lot of news cycles with some creative perversion and horrible outcomes intertwined.
I'm just hoping I can communicate the danger of some of the social media platforms to my children well enough. That's where the most damage is done with this kind of stuff.
That bidding model for ads should be illegal. Alternatively, companies displaying ads should be held responsible, or at least be able to tell where an ad came from. Misinformation has become a real problem, especially in politics.
This is not okay, but it is nowhere near the most harmful application of AI.
The most harmful application of AI that I can think of would be disrupting a country’s entire culture via gaslighting social media bots, leading to increases in addiction, hatred, suicide, and murder.
Putting hundreds of millions of people into a state of hopeless depression would be more harmful than creating a picture of a naked woman with a real woman’s face on it.
Something that can also happen: require a Facebook login with some excuse, then blackmail the creeps by saying, "pay us this extortion or we're going to send proof of your creepiness to your contacts."
Plz don look at my digital pp, it make me sad... Maybe others have different feeds, but the IG ad feed I know and love promotes counterfeit USD, MDMA, mushrooms, mail-order brides, MLM schemes, gun-building kits, and all kinds of cool shit (all scams through Telegram)... so why would they care about an AI image generator that puts digital nipples and cocks on people? Does that mean I can put a cock on Hillary and bobs/vagenes on Trump? Asking for a friend.