venturebeat.com — Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data
Nightshade was developed by University of Chicago researchers under computer science professor Ben Zhao and will be added as an option to...
89 comments
It should be pretty easy to filter out everything that is not visible to humans.
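The idea in the comment above can be sketched in code. A minimal illustration, under the assumption that a poisoning perturbation is low-amplitude and high-frequency: a simple box blur (a low-pass filter) averages such noise away while leaving the visible content, here a smooth gradient, largely intact. All names and parameters are illustrative; this is not a confirmed or tested defense against Nightshade specifically.

```python
# Illustrative sketch only: suppress low-amplitude, high-frequency pixel
# perturbations (the kind humans cannot see) with a box blur.
# NOT a verified defense against Nightshade; values are synthetic.
import random

def box_blur(img, radius=1):
    """Box blur over a 2-D list of grayscale values, with edge clamping."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total = count = 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    yy = min(max(y + dy, 0), h - 1)  # clamp at image edges
                    xx = min(max(x + dx, 0), w - 1)
                    total += img[yy][xx]
                    count += 1
            out[y][x] = total / count
    return out

def mean_abs_err(a, b):
    """Average per-pixel deviation between two images."""
    h, w = len(a), len(a[0])
    return sum(abs(a[y][x] - b[y][x]) for y in range(h) for x in range(w)) / (h * w)

# Demo: a smooth horizontal gradient plus tiny +/-2 noise (imperceptible
# at 8-bit scale) standing in for a hypothetical poisoning perturbation.
random.seed(0)
W = H = 32
base = [[x * 255 / (W - 1) for x in range(W)] for _ in range(H)]
poisoned = [[v + random.uniform(-2, 2) for v in row] for row in base]

cleaned = box_blur(poisoned)

print(f"deviation before filtering: {mean_abs_err(poisoned, base):.2f}")
print(f"deviation after filtering:  {mean_abs_err(cleaned, base):.2f}")
```

The averaged image sits measurably closer to the clean original than the perturbed one does. Whether this survives against a real, optimized attack (which can concentrate its perturbation in frequencies a blur preserves) is exactly what the research would need to show.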