University of Chicago researchers release Nightshade to the public, a tool intended to "poison" images in order to corrupt generative models trained on them.
If this technology is so great, why doesn't the site show any before/after examples, let alone a demonstration that it does what they claim?
Because they can't.