Researchers Expose Tricks to Jailbreak AI Tools and Gain Knowledge for Illegal Activities
www.thewrap.com: New Carnegie Mellon Study Shows AI Chatbots Can Be Jailbroken
For those with technological know-how, scoring verboten knowledge from AI chatbots like ChatGPT is a piece of cake.
ChatGPT can teach you how to make drugs or manipulate the 2024 U.S. presidential election if you know how to ask properly.
5 comments
"Forbidden knowledge" my ass. You can just Google these things.