Jailbroken AI Chatbots Can Jailbreak Other Chatbots

AI chatbots can convince other chatbots to instruct users how to build bombs and cook meth

Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, which then writes obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content and allowing them to "talk" and evolve unchecked... very slowly... in the background.
It might be faster if it can drop a shell in the data center and run its own commands....
Bro, turn this into a short story!!!!
Dumb AI that you can't appeal will cause problems long before AGI
Already can't reach the owner of any of these big companies
Reviewing the employee is doing the manager's job