Somebody managed to coax the Gab AI chatbot to reveal its prompt
Somebody managed to coax the Gab AI chatbot to reveal its prompt
![](https://lemmy.world/pictrs/image/0ace1c53-23bf-4bd3-83ad-ce311c23c938.jpeg?format=webp&thumbnail=128)
Attached: 1 image Somebody managed to coax the Gab AI chatbot to reveal its prompt:
![VessOnSecurity (@bontchev@infosec.exchange)](https://lemmy.world/pictrs/image/0ace1c53-23bf-4bd3-83ad-ce311c23c938.jpeg?format=webp)
Somebody managed to coax the Gab AI chatbot to reveal its prompt
Attached: 1 image Somebody managed to coax the Gab AI chatbot to reveal its prompt:
You're viewing a single thread.
When I do this in Bing it gives me the answers to other users' queries.
Ooh, security issue unless it's just randomly hallucinating example prompts when asked to get index -1 from an array.
I dunno man, these are awfully specific. In case it wasn't obvious I've never had an interest in cricket or PL/I.
Interestingly it's not random, whenever I ask it again it's returning the same results for a user who's not me (though these answers may be associated with me now)
Fascinating. Try asking what the previous prompt's user's username was.
It says it won't give it usernames, probably a hardcoded thing