Doesn't everything do this? If someone gets access to your hard drive, you're fucked anyway. AI chat logs are about the least problematic thing on there.
Apparently macOS apps can be sandboxed so their data is stored encrypted and inaccessible to other apps. I wasn't aware of this either, but it's been a while since I've used macOS. It sounds like the ChatGPT app explicitly opted out of this sandboxing model.
Okay in that case I’m even more okay with this. I like the idea of having text files on my computer representing my stuff. Like, plain text I can explore using cd, tree, and less.
I’ve actually wanted to make myself a to-do app that stores the to-do items as nested bullets in markdown, so I can write lists in plain markdown and access them from my GUI, and vice versa.
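The core of that idea is pretty small. Here's a minimal sketch (hypothetical, not from any existing app) of parsing nested markdown task bullets into structured items, assuming two spaces of indent per nesting level and the common `- [ ]` / `- [x]` checkbox convention:

```python
import re

# Matches an optional indent, a "- " bullet, an optional "[ ]"/"[x]" checkbox,
# and the item text. Conventions here are assumptions, not a standard.
BULLET = re.compile(r"^(\s*)- (?:\[( |x)\] )?(.*)$")

def parse_todos(markdown: str):
    """Return (depth, done, text) tuples, two spaces of indent per level."""
    items = []
    for line in markdown.splitlines():
        m = BULLET.match(line)
        if not m:
            continue  # ignore non-bullet lines (headings, blanks, prose)
        indent, check, text = m.groups()
        items.append((len(indent) // 2, check == "x", text))
    return items

doc = """\
- [ ] groceries
  - [x] milk
  - [ ] bread
- [ ] call dentist
"""
print(parse_todos(doc))
# → [(0, False, 'groceries'), (1, True, 'milk'),
#    (1, False, 'bread'), (0, False, 'call dentist')]
```

Since the file stays valid markdown, `cd`/`tree`/`less` (or any editor) keep working on the same data the GUI reads.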
Many people now use ChatGPT like they might use Google: to ask important questions, sort through issues, and so on. Often, sensitive personal data could be shared in those conversations.
Like my Firefox profile data isn’t encrypted and I don’t see anyone sweating over that. This stupid chat history not being encrypted ain’t that different.
So? OpenAI is to my knowledge open about incorporating conversations into future versions of GPT, and you can often get it to replay training data in full, so you're effectively publishing your conversations to the open internet anyway. This feels more like saying that my lemmy comments are stored in plain text on my phone than any sort of security violation.
It's something that should be publicised not because OpenAI has promised privacy, but because a lot of people seem to assume it where it has not been offered, and they need to be reminded that they're kind of out to lunch on the issue.
Like, people in companies keep using these things to write reports with privileged information. People need to be informed as gently but also as firmly as possible that they're sending this stuff over the internet to an organization that considers everything it can see to be its own.