Okay, so... where were these files, and how did the data get introduced to the model? Should be easy to find out, unless they don't want to tell anyone, which would be weird given that they've commented on this.
They're not files; it's just leaking other people's conversations through a history bug: accidentally putting person A's "can you help me write my research paper/IT ticket/script" conversation into person B's chat history.
Super shitty, but not an uncommon kind of bug. Often it's either a nasty caching issue or mixed-up identities for people sharing an IP, or something similar.
It's bad, but it's "some programmer made an understandable mistake" bad, not "evil company steals private information without consent and sends it to others for profit" bad.
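To make the "nasty caching issue" point concrete, here's a rough sketch of how that class of bug tends to happen. This is totally hypothetical and not OpenAI's actual code: just a server-side history cache keyed by something too coarse (the client IP), so two users behind the same network get each other's chats.

```
# Hypothetical illustration, not OpenAI's actual code: a history cache keyed
# only by client IP, so users behind the same NAT/VPN collide.

chat_history_cache = {}  # cache key -> list of conversation titles

def cache_key(user_id, client_ip):
    # BUG: ignores user_id, so everyone sharing an IP shares one cache entry
    return client_ip

def save_history(user_id, client_ip, titles):
    chat_history_cache[cache_key(user_id, client_ip)] = titles

def load_history(user_id, client_ip):
    return chat_history_cache.get(cache_key(user_id, client_ip), [])

# Person A and person B share an office IP
save_history(user_id=1, client_ip="203.0.113.7", titles=["help me write my research paper"])
print(load_history(user_id=2, client_ip="203.0.113.7"))
# -> ['help me write my research paper']  (person B sees person A's chat)
```

The fix for something like this is boring (key on the authenticated user ID and nothing else), which is exactly why it reads as a screw-up rather than malice.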
The article (and title) update says ChatGPT is claiming it's not a bug (as you described), but that the user's account was compromised and someone else was using it to have those chats.