Recall records everything users do on their PC, including activities in apps, communications in live meetings, and websites visited for research. Despite encryption and local storage, the new feature raises privacy concerns for certain Windows users.
At first glance, the Recall feature seems like it may set the stage for potential gross violations of user privacy. Despite reassurances from Microsoft, that impression persists for second and third glances as well.
I've seen a few of these insane AI privacy violations by Microsoft today - I'm assuming from the same event. In every article, Microsoft insists "it's okay because all the data is on the user's device."
First of all, I don't trust a word they say. Second, I'm sure there is some other service of theirs that would pick these files up and send them to Microsoft anyway. Third, unless the AI is running locally, they need to send that data to their servers for this to function. Fourth, I guarantee this "local" promise will get quietly dropped the millisecond they think they can get away with it. Fifth, these tech companies have been acting in such bad faith that they should be hit with an insane amount of regulation at the very least.
I'm generally not against AI, but Microsoft absolutely cannot be trusted with any of this.
Having logs of opened apps and crashes is NOT the same as scraping everything, including content. By making stupid, quippy jokes as if this were normal, you at best spread ignorance and tacit acceptance, and at worst spread disinformation.
If you use other Microsoft services, they could have your email, your exact location, and even all the passwords you save, for starters.
At the end of the day, you simply don't know all the data they collect, and it can easily be much more than you've trivialized it to be. In the privacy agreements, you will see a ton of "optional" collection, which we all know is turned on by default.
And if you still think that's all bullshit, remember: if your data is stored somewhere other than your own PC, it's not your data. And if "AI" can reference your exact history, this gets 100x worse than anything that came before.
Microsoft says that the Recall index remains local and private on-device, encrypted in a way that is linked to a particular user's account. "Recall screenshots are only linked to a specific user profile and Recall does not share them with other users, make them available for Microsoft to view, or use them for targeting advertisements. Screenshots are only available to the person whose profile was used to sign in to the device," Microsoft says.
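Microsoft hasn't published implementation details, but on Windows, encryption "linked to a particular user's account" commonly means DPAPI, which derives keys from the user's logon credentials. Here is a minimal sketch of what per-user, on-device encryption of that kind could look like, assuming the pywin32 package; the function names and sample data are illustrative, not Microsoft's actual implementation:

```python
# Illustrative sketch only: per-user encryption via Windows DPAPI,
# the standard mechanism for data tied to a specific user account.
# Requires Windows and the pywin32 package. All names are hypothetical.
import win32crypt

def protect_for_current_user(plaintext: bytes) -> bytes:
    # CryptProtectData derives the key from the logged-in user's
    # credentials, so another account on the same PC cannot decrypt it.
    return win32crypt.CryptProtectData(
        plaintext,
        "hypothetical Recall-style index entry",  # description stored with blob
        None, None, None, 0,
    )

def unprotect_for_current_user(blob: bytes) -> bytes:
    # Only succeeds when called from the same user profile that
    # encrypted the data.
    _description, plaintext = win32crypt.CryptUnprotectData(
        blob, None, None, None, 0
    )
    return plaintext

if __name__ == "__main__":
    blob = protect_for_current_user(b"screenshot OCR text: example")
    assert unprotect_for_current_user(blob) == b"screenshot OCR text: example"
```

Note what this does and doesn't guarantee: per-user encryption at rest and "the data never leaves the device" are independent claims, which is precisely the gap the skeptical commenters above are pointing at.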
Users can pause, stop, or delete captured content and can exclude specific apps or websites. Recall won't take snapshots of InPrivate web browsing sessions in Microsoft Edge or of DRM-protected content.
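Exclusion controls of this kind typically boil down to a filter that runs before each snapshot is captured. The sketch below shows that control flow under stated assumptions; every name in it is hypothetical, and the capture and InPrivate-detection inputs are stand-ins rather than Microsoft's API:

```python
# Illustrative control-flow sketch of snapshot filtering, not
# Microsoft's implementation. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class CaptureFilter:
    excluded_apps: set[str] = field(default_factory=set)   # e.g. {"KeePass.exe"}
    excluded_sites: set[str] = field(default_factory=set)  # e.g. {"mybank.com"}
    paused: bool = False

    def should_capture(self, app: str, url: str | None,
                       in_private: bool, drm_content: bool) -> bool:
        # Mirrors the documented behavior: no snapshots while paused,
        # for excluded apps/sites, in InPrivate sessions, or over DRM content.
        if self.paused or in_private or drm_content:
            return False
        if app in self.excluded_apps:
            return False
        if url is not None and any(site in url for site in self.excluded_sites):
            return False
        return True

f = CaptureFilter(excluded_apps={"KeePass.exe"}, excluded_sites={"mybank.com"})
print(f.should_capture("msedge.exe", "https://mybank.com/login",
                       in_private=False, drm_content=False))  # False
print(f.should_capture("notepad.exe", None,
                       in_private=False, drm_content=False))  # True
```

The substring match on URLs is a deliberate simplification; a real filter would parse and compare hostnames properly.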
Optional local feature. Of course the thread acts like it's the end of the universe.
Ok, but now picture this: on day 1, MS pops up a little message box on every computer with it installed that says, "Enable advanced functionality?" with a teeny tiny link to a long legal document that, somewhere in it, says that with advanced features turned on, they actually do upload all your data.
Because companies do that, all the time. It lets you put out press releases saying "we collect no data, we love privacy!" while actually collecting and selling data on, like, 95% of your customers.