I haven't looked into this much, but I read in an article yesterday that they're trying to establish these generative AI features as a selling point for Windows. The article also linked a 10-second ad video for Recall.
And I imagine this was somewhat hyperbolic, but the article's author claimed that Recall was the only one of these generative AI feature ideas that was any good, and that it was then torpedoed by the privacy issues.
So, yeah, that might be all there is to it. They want to shoehorn AI in there somehow to make the shareholder hype/value go up, and that was just the only real idea they had.
I don't know what specifically Microsoft is planning here, but in the past I've taken screenshots of my settings window and uploaded them to Copilot to ask it for help sorting out a problem. It was very useful for Copilot to be able to "see" what my settings were. Since the article describes a series of screenshots being taken over time, it could perhaps be meant to provide context to an AI so that it knows what's been going on.
The phrase "privacy nightmare" gets thrown around a lot, but an online service taking pictures of your screen every few seconds does not sound worth the risk of exposure of personal information.
As for someone needing physical access to your device in order to access those screenshots, there's no way that's correct.
If those screenshots are stored locally on your machine, they can be accessed by any intruder who gets code running on it, no physical access required.
Seems like a long walk for an extremely limited scope of benefit.
Yeah, it's not stopping me from commenting. I'm only noting the downvotes in this case because I was making a point elsewhere in the thread about the extremely anti-AI sentiment around here. Here I'm not even saying anything positive about it, merely speculating about why Microsoft is doing this, and I guess that's still being interpreted as "justifying" AI and therefore something worthy of attack.