
Is Microsoft Recall a 'privacy nightmare'? 7 reasons you can stop worrying about it

It's one of the signature features of the next-generation Microsoft Copilot+ PCs, and at first glance it acts like the worst kind of spyware. But it's getting a bad rap.
Written by Ed Bott, Senior Contributing Editor
Image: abstract spy concept with lots of eyes (Credit: filo/Getty Images)

There's an entire class of frankly creepy software designed to monitor every move someone makes on a smartphone or computer, often saving surreptitious screenshots of activity for review by the person who installed the app (typically a parent or a jealous spouse).

Those apps are usually filed under a category heading like "hidden screen recorders" or "parental monitoring tools," but we all know what they really are: spyware.

Also: How to find and remove spyware from your phone

All of which explains why Microsoft Recall, one of the signature features of the next-generation Microsoft Copilot+ PCs, is getting such a bad rap. Its primary job, after all, is to snap screenshots of your activity every few seconds, store them in an encrypted folder, and index them so that the person who set up that feature can review activity on that Windows PC after the fact.

That sounds an awful lot like spyware, doesn't it? But there's one big difference: You're the person setting up the screen recorder, and you're the person reviewing its results. No one else, including Microsoft, has access to that data. Most importantly, there's nothing hidden about it.

Those distinctions apparently don't matter to general-interest media sources like CNN or the BBC, where you might have read that Microsoft Recall is a "privacy nightmare." But that's a naïve and fundamentally inaccurate characterization.

Recall solves a very common problem. I can't remember the name of that website I visited last week, but I know enough about it to describe the page. Likewise, I don't know the name of the file containing an important contract I reviewed recently, but I remember a few details about it. Those scenarios are tailor-made for an AI-powered local search engine that can sift through your activity, not just your files.

Also: How to screen record in Windows 10 or 11

Don't get me wrong. There are privacy issues associated with Microsoft Recall, just as there are with any feature designed to store and index personal data. But Microsoft appears to have addressed most of those issues in its design. And since the feature has yet to ship, it's not possible to judge how effective that design is. The only information anyone (including me) has to go on for now is what Microsoft has published in its brief descriptions and demos of Microsoft Recall.

So, what's the real story? Here's what we know so far.

1. You can turn this feature on or off during initial setup.

When you set up a new PC (or a new user account) that supports the Microsoft Recall feature, the initial setup experience includes a page for its settings. The default setting is on, but you can turn the feature off or adjust its options. (It would be better if the feature were off by default and required you to opt in.)

2. Those screen captures are stored and processed locally.

The AI that analyzes Recall snapshots runs locally. According to Microsoft, "No internet or cloud connections are required or used to save and analyze snapshots. Your snapshots aren't sent to Microsoft. Recall AI processing occurs locally, and your snapshots are securely stored on your local device only."

3. No one else can access the Recall data.

The folder where snapshots are stored is encrypted by default on Windows 11 PCs and is restricted to the signed-in user profile. Microsoft says it can't access or view the snapshots. That's consistent with its approach to the indexes it uses to search local user files, and it has no incentive to violate that commitment.

4. You can specify that certain apps are never recorded.

The Recall settings page allows you to specify how much storage is set aside for snapshots. It also lets you filter out apps that you never want to see in your Recall snapshots. (Any content viewed in a private browsing window or protected by rights-management features is already excluded from snapshots.)

5. You can delete a snapshot.

If you see that a snapshot contains information that you'd rather not preserve, such as a password or a confidential document, you can delete it. You can also delete all activity for a specific time period, which might come in handy if you've been working on a sensitive project that you don't want preserved.

6. Your IT staff has ultimate control over Recall.

Unsurprisingly, Microsoft has enabled Group Policy and mobile device management (MDM) settings that administrators can use to disable this feature completely. If that policy is enabled for your managed device, all saved snapshots are deleted immediately, and you won't have the option to turn the feature back on.
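
To give a sense of what that administrative control looks like in practice, here is a minimal sketch of how the device-wide policy could be applied directly, assuming it maps to a DisableAIDataAnalysis value under Software\Policies\Microsoft\Windows\WindowsAI as Microsoft's preliminary policy documentation suggests. The exact path and value name are an assumption that could change before the feature ships, and in a managed environment the setting would normally be pushed through Group Policy or an MDM service rather than a script.

# Minimal sketch (Python, run elevated on the managed PC): write the policy value
# that is documented to turn off saving of Recall snapshots device-wide.
# The key path and value name are assumptions based on Microsoft's published
# policy settings and may change before the feature ships.
import winreg

POLICY_KEY = r"Software\Policies\Microsoft\Windows\WindowsAI"
POLICY_VALUE = "DisableAIDataAnalysis"

def disable_recall_snapshots() -> None:
    # Create or open the policy key with write access, then set the DWORD to 1.
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, POLICY_KEY, 0, winreg.KEY_WRITE) as key:
        winreg.SetValueEx(key, POLICY_VALUE, 0, winreg.REG_DWORD, 1)

if __name__ == "__main__":
    disable_recall_snapshots()
    # Per Microsoft's description, existing snapshots are removed once the policy applies.
    print("Recall snapshot saving disabled by policy.")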

7. There are still risks, but they're limited.

The data that Microsoft Recall saves can include potentially sensitive information. If you change a password on a site that doesn't properly mask new password fields, your new password might be captured in a snapshot. Likewise, data you view as part of a web search can be captured and stored, and that data could turn out to be incriminating or embarrassing.

For most people, the likelihood that this information will be exposed is small. The population most at risk includes journalists and activists who cross borders into hostile countries or who are targeted by police or security services. It also includes anyone in a marginalized group who may be exercising rights that are not recognized in the jurisdiction where they live.

But is it spyware? Not under any reasonable definition of the word. You can't spy on yourself, and if someone else is able to access your local, encrypted Recall data … well, you have bigger problems.
