i don't recall

Windows Recall demands an extraordinary level of trust that Microsoft hasn’t earned

Op-ed: The risks of Recall are far too high for security to be a secondary concern.

Andrew Cunningham
The Recall feature as it currently exists in Windows 11 24H2 preview builds. Credit: Andrew Cunningham

Microsoft’s Windows 11 Copilot+ PCs come with quite a few new AI and machine learning-driven features, but the tentpole is Recall. Described by Microsoft as a comprehensive record of everything you do on your PC, the feature is pitched as a way to help users remember where they’ve been and to give Windows extra context so it can better understand and meet the needs of individual users.

This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare. That’s doubly true because Microsoft says that by default, Recall’s screenshots take no pains to redact sensitive information, from usernames and passwords to health care information to NSFW site visits. By default, on a PC with 256GB of storage, Recall can store a couple dozen gigabytes of data across three months of PC usage, a huge amount of personal data.

The line between “potential security nightmare” and “actual security nightmare” is at least partly about the implementation, and Microsoft has been saying things that are at least superficially reassuring. Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows’ disk encryption technologies, which are generally on by default if you’ve signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user’s Recall snapshots; and users can exclude specific apps or (in most browsers) individual websites from Recall’s snapshots.

This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.

“Fundamentally breaks the promise of security in Windows”

This is Recall, as seen on a PC running a preview build of Windows 11 24H2. It takes and saves periodic screenshots, which can then be searched for and viewed in various ways. Credit: Andrew Cunningham

Security researcher Kevin Beaumont, first in a thread on Mastodon and later in a more detailed blog post, has written about some of the potential implementation issues after enabling Recall on an unsupported system (which is currently the only way to try Recall since Copilot+ PCs that officially support the feature won’t ship until later this month). We've also given this early version of Recall a try on a Windows Dev Kit 2023, which we've used for all our recent Windows-on-Arm testing, and we've independently verified Beaumont's claims about how easy it is to find and view raw Recall data once you have access to a user's PC.

To test Recall yourself, developer and Windows enthusiast Albacore has published a tool called AmperageKit that will enable it on Arm-based Windows PCs running Windows 11 24H2 build 26100.712 (the build currently available in the Windows Insider Release Preview channel). Other Windows 11 24H2 versions are missing the underlying code necessary to enable Recall.

Windows runs OCR on the text in every screenshot it takes. That text is also saved to an SQLite database to facilitate faster searches.
Searching for "iCloud," for example, brings up every single screenshot with the word "iCloud" in it, including the app itself and its entry in the Microsoft Store. If I had visited websites that mentioned it, they would show up here, too.

The short version is this: In its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows plus records of different user interactions in a locally stored SQLite database to track your activity. Data is stored on a per-app basis, presumably to make it easier for Microsoft’s app-exclusion feature to work. Beaumont says “several days” of data amounted to a database around 90KB in size. In our usage, screenshots taken by Recall on a PC with a 2560×1440 screen come in at 500KB or 600KB apiece (Recall saves screenshots at your PC's native resolution, minus the taskbar area).
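The searchability Beaumont describes can be sketched with a toy example. The schema below is hypothetical — the actual table and column names in Recall's database aren't documented here — but it illustrates the core problem: once OCR'd window text sits in a plain-text SQLite file, anyone who can read that file can search a user's entire history in a single query.

```python
import sqlite3

# Hypothetical schema loosely mimicking what Beaumont describes:
# one row per captured window, OCR output stored as plain text.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE window_capture (
        id INTEGER PRIMARY KEY,
        app_name TEXT,        -- Recall stores data on a per-app basis
        captured_at TEXT,
        ocr_text TEXT         -- unencrypted OCR'd screen contents
    )
""")
conn.executemany(
    "INSERT INTO window_capture (app_name, captured_at, ocr_text) VALUES (?, ?, ?)",
    [
        ("Microsoft Store", "2024-06-01T10:00:00", "iCloud - Free - Install"),
        ("Notepad", "2024-06-01T10:05:00", "meeting notes for Tuesday"),
    ],
)

# Anyone with read access to the file can search all recorded activity:
rows = conn.execute(
    "SELECT app_name, ocr_text FROM window_capture WHERE ocr_text LIKE ?",
    ("%iCloud%",),
).fetchall()
print(rows)
```

This is what makes the plain-text storage choice so consequential: no decryption, no parsing of proprietary formats, just a stock SQLite client and a `LIKE` query.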

Recall works locally thanks to Azure AI code that runs on your device, and it works without Internet connectivity and without a Microsoft account. Data is encrypted at rest, sort of, at least insofar as your entire drive is generally encrypted when your PC is either signed into a Microsoft account or has BitLocker turned on. But in its current form, Beaumont says Recall has “gaps you can drive a plane through” that make it trivially easy to grab and scan through a user’s Recall database if you either (1) have local access to the machine and can log into any account (not just the account of the user whose database you’re trying to see), or (2) are using a PC infected with some kind of info-stealer virus that can quickly transfer the SQLite database to another system.

Accessing another user's Recall data from another admin account on the same PC. This UAC prompt is the only thing keeping me out, and it's easily dismissed. Once I access the folder, I can see every single screenshot plus the SQLite database with all the OCR data in it. Credit: Andrew Cunningham

Beaumont says admin access to the system isn’t required to read another user’s Recall database. Another user with an admin account can easily grab any other user’s Recall database and all the Recall screenshots by clicking through a simple UAC prompt. The SQLite database is stored in plain text, and data in transit isn’t encrypted, either, making it trivially easy to access both the stored database of past activity and to monitor new entries as Recall makes them. Screenshots are stored without a file extension, but they're regular old image files that can easily be opened and viewed in any web browser or image editor.

The other big problem is that because Recall is on by default and you have to manually exclude specific apps or websites from being scraped by it, the SQLite database will keep records of activities that are explicitly meant to be hidden or temporary. That includes viewing pages in Incognito mode in some browsers, emails or messages that you delete from your device, and files that you edit or delete.

Beaumont says he is holding off on publishing some details to “give [Microsoft] time to do something” about the feature as it is currently implemented. But he has pointed to efforts like this “TotalRecall” script as an example of how quickly and easily Recall data can be stolen and searched.

The UI for excluding apps from Recall's scraping.
Filtering sites is done by entering URLs.

There are mitigating factors here. Recall will begin by shipping on just a handful of new Windows 11 systems. It can be turned off entirely if you don’t want to use it, and the controls to disable Recall snapshots for certain apps or sites theoretically give users enough control that they can use Recall as intended without storing overly sensitive information in the database.

But given the sheer amount of data that Recall scrapes, the minimal safeguards Microsoft has put in place to protect that database once a malicious user has access to your PC, and the fact that many PC users never touch the default settings, the risks to user data seem far higher than the potential benefits of this feature.

Microsoft has struggled with security and privacy in its products. Not even a month ago, CEO Satya Nadella pledged to make security the most important thing at the company, following multiple high-profile data breaches and poorly handled information disclosures. Executive pay is being tied in part to security; rank-and-file employees are being told to “do security,” even when “faced with the tradeoff between security and another priority,” Nadella said. To launch Recall with such obviously exploitable security holes flies in the face of that directive.

Microsoft is hard to trust right now

Recall in the taskbar of a Windows 11 PC. Credit: Andrew Cunningham

Even if Recall were locked down better, another problem is that Windows 11 has eroded its users' trust and patience over time by endlessly pushing Microsoft's other products and services and refusing to respect user choices once they've been made. The company frequently finds new places to put ads; a "clean install" of the operating system comes with unasked-for third-party apps and ongoing notifications about other Microsoft services; the Bing Chat feature and then Copilot were rolled out quickly to the entire user base despite being "preview" products prone to problems.

None of this is directly related to Recall, but it demonstrates Microsoft's willingness to put revenue-squeezing ahead of the user experience, and that makes me inherently skeptical of Windows' AI features in general and Recall in particular. A huge searchable database of PC activity would be a holy grail for advertisers, and given how willing Microsoft has been to muck up Windows 11 in the last two and a half years, it's hard for me to trust that the company will stay committed to keeping the data collected by Recall completely private.

Recall isn’t finalized. It won’t be available on the vast majority of Windows PCs. Even once Copilot+ PCs from Intel and AMD hit the market, it will take at least a year or two for compatible NPUs to show up in midrange and low-end PCs. Those who truly hate it will be able to turn it off, though the list of “at least you can uninstall it/turn it off!” things in Windows 11 is getting frustratingly long.

We’ve also contacted Microsoft to ask about these concerns and whether the version of Recall that you can test on an Arm PC right now is the same one that end users will get on Copilot+ PCs; as of this writing, the company hasn’t responded.

But let's put it this way: Microsoft is building a feature into Windows that is monitoring and logging a ton of data about you and the way you use your PC. Traditionally, we’d call this “spyware.” The difference is that Microsoft is giving this particular data collection feature its blessing and advertising it as a banner feature of its upcoming wave of Copilot+ PCs.

The fact that the data is processed locally rather than in the cloud is a good first step, but it's also the bare minimum. Based on both the permissive default settings and the ease with which this data can be accessed, Recall’s security safeguards as they currently exist just aren't good enough.

If Microsoft really does intend for everyone at the company to “do security,” it needs to put these concerns ahead of its apparently all-consuming drive to insert generative AI features into every single one of its products. Improving Recall before it becomes generally available needs to take priority, even if it delays the launch.

Listing image: Jason Redmond/AFP via Getty Images

Andrew Cunningham Senior Technology Reporter
Andrew is a Senior Technology Reporter at Ars Technica, with a focus on consumer tech including computer hardware and in-depth reviews of operating systems like Windows and macOS. Andrew lives in Philadelphia and co-hosts a weekly book podcast called Overdue.
Staff Picks
When this deploys, MS has effectively destroyed computer security altogether.

Sure, you can disable it on your machine. But since it's taking screen-grabs, you have to ensure that everyone else with whom you communicate has it disabled as well.

End-to-end encryption will be meaningless because it's taking screen-grabs at the end-points. That means Signal's security is borked, for example. For both ends, because the entire conversation appears in the app window on both ends. Doesn't matter if you exclude Signal from recording on your end.

You can't have any security unless you confirm that both ends have Recall disabled. Assuming it's really disabled when you turn it off, of course. And that the next OS update didn't turn it back on without telling you.