Microsoft’s controversial Recall feature is rolling out across Copilot+ PCs—and it’s dragging your private messages with it. With Recall enabled, Windows takes constant screenshots, reads every word on them with AI, and stores it all behind a simple PIN. That includes WhatsApp, Signal, and any app you thought was secure.
Microsoft Recall reads everything, even disappearing messages

The problem isn’t just what you do on your own PC. It’s what others see on theirs. As Ars Technica notes, if someone you message has Recall turned on, your end-to-end encrypted chats get screenshotted, parsed with OCR, and logged in a searchable AI database. You can’t stop it. You won’t know it’s happening.
It captures everything rendered on-screen: photos, messages, passwords, even video from calls that were encrypted in transit. And while Microsoft says a filter redacts sensitive data such as passwords and credit card numbers, early testers are finding the AI’s judgment inconsistent at best.
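To make the mechanics concrete, here is a minimal sketch of that kind of pipeline: periodic screen capture, OCR, and a full-text index. This is an illustration of the technique being described, not Microsoft’s code; pytesseract and SQLite’s FTS5 stand in for Recall’s proprietary OCR and on-device database.

```python
import sqlite3
import time
from datetime import datetime

from PIL import ImageGrab   # screen capture
import pytesseract          # OCR (requires the Tesseract binary to be installed)

# A full-text index over captured text; FTS5 makes every word searchable.
db = sqlite3.connect("recall_demo.db")
db.execute("CREATE VIRTUAL TABLE IF NOT EXISTS snapshots USING fts5(taken_at, text)")

def capture_once():
    """Grab the screen, OCR it, and index every recognized word."""
    image = ImageGrab.grab()                   # whatever is on-screen right now
    text = pytesseract.image_to_string(image)  # read it, app by app, word by word
    db.execute("INSERT INTO snapshots VALUES (?, ?)",
               (datetime.now().isoformat(), text))
    db.commit()

# A handful of captures a few seconds apart is enough to make the point.
for _ in range(3):
    capture_once()
    time.sleep(5)

# Anything that crossed the screen is now one query away.
query = "SELECT taken_at, snippet(snapshots, 1, '[', ']', '...', 10) " \
        "FROM snapshots WHERE snapshots MATCH ?"
for row in db.execute(query, ("password",)):
    print(row)
```

Even this toy version demonstrates the problem: anything that crosses the screen, including a “disappearing” message, survives as searchable text.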
Your privacy now depends on someone else’s PC
Security researcher Kevin Beaumont tested the system. He asked his partner to use his PC with Recall turned on. She guessed his PIN in minutes and uncovered full Signal conversations, including messages that had been deleted. “That isn’t great,” Beaumont deadpanned.
Those screenshots, once taken, are kept on the local device, encrypted with keys protected by TPM 2.0 and unlocked by a Windows Hello PIN—a gate researchers already consider thin, since anyone past the PIN can read the entire archive. For anyone outside that device’s control, including message senders, there’s zero transparency.
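The arithmetic shows why a PIN is such thin protection: a 4-digit PIN has only 10,000 possibilities, and a guesser who knows the owner usually needs far fewer. Below is a hypothetical sketch assuming an unthrottled check against a stored digest; Windows Hello does rate-limit attempts, which is why informed guessing, not brute force, is the realistic path Beaumont’s test illustrates.

```python
import hashlib
from itertools import product

# Hypothetical: the gate is a 4-digit PIN whose digest we can test freely.
stored = hashlib.sha256(b"2580").hexdigest()  # hypothetical stored PIN digest

# Someone who knows the owner tries the obvious candidates first.
likely_first = ["1234", "0000", "1111", "2580", "5683"]
exhaustive = ("".join(d) for d in product("0123456789", repeat=4))

for guess in likely_first + list(exhaustive):  # at most 10,000 candidates
    if hashlib.sha256(guess.encode()).hexdigest() == stored:
        print(f"PIN recovered: {guess}")       # the whole space falls in well under a second
        break
```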
Microsoft’s AI makes surveillance effortless—and invisible
Recall isn’t doing anything new. Anyone can take a screenshot. But the issue is scale and stealth. Microsoft’s AI doesn’t just capture data—it indexes, interprets, and preserves it without notifying anyone on the other side of the conversation.
What used to require technical effort is now baked into the OS. That changes the nature of secure messaging. Your words aren’t just vulnerable at the endpoint—they’re vulnerable anywhere a Windows PC with Recall is watching.
WhatsApp adds AI, raising fresh concerns
At the same time Recall launches, Meta is adding AI to WhatsApp. Features like thread summaries and writing suggestions are being tested under a system called “Private Processing.” Meta says the work happens in a secure, isolated environment that even Meta can’t look into, and that outside researchers will be able to audit it—but questions remain.
Experts like cryptographer Matthew Green worry this is just the start. If AI summaries are being generated, will the next step be sharing insights back to Meta “to improve results”? Once privacy becomes optional, trust becomes fragile.
Secure messaging needs secure devices—AI is breaking that
Meta says its AI can’t see your chats. Microsoft’s AI already can. This isn’t just a tech clash—it’s a privacy fault line. If one device in a conversation is silently screenshotting everything, encryption means very little.
You may trust WhatsApp, Signal, or your favorite app. But unless you trust every device in the chain—especially Windows PCs with Recall—your privacy isn’t guaranteed. And right now, Microsoft’s AI is reading more than it should.