Privacy · 5 min read
Digital Disconnections
The team

Instagram is ending encrypted messaging. We never had access to begin with.

There is a difference between a promise and a structure. This week, the difference became visible.

On May 8, 2026, Instagram will remove end-to-end encrypted messaging from its platform. The announcement landed quietly — a policy update, a date, a support article. No scandal, no breach, no dramatic reversal. Just a company exercising the right it always had: to change what it had given.

This is worth sitting with for a moment.

End-to-end encryption was real. It worked as described. The cryptography was sound. And still — because it lived inside a platform, because it existed as a feature the platform chose to build and could choose to remove — it is being removed. The protection was structural until the moment the structure changed.

We are not writing this to criticize the decision. Companies change products. Terms shift. That is the nature of software as a service. We are writing this because the moment is clarifying something that usually lives in the background of how privacy works: what a platform gives, a platform can take back.


The feeling you might be having right now

If you used Instagram’s encrypted messaging and you feel, today, a particular kind of low-level unease — not quite betrayal, not quite surprise — we recognize that feeling. We have heard it described many ways: “I thought it was different. I thought this one was actually private.”

That is not naivety. It is not your fault for trusting. The promise was made in good faith, and the cryptography was genuine, and the decision to remove it is probably driven by factors that have nothing to do with ill intent. The problem is not bad actors. The problem is structure. When privacy depends on a policy decision by a platform, the structure allows the policy to change. That is a design fact, not a moral failure.


The difference architecture makes

Private Assistant — our on-device AI — has never had access to your conversations. Not because we promised we wouldn’t look. Because there is nothing to look at.

When inference runs on your device, the data never leaves. There is no server processing your words and sending results back. There is no pipeline to intercept, no policy to revoke, no feature team to decide that the privacy guarantee no longer fits the business. The architecture makes the promise unnecessary — not because promises are bad, but because there is nothing left for a promise to cover.

This is what we mean when we say private by architecture, not by promise. A promise is a statement about intent. An architecture is a statement about what is possible. Instagram’s encrypted messaging was a promise backed by real cryptography — and it still ended. Private Assistant’s privacy is backed by a structure: the data exists on your device, the processing happens on your device, and the result is that there is simply nothing to send.

We built it this way because the alternative — cloud inference, server-side processing, a privacy promise you’d have to take on faith — felt like the same structure that produced today’s announcement. We wanted something that could not be revised away.


What this is not

We are not saying Instagram is bad, or that E2E encryption doesn’t matter, or that platform privacy is worthless. On-device matters because encrypted messaging matters — because the impulse that made people want encrypted DMs is real and legitimate and worth protecting more durably.

The people who feel that low-level unease today are right to feel it. The question is only what to do with it. A policy can be the best available option in a world where architecture isn’t possible yet. It was, for a while. The architecture is possible now.

We just thought you should know.


Private Assistant runs entirely on your device. Your AI. Your phone. Nothing else.

Privacy you can see. Software that makes collection structurally impossible. No cloud. No subscription. No data collection.

Explore Private Assistant →

From the Digital Disconnections editorial team. We write about privacy, on-device AI, and why your data should stay yours.