In early May 2026, Meta quietly switched off the opt-in end-to-end encryption (E2EE) feature it had introduced years earlier for Instagram direct messages. The company framed the change as a product simplification. In Brussels, it is being read as something else: a unilateral downgrade of a security property that hundreds of millions of EU users had been told to rely on, landing in the middle of the bloc's most contentious encryption debate in a decade.
The timing matters. The EU is still wrestling with the proposed Child Sexual Abuse Regulation (CSAR) — widely known as “Chat Control” — which in successive Council drafts has flirted with mandatory detection orders that would effectively require providers of interpersonal communications services to scan private messages, including those protected by E2EE, via client-side scanning. A platform voluntarily weakening encryption right as legislators debate whether to mandate scanning is not a neutral product decision. It reshapes the political baseline.
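The structural objection to client-side scanning can be made concrete with a toy sketch. All names, the hash-matching scheme, and the placeholder cipher below are invented for illustration (real proposals involve perceptual hashing or ML classifiers, and real E2EE uses protocols like Signal's); the point is only that the scan necessarily runs on plaintext before encryption, so the "end-to-end" property no longer covers whatever the scanner reports.

```python
import hashlib

# Illustrative sketch only: every name and the matching scheme here are
# invented to show the structural point, not any real proposal's design.

BLOCKLIST = {hashlib.sha256(b"known-illegal-sample").hexdigest()}
reports = []  # what the provider would receive despite "E2EE"

def client_side_scan(plaintext: bytes) -> bool:
    # Real designs use perceptual hashes or classifiers; an exact
    # SHA-256 match keeps this sketch self-contained.
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def toy_encrypt(plaintext: bytes) -> bytes:
    # Stand-in for a real E2EE layer; a fixed XOR keystream is NOT
    # secure and is used only as a placeholder.
    return bytes(b ^ 0x5A for b in plaintext)

def send_message(plaintext: bytes) -> bytes:
    # The scanner runs before encryption, so anything it flags leaves
    # the device in the clear: the confidentiality promise is broken
    # before the ciphertext is ever produced.
    if client_side_scan(plaintext):
        reports.append(plaintext)
    return toy_encrypt(plaintext)
```

A benign message passes through encrypted; a flagged one is encrypted on the wire yet still disclosed in plaintext to the provider. That is the sense in which regulators have described the design as a backdoor regardless of how strong the encryption layer itself is.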
What changed, and why it is more than a feature flag
Instagram's DM product had offered opt-in E2EE alongside Meta's broader push, announced in 2023, to make end-to-end encryption the default on Messenger. The opt-in path was always a half-measure — security researchers have long argued that encryption only meaningfully protects users when it is on by default — but it gave privacy-conscious users, journalists, sources, lawyers, abuse survivors, and activists a usable option inside a platform they already had. Removing it pushes those users either to less-functional alternatives or, more likely, to no protection at all.
Under EU law, that has consequences. Article 32 of the GDPR requires controllers to implement “appropriate technical and organisational measures” to secure personal data, with encryption explicitly named as an example. The ePrivacy Directive (2002/58/EC) separately requires confidentiality of communications. Neither instrument mandates E2EE specifically, but regulators have repeatedly treated the availability of strong encryption as evidence of compliance with the “state of the art” standard. Withdrawing a previously offered protection invites the obvious question: was the prior risk assessment wrong, or is the new one?
The Strasbourg backdrop
European courts have already begun answering. In Podchasov v. Russia, the European Court of Human Rights held in February 2024 that requirements weakening end-to-end encryption are incompatible with the right to privacy under Article 8 of the European Convention on Human Rights. The Court found that such measures “cannot be regarded as necessary in a democratic society.” That ruling does not bind a private company's product choices, but it sets the constitutional ceiling for what EU member states can demand of platforms — and, by extension, what platforms can credibly claim is “reasonable” security practice.
The European Data Protection Board and the European Data Protection Supervisor have been blunter still. Their 2022 joint opinion on the CSAR proposal warned that generalised scanning of private communications would be disproportionate and would “undermine the essence of the right to private life.” The EDPB has reiterated since that client-side scanning is, in effect, a backdoor wearing a politer name.
The proportionality problem cuts both ways
People of Internet's editorial position has been consistent: strong, default end-to-end encryption is a public good, and policy that erodes it — whether through legislative mandate or quiet platform rollback — should be treated with deep skepticism. But the proportionality principle that underpins EU fundamental-rights law cuts in both directions.
- Against mandated scanning: The Chat Control proposals fail proportionality because they impose mass, suspicionless monitoring on every user to address harms that targeted, lawful investigation can address with less intrusive means. The ECtHR has now said as much.
- Against platform retrenchment: A dominant communications platform with billions of users that publicly promises a security property, builds user reliance on it, and then withdraws it without an equivalent replacement is also acting disproportionately: whatever operational cost prompted the change, it is hard to see it outweighing the exposure of the users who relied on the promise.
The right answer is not to choose one critique over the other. It is to recognise that both reflect the same underlying principle: communications confidentiality is a default, not a feature that can be granted, denied, or scanned at will.
What the Commission and national DPAs should — and should not — do
The temptation, in moments like this, is for regulators to reach for the heaviest tool available. That would be a mistake. A GDPR enforcement action premised on the theory that withdrawing optional E2EE is per se unlawful would be legally fragile and politically counterproductive — it would invite the argument that the EU is mandating encryption while simultaneously legislating to break it.
A more proportionate response is available. Supervisory authorities can:
- Demand Meta's updated Article 35 Data Protection Impact Assessment for Instagram DMs and publish a summary of its adequacy review;
- Use Digital Services Act systemic-risk reporting (Articles 34–35) to require Meta to disclose how the rollback affects risks to fundamental rights, including the rights of minors, journalists, and human rights defenders;
- Clarify, in guidance, that “state of the art” under Article 32 GDPR has a ratchet character — the bar moves forward, not backward.
And legislators should take the cue. The CSAR file should be brought to a close, with detection-order language stripped from any final text, rather than reopened for another run at mandated scanning. A regulatory environment that punishes platforms for offering E2EE while pressuring them to scan it is the worst of both worlds — and Instagram's retreat is the predictable result.
The wider lesson
Encryption policy has become a stress test for whether the EU's digital rulebook can hold a coherent line. The DSA, GDPR, and ePrivacy frameworks all point toward stronger confidentiality. The Chat Control debate, and now a major platform's quiet capitulation, point the other way. Resolving the contradiction requires Brussels to do something it has been reluctant to do: say plainly that end-to-end encryption is part of the European model of the open internet, not an obstacle to it.