EU encryption policy

Instagram's Encryption Retreat Lands in the EU's Chat Control Crossfire

Meta's quiet rollback of opt-in E2EE for Instagram DMs collides with GDPR, ePrivacy, and Brussels' unresolved fight over client-side scanning.

[Infographic: Encryption Under Pressure in the EU (People of Internet Research). 2B+ Instagram monthly active users; 2024 ECtHR ruling in Podchasov v. Russia on weakening encryption; EU Chat Control proposal (CSAR) debated for four years; GDPR Art. 32 names encryption as a security benchmark.]

Key Takeaways

In early May 2026, Meta quietly switched off the opt-in end-to-end encryption (E2EE) feature it had introduced years earlier for Instagram direct messages. The company framed the change as a product simplification. In Brussels, it is being read as something else: a unilateral downgrade of a security property that hundreds of millions of EU users had been told to rely on, landing in the middle of the bloc's most contentious encryption debate in a decade.

The timing matters. The EU is still wrestling with the proposed Child Sexual Abuse Regulation (CSAR) — widely known as “Chat Control” — which in successive Council drafts has flirted with mandatory detection orders that would effectively require providers of interpersonal communications services to scan private messages, including those protected by E2EE, via client-side scanning. A platform voluntarily weakening encryption right as legislators debate whether to mandate scanning is not a neutral product decision. It reshapes the political baseline.

What changed, and why it is more than a feature flag

Instagram's DM product had offered opt-in E2EE alongside Meta's broader push, announced in 2023, to make end-to-end encryption the default on Messenger. The opt-in path was always a half-measure — security researchers have long argued that encryption only meaningfully protects users when it is on by default — but it gave privacy-conscious users, journalists, sources, lawyers, abuse survivors, and activists a usable option inside a platform they already had. Removing it pushes those users either to less-functional alternatives or, more likely, to no protection at all.

Under EU law, that has consequences. Article 32 of the GDPR requires controllers to implement “appropriate technical and organisational measures” to secure personal data, with encryption explicitly named as an example. The ePrivacy Directive (2002/58/EC) separately requires confidentiality of communications. Neither instrument mandates E2EE specifically, but regulators have repeatedly treated the availability of strong encryption as evidence of compliance with the “state of the art” standard. Withdrawing a previously offered protection invites the obvious question: was the prior risk assessment wrong, or is the new one?

The Strasbourg backdrop

European courts have already begun answering. In Podchasov v. Russia, the European Court of Human Rights held in February 2024 that requirements weakening end-to-end encryption are incompatible with the right to privacy under Article 8 of the European Convention on Human Rights. The Court found that such measures “cannot be regarded as necessary in a democratic society.” That ruling does not bind a private company's product choices, but it sets the constitutional ceiling for what EU member states can demand of platforms — and, by extension, what platforms can credibly claim is “reasonable” security practice.

The European Data Protection Board and the European Data Protection Supervisor have been blunter still. Their 2022 joint opinion on the CSAR proposal warned that generalised scanning of private communications would be disproportionate and would “undermine the essence of the right to private life.” The EDPB has reiterated since that client-side scanning is, in effect, a backdoor wearing a politer name.
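The EDPB's "backdoor" characterisation has a simple technical basis: client-side scanning inspects a message before it is encrypted, so the strength of the cipher is irrelevant to what the scanner sees. A minimal sketch makes the point; all names here are hypothetical, and the "cipher" is a deliberately trivial stand-in, not any real messenger's design:

```python
# Toy illustration of why client-side scanning bypasses E2EE's
# confidentiality guarantee. Hypothetical names; the XOR "cipher"
# is deliberately trivial and NOT secure.
from hashlib import sha256

def encrypt(plaintext: str, key: bytes) -> bytes:
    """Stand-in for a real E2EE cipher that only the recipient can undo."""
    stream = sha256(key).digest() * (len(plaintext) // 32 + 1)
    return bytes(a ^ b for a, b in zip(plaintext.encode(), stream))

# A provider-controlled blocklist of content hashes (illustrative).
BLOCKLIST = {sha256(b"example-flagged-content").hexdigest()}

def send_with_client_side_scanning(plaintext: str, key: bytes):
    # The crucial step: the scan runs on the *plaintext*, before any
    # encryption happens. Whoever controls the blocklist learns about
    # message content no matter how strong the cipher is.
    flagged = sha256(plaintext.encode()).hexdigest() in BLOCKLIST
    return flagged, encrypt(plaintext, key)

flagged, ciphertext = send_with_client_side_scanning(
    "example-flagged-content", b"shared-key"
)
```

In this sketch the message is matched against the blocklist before the cipher ever touches it, which is why regulators and cryptographers describe the technique as scanning "before encryption" rather than breaking encryption: the confidentiality property is bypassed, not defeated.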

The proportionality problem cuts both ways

People of Internet's editorial position has been consistent: strong, default end-to-end encryption is a public good, and policy that erodes it — whether through legislative mandate or quiet platform rollback — should be treated with deep skepticism. But the proportionality principle that underpins EU fundamental-rights law cuts in both directions.

The right answer is not to choose between those critiques, of legislative scanning mandates on the one hand and quiet platform rollbacks on the other. Both reflect the same underlying principle: communications confidentiality is a default, not a feature that can be granted, denied, or scanned at will.

What the Commission and national DPAs should — and should not — do

The temptation, in moments like this, is for regulators to reach for the heaviest tool available. That would be a mistake. A GDPR enforcement action premised on the theory that withdrawing optional E2EE is per se unlawful would be legally fragile and politically counterproductive — it would invite the argument that the EU is mandating encryption while simultaneously legislating to break it.

A more proportionate response is available. Supervisory authorities can press the narrower question that Article 32 already poses: what updated risk assessment justified withdrawing a previously offered protection, and how were users informed of the change?

And legislators should take the cue. The CSAR file should be closed, not reopened, with detection-order language stripped from any final text. A regulatory environment that punishes platforms for offering E2EE while pressuring them to scan it is the worst of both worlds — and Instagram's retreat is the predictable result.

The wider lesson

Encryption policy has become a stress test for whether the EU's digital rulebook can hold a coherent line. The DSA, GDPR, and ePrivacy frameworks all point toward stronger confidentiality. The Chat Control debate, and now a major platform's quiet capitulation, point the other way. Resolving the contradiction requires Brussels to do something it has been reluctant to do: say plainly that end-to-end encryption is part of the European model of the open internet, not an obstacle to it.

Sources & Citations

  1. ECtHR judgment in Podchasov v. Russia (Article 8 / encryption)
  2. EDPB-EDPS Joint Opinion 4/2022 on the CSA Regulation proposal
  3. European Commission proposal for the CSA Regulation ('Chat Control')
  4. GDPR Article 32 — Security of processing
  5. EFF — Recommendations on the EU's Digital Fairness Act (May 2026)