Few files in Brussels have been reopened, redrafted and rejected as often as the European Commission's Child Sexual Abuse Regulation (CSAR) — the proposal critics have nicknamed Chat Control. More than three years after it was first tabled in May 2022, EU member states remain split over whether messaging providers should be compelled to scan the contents of private chats, including those protected by end-to-end encryption, for illegal material. Layered on top of that fight, the Commission's ProtectEU internal security strategy, unveiled in April 2025, signals a parallel ambition: a new framework for 'lawful access' to encrypted communications by 2026. Together, these initiatives represent the most consequential test of European digital rights in a decade — and a serious risk to the security architecture that underpins the single market.
What is actually on the table
The CSAR text would empower national authorities to issue 'detection orders' requiring interpersonal communications services — from WhatsApp and Signal to iMessage and Telegram — to identify known and unknown child sexual abuse material (CSAM) and, in earlier drafts, grooming behaviour. Because end-to-end encryption prevents server-side scanning, the only technically feasible compliance route is client-side scanning (CSS): inspecting content on the user's own device before it is encrypted, and flagging matches to authorities.
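To make the mechanism concrete, here is a minimal sketch with invented names, using an exact SHA-256 hash to stand in for the perceptual hashes (such as PhotoDNA or NeuralHash) that real deployments use so that re-encoded copies still match. The essential point it illustrates is architectural: the check runs on the plaintext, on the device, before any encryption happens at all.

```python
import hashlib

# Hypothetical blocklist of known-material hashes pushed to the device.
# Real systems use perceptual hashes so near-duplicates still match;
# an exact hash is used here only to keep the sketch self-contained.
KNOWN_HASHES = {
    hashlib.sha256(b"example-flagged-bytes").hexdigest(),
}

def matches_blocklist(attachment: bytes) -> bool:
    """Check the plaintext against the blocklist on-device,
    before any encryption has taken place."""
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES

def send(attachment: bytes) -> None:
    if matches_blocklist(attachment):
        print("match -> would be flagged to authorities")
    # Only after the scan does the normal end-to-end pipeline run.
    # The cipher itself is untouched, which is why proponents say CSS
    # 'preserves' encryption and critics say it hollows it out.
    print(f"encrypting and transmitting {len(attachment)} bytes")

send(b"example-flagged-bytes")   # triggers the (toy) match
send(b"ordinary holiday photo")  # passes through unflagged
```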
ProtectEU, presented by Commissioner Magnus Brunner in April 2025, goes further. It commits the Commission to producing a 'Technology Roadmap on encryption' and exploring legislative options for lawful access to data — a phrase that in security circles is widely read as a euphemism for mandated backdoors or key escrow. The strategy builds on the work of the High-Level Group on Access to Data for Effective Law Enforcement, established in 2023, whose 42 recommendations included pressuring providers to ensure communications remain 'accessible' to authorities.
The security case against client-side scanning
No serious cryptographer disputes that CSAM is a horrific crime and that platforms must do more to combat it. The disagreement is over means, not ends. 'Bugs in Our Pockets', a 2021 paper by fourteen of the world's leading cryptographers — including Ross Anderson, Whitfield Diffie, Bruce Schneier and Ron Rivest — concluded that client-side scanning 'creates serious security and privacy risks for all society' and that the technology cannot be limited to a single use case. Once a scanning infrastructure exists on a billion devices, the pressure to expand the target list — from CSAM to terrorism, to copyright, to political content — is structural rather than hypothetical.
Europe has seen this story before. Apple announced a CSAM detection system in 2021 and abandoned it the following year, after researchers demonstrated hash collisions against its NeuralHash algorithm and after the company itself concluded that the architecture would create new surveillance vectors that hostile actors would inevitably exploit. Signal President Meredith Whittaker has repeatedly stated that the company would withdraw from any market that legally mandated the breaking of its encryption. Threema and Tutanota, both EU-based, have echoed the warning.
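The collision problem is not an implementation bug that Apple could have patched; it follows from what a perceptual hash is. Because the function must map visually similar inputs to the same value, it is also possible to craft dissimilar inputs that share a value. The deliberately crude average-brightness hash below is hypothetical and far simpler than NeuralHash, but it exhibits the same failure mode in miniature: two quite different 'images' collide, so a match proves far less than an investigator might assume.

```python
def average_hash(pixels: list[int]) -> int:
    """Toy perceptual hash: map an 8-pixel greyscale 'image' to one bit
    per pixel, 1 if the pixel is brighter than the mean, else 0."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

flagged = [200, 10, 220, 15, 210, 12, 230, 18]  # high-contrast 'image'
crafted = [130, 90, 140, 95, 135, 92, 145, 98]  # visually very different
assert average_hash(flagged) == average_hash(crafted)  # both 0b10101010
print("collision:", bin(average_hash(flagged)))
```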
A legal collision course
The European Court of Human Rights closed off one path in Podchasov v. Russia (February 2024), ruling that mandates requiring providers to decrypt end-to-end encrypted communications violate Article 8 of the Convention. The European Data Protection Supervisor and the European Data Protection Board issued a joint opinion in 2022 calling CSAR's generalised scanning provisions disproportionate. The European Parliament's negotiating position, adopted in November 2023, explicitly excluded end-to-end encrypted services from detection orders — a red line that successive Council presidencies, from Belgium to Hungary to Poland, have tried to erase.
Civil society has been unusually unified. EDRi, Access Now, the Chaos Computer Club, La Quadrature du Net and a coalition of more than 80 organisations have warned that the regulation would amount to mass surveillance of private correspondence. They are joined, importantly, by the UN Special Rapporteur on Privacy, who concluded the proposal was incompatible with international human rights law.
A pro-innovation alternative
The choice is not between scanning everyone and doing nothing. Proportionate, evidence-based measures that have demonstrably reduced CSAM circulation include:
- Better-resourced national hotlines and Europol's EC3, which already coordinates take-downs of unencrypted hosted content.
- Targeted, judicially-authorised investigations against suspects, which remain lawful and effective under existing instruments such as the European Investigation Order.
- Metadata analysis and account-based signals — techniques that do not require breaking encryption and that already underpin most successful prosecutions (see the sketch after this list).
- Safety-by-design obligations under the Digital Services Act, which require platforms to assess systemic risks without prescribing the technical means.
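As an illustration of the metadata point, the hypothetical sketch below flags accounts purely from traffic data the provider already holds, in this case an unusually high fan-out of first-contact messages, without reading any message content. The field names and the threshold are invented for illustration, not drawn from any deployed system.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class MessageEvent:
    sender: str
    recipient: str
    is_first_contact: bool  # no prior conversation between the pair

def suspicious_senders(events: list[MessageEvent],
                       fanout_threshold: int = 50) -> list[str]:
    """Return senders whose first-contact fan-out meets the threshold.
    Only metadata is consulted; message bodies never appear here."""
    fanout = Counter(e.sender for e in events if e.is_first_contact)
    return [s for s, n in fanout.items() if n >= fanout_threshold]

# A toy trace: one account mass-messaging strangers, one chatting normally.
events = [MessageEvent("acct-42", f"user-{i}", True) for i in range(60)]
events += [MessageEvent("acct-7", "user-1", False) for _ in range(20)]
print(suspicious_senders(events))  # ['acct-42']
```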
This approach would preserve what is arguably Europe's most valuable digital export: a legal commitment to confidential communication. Banks, hospitals, journalists, lawyers, dissidents and ordinary citizens depend on the same encrypted rails. Compromising them to catch a sliver of offenders who will simply migrate to fringe tools is a poor security trade.
What Brussels should do
The Polish Council presidency closed in June 2025 without consensus; the Danish presidency that followed has signalled a fresh push. The right outcome is the one the European Parliament has already proposed: keep child-protection obligations focused on content that providers can lawfully access, fund victim-support and investigative capacity, and abandon the client-side scanning mandate. On ProtectEU, the Commission should publish its 'technology roadmap' transparently, subject it to independent cryptographic review, and rule out architectural backdoors. A Union that built the GDPR cannot credibly be the one that legislates away private communication.