The Quiet Pillar Holding Up Europe's Internet
When Americans debate "Section 230," they usually mean the 26-word liability shield in §230(c)(1) of the U.S. Communications Decency Act. Europe has its own version, embedded in Article 6 of the Digital Services Act (Regulation (EU) 2022/2065) and inherited almost word-for-word from Article 14 of the e-Commerce Directive (2000/31/EC). Both rules say essentially the same thing: a hosting service is not liable for illegal content posted by users unless it has actual knowledge of that content and then fails to act expeditiously to remove it. Both rules are quietly responsible for the existence of nearly every consumer-facing internet service Europeans use.
That foundation is now under sustained pressure on both sides of the Atlantic. In the United States, courts continue to probe the limits of Section 230 — most prominently in Moody v. NetChoice (2024), where the Supreme Court recognised platform content curation as protected editorial expression while remanding the facial challenges to state "must-carry" laws. In Europe, the Commission's enforcement of the DSA against very large online platforms (VLOPs), including the formal proceedings opened against X in December 2023, is now defining what "expeditious removal" and "diligent" content moderation mean in practice.
What Article 6 Actually Says — and Doesn't
The DSA preserved the e-Commerce Directive's conditional liability exemption with only minor modernisation. Article 6(1) shields hosting providers from liability for user content if they (a) lack actual knowledge of illegality and (b), upon obtaining such knowledge, act expeditiously to remove or disable access to it. Crucially, Article 8 retains the prohibition on general monitoring obligations — Member States cannot require platforms to scan every upload pre-emptively for unlawful content.
This is the load-bearing wall. Remove it and the architecture of European hosting — from independent forums and Mastodon instances to Booking.com, Vinted, and user-uploaded podcasts — becomes commercially impossible. Smaller actors in particular cannot absorb the litigation cost of being treated as primary publishers of every user post.
Three Pressure Points in 2026
Stay-down obligations. The Court of Justice's 2019 ruling in Glawischnig-Piesczek v. Facebook Ireland (Case C-18/18) opened the door to injunctions requiring removal of "identical or equivalent" content, potentially worldwide, once a platform is on notice. The DSA codified specific notice-and-action rules in Article 16, but the operational meaning of "equivalent" remains contested. Over-broad equivalence orders shade into general monitoring by the back door and risk chilling lawful speech.
Algorithmic curation as editorial conduct. A growing line of argument — visible in some national enforcement actions — claims that ranked or recommended content falls outside Article 6 because the platform is "actively" presenting it. This conflates display with authorship. Recital 22 of the DSA was deliberately drafted to clarify that mere ranking, indexing or display does not, by itself, defeat the hosting exemption. Courts and regulators should resist redrawing that line through soft-law guidance.
Risk-based duties bleeding into liability. Articles 34 and 35 of the DSA require VLOPs to assess and mitigate "systemic risks" — enforced by fines of up to 6% of global turnover. These are due-diligence obligations, not a back door to user-by-user liability. Treating a Commission finding of "insufficient mitigation" as automatic civil liability for every underlying post would punish platforms twice for the same content and erase the regulation's careful separation between conduct and content.
What a Proportionate Path Looks Like
The DSA Transparency Database, live since September 2023, has already received billions of statements of reasons for content moderation decisions — a volume of disclosure with no equivalent anywhere else in the world. That data should drive any future reform: evidence first, statute second.
- Preserve Articles 6 and 8 as bright lines. Conditional immunity plus no general monitoring is not a loophole; it is the precondition for an open internet that includes European voices.
- Tier obligations to capacity. The DSA's VLOP/VLOSE designation already does this. Resist extending VLOP-grade duties to small and mid-size hosts where compliance cost simply closes the market to new entrants.
- Test "stay-down" narrowly. Equivalence orders should require judicial review of scope, clear time limits, and human-readable definitions. Algorithmic injunctions enforced ex ante are general monitoring by another name.
- Keep liability and due diligence separate. Articles 34–35 are administrative oversight; Article 6 is civil liability. Conflating them would re-litigate a settlement the co-legislators explicitly chose.
The Stakes
Europe is not short of regulatory firepower. Between the DSA, the Digital Markets Act, the AI Act, GDPR and the Cyber Resilience Act, the compliance perimeter for online services is now the densest in the world. The live policy question is not whether to regulate platforms — that battle is over — but whether to preserve the narrow but essential rule that hosts of user speech are not, by default, the speakers. That is the European Section 230. It is worth defending precisely because most people never notice it is there.