Europe Has Its Own Section 230 — and It's Under Quiet Pressure

Article 6 of the EU's Digital Services Act is Europe's quiet equivalent of Section 230, and it deserves more defenders before the next reform cycle.

[Infographic: Europe's Intermediary Liability Stack at a Glance — 6%: max DSA fine as a share of a platform's global annual turnover; 0: general monitoring duties allowed (Article 8 DSA preserves the e-Commerce prohibition); 26: years of conditional safe harbor since the e-Commerce Directive (2000); 25+: VLOPs and VLOSEs designated, the only tier subject to systemic-risk duties.]

The Quiet Pillar Holding Up Europe's Internet

When Americans debate "Section 230," they usually mean the 26-word liability shield in §230(c)(1) of the U.S. Communications Decency Act. Europe has its own version, embedded in Article 6 of the Digital Services Act (Regulation (EU) 2022/2065) and inherited almost word-for-word from Article 14 of the e-Commerce Directive (2000/31/EC). Both rules say the same essential thing: a hosting service is not liable for illegal content posted by users until it has actual knowledge of that content and fails to act expeditiously. Both rules are quietly responsible for the existence of nearly every consumer-facing internet service Europeans use.

That foundation is now under sustained pressure on both sides of the Atlantic. In the United States, courts continue to probe the limits of Section 230 — most prominently in Moody v. NetChoice (2024), where the Supreme Court vacated the lower-court rulings on state "must-carry" laws while affirming that a platform's curation of user content is protected editorial activity. In Europe, the Commission's enforcement of the DSA against very large online platforms (VLOPs), including the formal proceedings opened against X in December 2023, is now defining what "expeditious removal" and "diligent" content moderation mean in practice.

What Article 6 Actually Says — and Doesn't

The DSA preserved the e-Commerce Directive's conditional liability exemption with only minor modernisation. Article 6(1) shields hosting providers from liability for user content if they (a) lack actual knowledge of illegality and (b) act expeditiously once notified. Crucially, Article 8 retains the prohibition on general monitoring obligations — Member States cannot require platforms to scan every upload pre-emptively for unlawful content.
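The two conditions in Article 6(1) form a simple decision procedure, which can be sketched in code. This is an illustrative model only: the class, field names, and the 24-hour threshold are invented for exposition — the DSA sets no fixed deadline, and what counts as "expeditious" is context-dependent.

```python
from dataclasses import dataclass

@dataclass
class HostedItem:
    notified_illegal: bool     # has the host gained actual knowledge of illegality?
    hours_since_notice: float  # time elapsed since that knowledge arose
    removed: bool              # has the host removed or disabled access to the item?

def article_6_shield_applies(item: HostedItem, expeditious_hours: float = 24.0) -> bool:
    """Rough sketch of the Article 6(1) DSA conditional exemption.

    The host keeps the liability shield if it either (a) lacks actual
    knowledge of the illegal content, or (b) acted expeditiously to
    remove it once notified. The hour threshold is a placeholder, not
    a figure from the regulation.
    """
    if not item.notified_illegal:
        return True   # condition (a): no actual knowledge
    if item.removed and item.hours_since_notice <= expeditious_hours:
        return True   # condition (b): expeditious action after notice
    return False      # actual knowledge plus no timely action: shield lost
```

The key point the sketch makes visible: liability exposure begins only at the moment of actual knowledge, which is why the definition of a valid notice (Article 16) matters so much.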

This is the load-bearing wall. Remove it and the architecture of European hosting — from independent forums and Mastodon instances to Booking.com, Vinted, and user-uploaded podcasts — becomes commercially impossible. Smaller actors in particular cannot absorb the litigation cost of being treated as primary publishers of every user post.

Three Pressure Points in 2026

Stay-down obligations. The Court of Justice's 2019 ruling in Glawischnig-Piesczek v. Facebook Ireland (Case C-18/18) opened the door to injunctions requiring removal of "identical or equivalent" content, potentially worldwide, once a platform is on notice. The DSA codified specific notice-and-action rules in Article 16, but the operational meaning of "equivalent" remains contested. Over-broad equivalence orders shade into general monitoring by the back door and risk chilling lawful speech.
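Article 16(2) lists the elements a notice must contain before it can generate "actual knowledge": a substantiated explanation, the exact electronic location of the content, the notifier's name and email address (waived for notices concerning certain child sexual abuse offences), and a good-faith statement. A minimal validation sketch, with invented field names — the DSA prescribes content, not a schema:

```python
def is_valid_article16_notice(notice: dict) -> bool:
    """Checks for the elements Article 16(2) DSA requires in a notice.

    Field names are hypothetical; real intake systems define their own.
    """
    required = [
        "explanation",          # (a) sufficiently substantiated reasons
        "exact_location",       # (b) e.g. the exact URL(s) of the content
        "good_faith_statement", # (d) bona fide belief the notice is accurate
    ]
    has_required = all(notice.get(k) for k in required)
    # (c) notifier identity -- not required for notices about certain
    # offences (e.g. child sexual abuse material), which may be anonymous
    has_identity = bool(notice.get("notifier_email")) or bool(
        notice.get("anonymous_offence_exception")
    )
    return has_required and has_identity
```

A notice failing these elements does not, on its own, put the platform on notice — which is why over-broad "equivalent content" injunctions are so consequential: they attach liability without any fresh notice at all.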

Algorithmic curation as editorial conduct. A growing line of argument — visible in some national enforcement actions — claims that ranked or recommended content falls outside Article 6 because the platform is "actively" presenting it. This conflates display with authorship. Recital 22 of the DSA was deliberately drafted to clarify that mere ranking, indexing or display does not, by itself, defeat the hosting exemption. Courts and regulators should resist redrawing that line through soft-law guidance.

Risk-based duties bleeding into liability. Articles 34 and 35 of the DSA require VLOPs to assess and mitigate "systemic risks" — enforced by fines of up to 6% of global turnover. These are due-diligence obligations, not a back door to user-by-user liability. Treating a Commission finding of "insufficient mitigation" as automatic civil liability for every underlying post would punish platforms twice for the same content and erase the regulation's careful separation between conduct and content.
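The scale of that enforcement lever is easy to state concretely. Under Article 74(1), fines are capped at 6% of worldwide annual turnover in the preceding financial year — the turnover figure below is a hypothetical for illustration:

```python
def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Upper bound on a DSA fine under Article 74(1): 6% of the
    provider's worldwide annual turnover in the preceding year."""
    return 0.06 * global_annual_turnover_eur

# Hypothetical platform with EUR 50 billion in global annual turnover:
# the fine ceiling is EUR 3 billion.
ceiling = max_dsa_fine(50e9)
print(f"EUR {ceiling:,.0f}")  # EUR 3,000,000,000
```

The point of the arithmetic is the asymmetry: a single due-diligence finding can support a fine of this magnitude, which is precisely why collapsing the systemic-risk regime into per-post civil liability would double-count the same conduct.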

What a Proportionate Path Looks Like

The DSA Transparency Database, live since September 2023, has already received billions of statements of reasons for content moderation decisions — a volume of disclosure with no equivalent anywhere else in the world. That data should drive any future reform: evidence first, statute second.

The Stakes

Europe is not short of regulatory firepower. Between the DSA, the Digital Markets Act, the AI Act, GDPR and the Cyber Resilience Act, the compliance perimeter for online services is now the densest in the world. The live policy question is not whether to regulate platforms — that battle is over — but whether to preserve the narrow but essential rule that hosts of user speech are not, by default, the speakers. That is the European Section 230. It is worth defending precisely because most people never notice it is there.

Sources & Citations

  1. Digital Services Act — Regulation (EU) 2022/2065 (full text on EUR-Lex)
  2. e-Commerce Directive 2000/31/EC (EUR-Lex)
  3. CJEU, Glawischnig-Piesczek v. Facebook Ireland, Case C-18/18 (2019)
  4. U.S. Supreme Court, Moody v. NetChoice (2024)
  5. European Commission opens formal DSA proceedings against X (Dec 2023)
  6. DSA Transparency Database (statements of reasons)