The DSA's First Big Test: Why the X Case Will Define Europe's Intermediary Liability Era

Brussels' preliminary findings against X are less about one platform than about whether the Digital Services Act can regulate design without policing speech.

[Infographic — People of Internet Research: 6% max DSA fine (share of global annual turnover) · 3 alleged breaches (blue checks, ad repository, researcher access) · 45M monthly active EU users triggers VLOP designation · DSA obligations for VLOPs in force since August 2023]

The European Commission's preliminary findings against X — first issued in July 2024 and now grinding through the next phase of formal proceedings — have become the defining stress test of the Digital Services Act (DSA). For the first time since the DSA's obligations on Very Large Online Platforms (VLOPs) took effect in August 2023, Brussels is attempting to translate the regulation's sweeping text into a concrete enforcement decision against a single company. How the Commission handles this case will shape the next decade of intermediary liability law in Europe — and, by extension, globally.

What the Commission Actually Alleges

The preliminary findings, announced by then-Commissioner Thierry Breton, identified three areas where X is said to fall short of its DSA obligations:

  1. Deceptive design of the blue checkmark: because anyone can obtain the badge through a paid subscription, the Commission says it no longer signals verified status and misleads users about account authenticity.
  2. An inadequate advertising repository: X's ad transparency tool allegedly does not provide the searchable, reliable access the DSA requires.
  3. Barriers to researcher access to public data, contrary to the platform's obligations, including prohibitive conditions on access to its data.

Critically, none of these findings target what X hosts. They target how the platform is designed and how transparent it is about its own operations. That distinction matters enormously.

A Welcome Shift Away From Speech Policing

For all the heat the DSA has generated, the X case demonstrates something the regulation's defenders have long argued: at its best, the DSA is not a content-moderation mandate. It is a transparency and due-process regime. The Commission is not telling X what posts to remove. It is asking the company to be honest about who is verified, who pays for ads, and how outsiders can audit its systemic effects.

That is a legitimate and proportionate use of intermediary liability law. The pre-DSA debate in Europe often slid toward 'notice-and-staydown' obligations and aggressive removal mandates that risked over-blocking lawful speech. The DSA — at least on paper — moved the centre of gravity to procedural fairness, risk assessments and researcher access. The X proceeding is the first real chance to anchor that interpretation in case law.

Where the Commission Risks Overreach

That does not mean the case is without dangers. Three in particular deserve scrutiny from anyone who cares about a competitive, innovative internet.

1. Design choices are not always deception

The Commission's 'dark pattern' theory on blue checks is novel. Reasonable people can disagree about whether monetising verification badges is misleading or simply a different — and openly disclosed — product choice. Treating any departure from a prior industry convention as a deceptive design risks freezing platform UX in 2022 amber. Regulators should distinguish between hidden manipulation and visible, contested product decisions.

2. Process risks for non-incumbents

X is a uniquely scrutinised platform, but the precedents set here will apply to every VLOP — including smaller European challengers that may one day cross the 45-million-user threshold. If the Commission can pursue multi-year proceedings with fines of up to 6% of global turnover over interface design, smaller entrants will rationally route around the EU or remain below the VLOP threshold. That is the opposite of what European digital sovereignty advocates say they want.

3. Researcher access must be operationalised, not litigated

Article 40 is one of the DSA's most promising provisions, but the implementing delegated act — adopted only in 2025 — leaves significant uncertainty around vetting, costs and liability. Pressing infringement findings over inadequate access while the framework is still being defined puts the cart before the horse. The Commission and EDMO should publish clear technical templates before any final infringement decision lands.

What a Proportionate Outcome Looks Like

A sensible resolution would lock in the case's transparency wins while avoiding speech-adjacent escalation. That means:

  1. Requiring a functioning, searchable ad repository and honest labelling of what a blue checkmark actually signifies.
  2. Finalising clear technical templates for Article 40 researcher access before any final infringement decision lands.
  3. Articulating a narrow dark-pattern standard that distinguishes hidden manipulation from visible, openly disclosed product choices.
  4. Calibrating any fine to the transparency breaches themselves, not to the platform's broader editorial controversies.

The DSA was sold to European citizens as a rulebook that would make platforms more accountable without turning Brussels into a speech regulator. The X case is the moment that promise is tested. If the Commission lands a clean, narrowly drawn enforcement decision focused on design transparency and researcher access, the DSA will emerge stronger and the open internet will be better for it. If the case drifts into punishing a platform for being editorially unpopular, every future enforcement action — against TikTok, Meta, Temu or a future European champion — will inherit that political baggage.

Intermediary liability done right is boring, procedural and predictable. That is exactly what Europe should aim for here.

Sources & Citations

  1. European Commission: Preliminary findings on X under the DSA (July 2024)
  2. Digital Services Act — full text (Regulation 2022/2065)
  3. Reuters: EU accuses X of breaching online content rules
  4. European Commission: Formal proceedings opened against X (December 2023)