
Brussels vs. X: Why the DSA's First Big Test Risks Codifying Regulation by Design Choice

The Commission's preliminary findings against X over blue checks, ad transparency, and researcher access mark a defining moment for EU intermediary liability.

[Infographic: The DSA Test Case — 6% max DSA fine (of a platform's global annual turnover); 3 preliminary breach areas (dark patterns, ad transparency, and researcher access); 5+ DSA proceedings open (including X, TikTok, Meta, Temu, and AliExpress); ~25 designated very large online platforms]

Key Takeaways

The European Commission's ongoing Digital Services Act (DSA) enforcement action against X is no longer a procedural skirmish. With preliminary findings already issued on three distinct alleged breaches — dark patterns around the blue verification checkmark, advertising transparency shortcomings, and inadequate researcher data access — the case has become the first real stress test of the EU's flagship platform law. The outcome will shape not just X's compliance posture but the practical meaning of intermediary liability across every Very Large Online Platform (VLOP) operating in Europe.

That the stakes are high is an understatement. Under Article 74 of the DSA, the Commission can impose fines of up to 6% of a platform's total worldwide annual turnover for non-compliance. For X, even on conservative revenue estimates, that is a material number. More importantly, the precedents set in this case will define the floor for how the Commission interprets concepts that the text of the DSA left deliberately open-ended.
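To make the 6% ceiling concrete, here is a minimal arithmetic sketch. The revenue figure used is purely hypothetical and for illustration only; X's actual turnover is not stated in this analysis.

```python
# Sketch of the DSA Article 74 fine ceiling: fines of up to 6% of a
# platform's total worldwide annual turnover for non-compliance.
DSA_MAX_FINE_RATE = 0.06  # Article 74(1) cap

def max_dsa_fine(global_annual_turnover_eur: float) -> float:
    """Return the maximum fine the Commission could impose under the cap."""
    return DSA_MAX_FINE_RATE * global_annual_turnover_eur

# Hypothetical platform with EUR 2.5 billion in annual turnover (illustrative only)
hypothetical_turnover = 2.5e9
print(f"Max fine: EUR {max_dsa_fine(hypothetical_turnover):,.0f}")
# A 6% cap on EUR 2.5bn of turnover works out to EUR 150,000,000
```

Even for a mid-sized VLOP, the ceiling quickly reaches nine figures, which is why the proportionality of any eventual fine matters as much as the finding itself.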

What the Commission Has Actually Alleged

The preliminary findings, first announced by the Commission in July 2024, cluster around three distinct issues:

  1. Dark patterns: the blue verification checkmark, available to anyone willing to pay, allegedly deceives users about the authenticity of accounts.
  2. Advertising transparency: X's ad repository allegedly falls short of the DSA's requirement for a searchable and reliable database of advertisements.
  3. Researcher data access: X allegedly obstructs eligible researchers from accessing its public data, including through prohibitive API pricing.

X disputes each finding and is entitled to respond before any final non-compliance decision. None of the alleged breaches has been finally adjudicated.

Why the Blue Checkmark Question Matters Beyond X

Of the three issues, the dark patterns charge is the most legally novel — and the most consequential for the wider sector. The Commission is, in effect, telling a platform that a particular design choice constitutes a regulatory violation. That is a significant step. Dark patterns are addressed in Article 25 of the DSA, but the provision is framed in broad terms about manipulative interface design. Reading it to prohibit a specific monetisation feature is an interpretive choice with downstream consequences.

There is a perfectly reasonable consumer-protection argument here: paid verification that masquerades as identity verification can mislead users, particularly during elections or crises. But there is also a less reasonable risk — that EU enforcers begin substituting their product-design preferences for those of the platforms themselves, with finality conferred by the threat of multi-billion-euro fines. The DSA was sold to lawmakers as a transparency and due-process instrument, not a product regulator. Conflating the two would mark a meaningful expansion of intermediary liability.

The Researcher Access Fight Is the One to Watch

The most clearly pro-innovation element of the DSA is its researcher access regime. Article 40 was designed to break open the black box of platform governance by giving vetted academics structured access to platform data. This is the kind of evidence-generating mechanism that good regulation depends on. If the Commission's case here succeeds, every VLOP will have a much clearer benchmark for what compliant data access looks like — and a strong incentive to standardise it rather than litigate.

The Commission's objection to X allegedly charging researchers significant sums for API access is well grounded in the text. Article 40's purpose is undermined if access is technically available but commercially prohibitive. A clean ruling on this point would be genuinely useful: it would clarify a thin part of the law without expanding intermediary liability in a way that chills product experimentation.

What Proportionate Enforcement Should Look Like

The EU has consistently said that the DSA is not about content moderation outcomes, but about systemic processes — risk assessments, transparency reporting, data access, and dispute resolution. That distinction is the firewall protecting freedom of expression from creeping regulatory pressure on platform speech rules. Preserving it requires discipline at the enforcement stage.

A 6% global turnover fine is a regulatory sledgehammer. Using it sparingly, and for clear process failures rather than contested design judgments, is what will determine whether the DSA is remembered as a transparency law or as a backdoor product regulator.

Three principles should guide the Commission as the X case advances:

  1. Target process failures, not design judgments: enforcement should rest on clear breaches of the DSA's procedural obligations, not on contested product-design choices.
  2. Keep remedies proportionate: the 6% turnover fine should remain a last resort, reserved for sustained or bad-faith non-compliance.
  3. Preserve the firewall: enforcement must stay on the systemic-process side of the line the EU itself has drawn, avoiding creeping pressure on platform speech rules.

The Bigger Picture

The X case is the first of several DSA enforcement tracks the Commission has opened, with proceedings also underway against TikTok, Meta, Temu, and AliExpress. How Brussels lands the X case — both the substance of the findings and the proportionality of the remedies — will frame the next decade of intermediary liability in Europe. A balanced outcome could entrench the DSA as the world's most credible platform-governance regime. An overreach could entrench the opposite: a perception that the EU's instinct is to fine first and reason later, with knock-on consequences for European platform investment and innovation. Getting this one right matters far beyond X.

Sources & Citations

  1. European Commission: Preliminary findings on X under the DSA (July 2024)
  2. Digital Services Act — full text (EUR-Lex)
  3. European Commission: Supervision of the designated VLOPs and VLOSEs under the DSA
  4. Reuters: EU accuses Elon Musk's X of breaching tech rules