EU intermediary liability

Brussels Draws First Blood Under Article 73: Why the X Preliminary Findings Will Define DSA Enforcement

The Commission's first formal Article 73 findings against X mark a milestone for DSA enforcement — but proportionality, due process, and innovation must guide what comes next.

[Infographic: The DSA's First Article 73 Case, By the Numbers (People of Internet Research) — 6%: maximum fine of global turnover available under the DSA; 3: DSA articles cited in the findings; ~29: months from probe to findings; 25+: VLOPs designated under the DSA, all facing the same enforcement framework.]

Key Takeaways

On May 6, 2026, the European Commission issued its first-ever formal preliminary findings under Article 73 of the Digital Services Act (DSA), concluding that X (formerly Twitter) has breached obligations on dark patterns in its verified-account system, advertising transparency, and data access for vetted researchers. The decision marks a watershed moment: nearly two years after opening proceedings in December 2023, the Commission has formally moved a Very Large Online Platform (VLOP) case from investigation to a quasi-adversarial enforcement phase. Now comes the harder question — not whether the DSA will be enforced, but how.

For a regulation premised on transparency and accountability rather than top-down content rules, this is exactly the kind of test case Brussels needed. It is also the kind that could quietly redefine what intermediary liability looks like in Europe for the next decade.

What the Commission Found

The preliminary findings, summarised in the Commission's press release, focus on three discrete obligations rather than content moderation outcomes:

  1. Dark patterns in the design of the verified-account system (Article 25)
  2. Advertising transparency obligations (Article 39)
  3. Data access for vetted researchers (Article 40)

Critically, the Commission did not find that X failed in its content moderation of illegal content — the politically charged terrain that dominates public debate. Article 73 is here being used to enforce process and transparency obligations, which is precisely the kind of proportionate, evidence-driven enforcement the DSA's defenders promised.

A Procedural Milestone, Not a Verdict

It is essential to be precise about what preliminary findings actually are. Under Article 73, they trigger X's right of defence: the company can access the file, respond in writing, and request an oral hearing before the Commission considers a non-compliance decision. Only after that adversarial phase can fines of up to 6% of global annual turnover be imposed under Article 74, alongside potential periodic penalty payments under Article 76.
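To make those penalty ceilings concrete, here is a minimal sketch of the Article 74 and Article 76 caps. The turnover figure is purely hypothetical and chosen for illustration — it is not X's actual turnover, and any real fine would be calibrated well below the statutory maximum:

```python
def max_dsa_fine(annual_turnover_eur: float) -> float:
    """Article 74 DSA: fines are capped at 6% of the provider's
    total worldwide annual turnover."""
    return 0.06 * annual_turnover_eur


def max_periodic_penalty_per_day(annual_turnover_eur: float) -> float:
    """Article 76 DSA: periodic penalty payments are capped at 5% of
    the average daily worldwide turnover, per day of non-compliance."""
    average_daily_turnover = annual_turnover_eur / 365
    return 0.05 * average_daily_turnover


# Hypothetical turnover of EUR 3.0 bn -- illustrative only.
turnover = 3_000_000_000
print(f"Max one-off fine:    EUR {max_dsa_fine(turnover):,.0f}")
print(f"Max daily penalty:   EUR {max_periodic_penalty_per_day(turnover):,.0f}")
```

On these assumed numbers the one-off ceiling would be EUR 180 million, with daily penalties accruing on top — which is why the proportionality questions below matter so much in practice.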

This sequencing matters. The DSA was designed to be a regulation by structured cooperation, not a content tribunal. Issuing findings is not the same as issuing fines — and treating the May 6 announcement as a foregone conclusion risks normalising a presumption-of-guilt posture that is incompatible with EU administrative due process. The Commission itself acknowledged that "X now has the rights of defence."

Why Proportionality Will Be the Real Test

From a pro-innovation standpoint, three issues should shape what comes next:

1. Calibrate remedies to actual harm

Dark-pattern findings on a verification badge are qualitatively different from systemic risks of large-scale illegal content. Any remedy — and any fine — should reflect that. A maximum penalty applied to a transparency-shaped breach would distort incentives across the entire VLOP ecosystem and chill experimentation with monetisation models that, while imperfect, are part of how platforms remain financially viable outside the surveillance-advertising paradigm the EU itself has criticised.

2. Don't let researcher-access enforcement become a backdoor liability regime

Article 40 is one of the DSA's most important innovations — opening platform data to vetted researchers is a public good. But the implementing rules, finalised in 2025, are still bedding in. Aggressive enforcement before the technical and contractual norms have stabilised could push platforms toward minimal-compliance APIs that are technically open but practically useless. Brussels and platforms should agree on baseline reference architectures rather than litigate each access dispute.

3. Preserve the firewall between conduct rules and speech rules

The strongest argument for the DSA — and the reason it deserved cautious support over more interventionist alternatives — was that it regulates systems rather than speech. The Commission's decision to anchor its first Article 73 case in dark patterns, ad transparency, and researcher access keeps that firewall intact. Future cases must do the same. Slipping into adjudication of moderation outcomes would re-open the intermediary liability settlement that Article 6 DSA (the safe harbour, inherited from the e-Commerce Directive) was meant to preserve.

The Global Signal

Other jurisdictions are watching closely. India is finalising its Digital India Act; Brazil's PL 2630 debate is reawakening; the UK's Online Safety Act is in its first enforcement cycle under Ofcom. If Europe demonstrates that a transparency-led regime can be enforced firmly but proportionately — with fines calibrated to harm and procedures honouring the rights of defence — the DSA will become an exportable template. If, instead, the X case becomes a maximalist precedent, the message global regulators take home will be very different: that the path of least resistance is content control by another name.

The DSA's credibility hinges on what happens between now and the final decision. Enforcement, yes. Proportionality, always. And a clear-eyed recognition that an open internet still depends on intermediaries that can take risks, evolve their products, and answer to law — but not be governed by it line by line.

Sources & Citations

  1. European Commission — Preliminary findings under DSA against X (Article 73)
  2. Digital Services Act — Regulation (EU) 2022/2065 (full text)
  3. European Commission — DSA enforcement and supervision overview
  4. European Commission — Proceedings opened against X (December 2023)
  5. Delegated Regulation on data access for vetted researchers (Article 40 DSA)