The European Commission's ongoing Digital Services Act (DSA) enforcement action against X is no longer a procedural skirmish. With preliminary findings already issued on three distinct alleged breaches — dark patterns around the blue verification checkmark, advertising transparency shortcomings, and inadequate researcher data access — the case has become the first real stress test of the EU's flagship platform law. The outcome will shape not just X's compliance posture but the practical meaning of intermediary liability across every Very Large Online Platform (VLOP) operating in Europe.
That the stakes are high is an understatement. Under Article 74 of the DSA, the Commission can impose fines of up to 6% of a platform's global annual turnover for non-compliance. For X, even on conservative revenue estimates, that is a material number. More importantly, the precedents set in this case will define the floor for how the Commission interprets concepts that the text of the DSA left deliberately open-ended.
What the Commission Has Actually Alleged
The preliminary findings, first announced by the Commission in July 2024, cluster around three distinct issues:
- Dark patterns and the blue checkmark. The Commission has taken the view that X's paid verification system — where any user can buy a blue tick — deceives users who reasonably associate the mark with authenticated identity, an association carried over from the pre-2022 verification regime.
- Advertising transparency. The Commission alleges X's ad repository does not meet the DSA's requirement that VLOPs maintain a searchable, comprehensive database enabling researchers and the public to scrutinise online advertising.
- Researcher data access. X is alleged to have failed to provide vetted researchers with the access required under Article 40 of the DSA, including by imposing what the Commission considers prohibitive API pricing.
X disputes each finding and is entitled to respond before any final non-compliance decision. None of the alleged breaches has been finally adjudicated.
Why the Blue Checkmark Question Matters Beyond X
Of the three issues, the dark patterns charge is the most legally novel — and the most consequential for the wider sector. The Commission is, in effect, telling a platform that a particular design choice constitutes a regulatory violation. That is a significant step. Dark patterns are addressed in Article 25 of the DSA, but the provision is framed in broad terms about manipulative interface design. Reading it to prohibit a specific monetisation feature is an interpretive choice with downstream consequences.
There is a perfectly reasonable consumer-protection argument here: paid verification that masquerades as identity verification can mislead users, particularly during elections or crises. But there is also a less reasonable risk — that EU enforcers begin substituting their product-design preferences for those of the platforms themselves, with finality conferred by the threat of multi-billion-euro fines. The DSA was sold to lawmakers as a transparency and due-process instrument, not a product regulator. Conflating the two would mark a meaningful expansion of intermediary liability.
The Researcher Access Fight Is the One to Watch
The most clearly pro-innovation reading of the DSA is its researcher access regime. Article 40 was designed to break open the black box of platform governance by giving vetted academics structured access to platform data. This is the kind of evidence-generating mechanism that good regulation depends on. If the Commission's case here succeeds, every VLOP will have a much clearer benchmark for what compliant data access looks like — and a strong incentive to standardise it rather than litigate.
The Commission's objection to X allegedly charging researchers significant sums for API access is well grounded in the text. Article 40's purpose is undermined if access is technically available but commercially prohibitive. A clean ruling on this point would be genuinely useful: it would clarify an under-specified part of the law without expanding intermediary liability in a way that chills product experimentation.
What Proportionate Enforcement Should Look Like
The EU has consistently said that the DSA is not about content moderation outcomes, but about systemic processes — risk assessments, transparency reporting, data access, and dispute resolution. That distinction is the firewall protecting freedom of expression from creeping regulatory pressure on platform speech rules. Preserving it requires discipline at the enforcement stage.
A 6% global turnover fine is a regulatory sledgehammer. Using it sparingly, and for clear process failures rather than contested design judgments, is what will determine whether the DSA is remembered as a transparency law or as a backdoor product regulator.
Three principles should guide the Commission as the X case advances:
- Severity must match the breach. Researcher access failures and missing ad repository fields are clear, technical, and remediable. Penalties should reflect that.
- Design choices deserve dialogue, not diktat. Where the issue is whether a particular feature crosses into manipulative territory, structured guidance and remediation timelines are more legitimate than headline fines.
- Predictability beats deterrence. VLOPs need to know what compliant looks like. A reasoned decision with concrete remediation requirements does more for the ecosystem than a record-breaking penalty.
The Bigger Picture
The X case is the first of several DSA enforcement tracks the Commission has opened, with proceedings also underway against TikTok, Meta, Temu, and AliExpress. How Brussels lands the X case — both the substance of the findings and the proportionality of the remedies — will frame the next decade of intermediary liability in Europe. A balanced outcome could entrench the DSA as the world's most credible platform-governance regime. An overreach could entrench the opposite: a perception that the EU's instinct is to fine first and reason later, with knock-on consequences for European platform investment and innovation. Getting this one right matters far beyond X.