Brussels Doubles Down on Dark Patterns: Will the Digital Fairness Act Improve on the DSA — or Just Duplicate It?

The EU's proposed Digital Fairness Act targets manipulative design, but layering new rules on top of the DSA and UCPD risks regulatory overlap without proportionate gains.

The EU's Dark Pattern Regulatory Stack, at a glance: the Commission's 2022 study of 75 e-commerce websites and apps found dark patterns on roughly 97% of them; four EU instruments (the DSA, UCPD, GDPR, and Consumer Rights Directive) already engage the practice; CNIL's combined 2022 cookie-pattern fines against Google and Facebook total €210 million; and 25 designated VLOPs and VLOSEs sit under the DSA's heaviest obligations.

Key Takeaways

The European Commission's push to advance a Digital Fairness Act (DFA) marks the EU's third major intervention in less than five years against so-called dark patterns — manipulative interface choices that nudge users toward decisions they would not freely make. Building on Article 25 of the Digital Services Act (DSA) and the Commission's 2024 Digital Fairness Fitness Check, the DFA promises a horizontal regime covering addictive design, personalised manipulation, and influencer marketing. The intent is easy to sympathise with; the execution is where things get harder.

From a pro-innovation perspective, the question is not whether dark patterns are a problem — they plainly are — but whether yet another instrument is the right tool. The EU already prohibits manipulative interfaces under at least three overlapping regimes. Adding a fourth without carefully delineating its scope risks legal uncertainty, compliance duplication, and a chilling effect on legitimate design experimentation.

What the DFA Is Trying to Fix

The Commission's Digital Fairness Fitness Check, published in 2024, concluded that the existing consumer-protection acquis — the Unfair Commercial Practices Directive (UCPD), the Consumer Rights Directive, and the Unfair Contract Terms Directive — leaves "problematic practices" inadequately covered. Specifically, the Commission flagged:

  - dark patterns and manipulative interface design;
  - addictive and attention-capturing design features;
  - personalised practices that target consumers' inferred vulnerabilities;
  - opaque influencer marketing.

The DFA, expected as a formal proposal in 2026, would establish substantive rules in these areas across all online services, not just the largest platforms already covered by DSA Article 25.

The Overlap Problem

The DSA's Article 25 already prohibits providers of online platforms from designing interfaces that "deceive or manipulate" users or otherwise materially distort their decision-making. The UCPD's prohibitions on "misleading actions" and "aggressive commercial practices", in force since 2005, give national authorities a general tool against manipulative design. France's CNIL, acting under the national transposition of the ePrivacy Directive, fined Google and Facebook a combined €210 million in 2022 over cookie-consent dark patterns. And the GDPR requires that consent be "freely given", which the European Data Protection Board has interpreted to bar manipulative consent flows.

The Commission's own 2022 behavioural study of 75 e-commerce websites and apps found that approximately 97% deployed at least one dark pattern. The data justifies action, but it also raises an awkward question: if dark patterns are nearly universal despite several overlapping layers of existing regulation, the answer may be better enforcement of current rules, not another instrument.

Where the DFA Could Add Value

The DFA is most defensible where it tackles genuinely novel harms that current law strains to reach. Personalised manipulation — interfaces dynamically tuned to an individual user's inferred vulnerabilities — sits awkwardly under the UCPD's "average consumer" benchmark. Addictive design features aimed at minors raise concerns that the GDPR's rules on children's consent (Article 8) and the DSA's Article 28 protections for minors only partially address. A targeted, narrowly drafted instrument focused on these gaps could meaningfully improve the regulatory landscape.

The risk, however, is scope creep. The Commission's preliminary impact-assessment work suggests it is considering broad bans on practices like "engagement-maximising recommendations" — language that could capture ordinary product design choices indistinguishable from helpful personalisation. Infinite scroll is a dark pattern when it traps a vulnerable teenager on a video feed at 2 a.m.; it is a reasonable UX choice for a search-results page or a maps application.

The Proportionality Test

Three principles should guide the DFA's drafting:

  1. Subsidiarity to existing law: legislate only where the UCPD, DSA, and GDPR demonstrably fall short, with clear rules on which instrument applies where they overlap.
  2. Harm-based rather than feature-based definitions: prohibit manipulative effects on users, not named design features that are benign in most contexts.
  3. Proportionate obligations: calibrate duties to service type and risk, so the same compliance burden does not fall identically on a designated VLOP and a small European developer.

Innovation Implications

Europe's competitiveness gap with the US and China is, as the Draghi report prepared for the Commission concluded, the EU's defining economic challenge. The DSA, GDPR, AI Act, and DMA already form one of the world's most demanding regulatory stacks for digital services. A poorly scoped DFA layered on top would add genuine compliance cost for every European app developer, not just the dominant US platforms it is rhetorically aimed at.

Done well, the DFA can close real gaps left by the UCPD and DSA. Done badly, it becomes the EU's next cautionary tale in regulatory overlap. Brussels' challenge is to legislate against manipulation without inadvertently legislating against good design.

Sources & Citations

  1. Digital Services Act (Regulation 2022/2065) — Article 25 on dark patterns
  2. European Commission — Digital Fairness Fitness Check
  3. European Commission 2022 behavioural study on dark patterns in e-commerce
  4. CNIL — €210M in fines against Google and Facebook for cookie dark patterns (2022)
  5. European Commission — list of designated VLOPs and VLOSEs