The European Commission's push to advance a Digital Fairness Act (DFA) marks the EU's third major intervention in less than five years against so-called dark patterns — manipulative interface choices that nudge users toward decisions they would not freely make. Building on Article 25 of the Digital Services Act (DSA) and the Commission's 2024 Digital Fairness Fitness Check, the DFA promises a horizontal regime covering addictive design, personalised manipulation, and influencer marketing. The intent is easy to sympathise with; the execution is where things get harder.
From a pro-innovation perspective, the question is not whether dark patterns are a problem — they plainly are — but whether yet another instrument is the right tool. The EU already prohibits manipulative interfaces under at least three overlapping regimes. Adding a fourth without carefully delineating its scope risks legal uncertainty, compliance duplication, and a chilling effect on legitimate design experimentation.
What the DFA Is Trying to Fix
The Commission's Digital Fairness Fitness Check, published in 2024, concluded that the existing consumer-protection acquis — the Unfair Commercial Practices Directive (UCPD), the Consumer Rights Directive, and the Unfair Contract Terms Directive — leaves "problematic practices" inadequately covered. Specifically, the Commission flagged:
- Personalised dark patterns targeting individual psychological vulnerabilities;
- Addictive design features such as infinite scroll, autoplay, and engagement-maximising notifications;
- Manipulative influencer marketing and undisclosed commercial communications;
- Problematic practices in virtual environments and in-game purchases targeting minors.
The DFA, expected as a formal proposal in 2026, would establish substantive rules in these areas across all online services, not just the largest platforms already covered by DSA Article 25.
The Overlap Problem
The DSA's Article 25 already prohibits providers of online platforms from designing interfaces that "deceive or manipulate" users or otherwise materially distort their decision-making. The UCPD's prohibitions on "misleading actions" and "aggressive commercial practices", in force since 2005, give national consumer authorities a general-purpose tool against manipulative design. Data-protection law adds a further layer: in 2022 France's CNIL fined Google and Facebook a combined €210 million over cookie-consent dark patterns, acting under France's transposition of the ePrivacy Directive, and the GDPR requires that consent be "freely given", which the European Data Protection Board has interpreted to bar manipulative consent flows.
The Commission's own 2022 behavioural study, which screened 75 e-commerce websites and apps, found that approximately 97% deployed at least one dark pattern. The data justifies action — but it also raises an awkward question: if dark patterns are nearly universal despite several overlapping layers of existing regulation, the answer may be better enforcement of current rules, not another instrument.
Where the DFA Could Add Value
The DFA is most defensible where it tackles genuinely novel harms that current law strains to reach. Personalised manipulation — interfaces dynamically tuned to an individual user's inferred vulnerabilities — sits awkwardly under the UCPD's "average consumer" benchmark. Addictive design features aimed at minors raise concerns that the GDPR's rules on children's consent (Article 8) and the DSA's Article 28 only partially address. A targeted, narrowly drafted instrument focused on these gaps could meaningfully improve the regulatory landscape.
The risk, however, is scope creep. The Commission's preliminary impact-assessment work suggests it is considering broad bans on practices like "engagement-maximising recommendations" — language that could capture ordinary product design choices indistinguishable from helpful personalisation. Infinite scroll is a dark pattern when it traps a vulnerable teenager on a video feed at 2 a.m.; it is a reasonable UX choice for a search-results page or a maps application.
The Proportionality Test
Three principles should guide the DFA's drafting:
- Behaviour, not features. Outlaw demonstrable manipulation and psychological exploitation. Do not ban entire design primitives — autoplay, notifications, scrolling — that have legitimate, user-welfare-enhancing uses.
- Risk-proportionate obligations. The DSA's tiered approach — heavier duties for Very Large Online Platforms, lighter ones for everyone else — works. A blanket horizontal regime would crush smaller European startups that lack DSA-scale compliance teams. Only around two dozen VLOPs and VLOSEs are currently designated; the regulatory perimeter for SMEs should remain meaningfully lighter.
- Enforce what exists first. The Commission's own 2024 fitness-check report notes uneven enforcement of UCPD and consumer rules across Member States. Harmonised enforcement guidelines and better Consumer Protection Cooperation network coordination would address much of the problem the DFA targets — without new legislation.
Innovation Implications
Europe's competitiveness gap with the US and China is, as the Draghi report prepared at the Commission's request makes plain, the EU's defining economic challenge. The DSA, GDPR, AI Act, and DMA already form one of the world's most demanding regulatory stacks for digital services. A poorly scoped DFA layered on top would add genuine compliance cost for every European app developer — not just the dominant US platforms at which it is rhetorically aimed.
Done well, the DFA can close real gaps left by the UCPD and DSA. Done badly, it becomes the EU's next cautionary tale in regulatory overlap. Brussels' challenge is to legislate against manipulation without inadvertently legislating against good design.