Kenya's Competition Authority (CAK) is increasingly turning its attention to a thorny corner of the digital economy: the deceptive design choices — "dark patterns" — that nudge consumers into purchases, subscriptions, data disclosures and high-cost loans they did not actually want. Under the Competition Act (Cap. 504), CAK has both consumer-protection and antitrust mandates, and its leadership has signalled that interfaces on e-commerce marketplaces and digital lending apps will sit squarely within the scope of its digital-markets work.
That is, in principle, welcome. Pre-ticked consent boxes, hidden cancellation flows, fake countdown timers and misleading APR displays are not free speech or creative UX — they are misrepresentations that distort the very consumer choice that consumer-protection law exists to protect. The harder question is how Kenya regulates these practices without flattening the legitimate experimentation that has made Nairobi one of Africa's most consequential digital hubs.
What CAK is actually looking at
CAK's consumer-protection division has, in recent enforcement cycles, taken on misleading advertising, unfair contract terms, and predatory lending features. The Digital Credit Providers regime — operated jointly with the Central Bank of Kenya since 2022 under amendments to the Central Bank of Kenya Act — has already pushed dozens of lenders out of the market for opaque pricing and aggressive debt collection. Layering a dark-patterns lens on top of this means scrutinising not just what a fintech tells a borrower, but how the interface is designed to extract the borrower's consent.
This is the right diagnosis. The most documented consumer harms in Kenya's app economy are concentrated in lending and in cross-border e-commerce: undisclosed fees, default opt-ins to data sharing with third-party collectors, "continue" buttons that quietly trigger insurance add-ons, and friction-laden cancellation flows. The real damage shows up among borrowers, not bargain-hunters.
Learn from the global enforcement record — including its mistakes
Kenya is arriving late to dark-patterns regulation, which is an advantage. The international evidence base is now thick enough to separate effective interventions from theatre.
- The US FTC's Negative Option Rule and its 2023 suit against Amazon over Prime enrolment and cancellation targeted a specific, measurable harm: subscription flows in which enrolment took one click while cancellation took several screens. Narrow, conduct-based, evidence-driven.
- The EU's Digital Services Act (Article 25) bans dark patterns on online platforms but explicitly defers to existing consumer-protection rules where they already cover the conduct — a deliberate choice to avoid double regulation.
- India's Guidelines for Prevention and Regulation of Dark Patterns (2023), by contrast, enumerate 13 specific practices but rely largely on the Consumer Protection Act's existing machinery rather than creating a new licensing regime. Enforcement, not novelty, is what moves the needle.
- The OECD's 2022 report on dark commercial patterns warned regulators against rules so broad they capture ordinary persuasive design — recommended-product carousels, urgency cues tied to real inventory, default settings that genuinely match user preferences.
The pattern across these regimes is consistent: specificity wins. Banning "manipulation" in the abstract produces compliance theatre and litigation; banning enumerated practices — pre-ticked boxes, false scarcity, obstructed cancellation, disguised ads — produces measurable consumer benefit.
Why a heavy hand would hurt Kenya more than most
Kenya's digital economy is not Silicon Valley's. It is a thinner, more fragile ecosystem of startups operating on slim margins, where a single ambiguous enforcement action can determine whether a Series A closes. The country's fintech and e-commerce sector still depends heavily on imported infrastructure — a vulnerability highlighted again this week when the National Treasury proposed extending 16% VAT to imported EVs, lithium batteries and e-bikes under the Finance Bill 2026, a reminder that policy shifts in Nairobi land hard on companies with no domestic alternative.
An over-broad dark-patterns rule would compound that fragility in three ways:
- Compliance overhead falls hardest on smaller players. A Jumia or Safaricom can hire UX-compliance counsel; a seed-stage marketplace cannot.
- Definitional uncertainty chills A/B testing — the iterative experimentation that distinguishes well-designed African products from imported templates.
- Enforcement discretion without a closed list of prohibited practices becomes a tool that can be wielded selectively against politically inconvenient platforms.
A proportionate Kenyan model
CAK should resist the temptation to write a stand-alone Dark Patterns Code. Kenya does not need one. What it needs is a published enforcement guideline under the existing Competition Act and the Consumer Protection Act (No. 46 of 2012), doing four things:
- Enumerate a closed list of prohibited practices — pre-ticked consent, obstructed cancellation, false urgency, hidden material terms, sneak-into-basket — borrowed from the FTC and Indian guidelines.
- Carve out ordinary persuasive design: real scarcity, genuine social proof, defaults that demonstrably match user welfare.
- Sequence enforcement by harm severity: lending apps first (where evidence of consumer injury is strongest and overlaps with CBK's mandate), e-commerce next, ad-supported platforms last.
- Require interface evidence — screenshots, click-path data — before opening a formal investigation, to discipline complaint quality.
Done this way, Kenya can do what it has historically done well in financial regulation: write a rule that is narrower than the EU's, sharper than India's, and credible enough that compliant firms see it as a moat rather than a tax. The aim is not to make Kenyan apps look like German ones. It is to make sure the consumer who tapped "agree" actually meant to.