When Access Now convenes its May 2026 workshop on rights-centered reporting on Nigeria's Cybercrimes Act, the conversation will turn on a question that has dogged Africa's largest democracy for over a decade: how do you regulate online harms without giving the state a tool to silence its critics? Since the Cybercrimes (Prohibition, Prevention, etc.) Act was passed in 2015, civil society groups have documented more than two dozen cases in which journalists, human rights defenders and whistleblowers have been arrested, detained or prosecuted under its provisions — most often under the elastic offence of "cyberstalking" in Section 24.
That track record matters not only for Nigerian press freedom, but for a much broader debate now playing out across jurisdictions: what does a transparent, due-process-respecting platform-and-cybercrime framework actually look like? The European Union's Digital Services Act has set one benchmark — mandatory transparency reports, structured risk assessments and public databases of takedown orders. Nigeria's regime, by contrast, still operates largely in the dark.
A law written for fraud, used against speech
The Cybercrimes Act was drafted primarily to combat the financial fraud, identity theft and unauthorised system access that genuinely harm Nigerian consumers and businesses. Those are legitimate, even urgent, regulatory goals. The problem is Section 24, which criminalises messages sent via a computer system that are "grossly offensive," "pornographic," or known to be false and sent for the purpose of "causing annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred, ill will or needless anxiety."
That language is constitutionally vague by almost any standard. In 2022, the ECOWAS Community Court of Justice ruled in SERAP v. Federal Republic of Nigeria that Section 24 was incompatible with Nigeria's obligations under Article 9 of the African Charter on Human and Peoples' Rights and Article 19 of the ICCPR, and ordered the law amended. A 2024 amendment narrowed the provision but, critics argue, did not eliminate its chilling effect — prosecutions have continued, often beginning with arrests by police or the Department of State Services before any judicial review of the underlying speech.
What DSA-style transparency would change
The contrast with the EU Digital Services Act is instructive. Under the DSA, very large online platforms and search engines must publish twice-yearly transparency reports detailing the government orders they receive, the content they remove, the legal bases cited, and their turnaround times. The European Commission also runs a public DSA Transparency Database in which statements of reasons for content moderation decisions are searchable in near real time. Whatever one thinks of the DSA's substantive obligations — and there is plenty to critique on overreach grounds — its reporting architecture has done something genuinely useful: it has made state and platform behaviour legible to journalists, researchers and ordinary users.
Nigeria has no comparable mechanism. There is no public registry of takedown demands issued under the Cybercrimes Act, no aggregated data on Section 24 arrests, no requirement that platforms operating in Nigeria disclose how many user-data requests they receive from Nigerian authorities or how they respond. Civil society organisations like Paradigm Initiative, the Media Rights Agenda and the Committee to Protect Journalists have had to build case-by-case databases through painstaking field reporting — work that should not be left to under-resourced NGOs.
The pro-innovation case for transparency
It is sometimes assumed that transparency obligations are a brake on innovation and a burden on platforms. The opposite is closer to the truth in jurisdictions where enforcement is discretionary and opaque. Platforms operating in Nigeria today face significant legal uncertainty: they cannot easily predict which content will trigger a state demand, cannot benchmark their compliance posture against peers, and cannot defend themselves to users when removals happen quietly. Investors and operators alike benefit from clear, published rules and aggregate disclosure of how those rules are applied.
Nigeria's tech sector — one of the most dynamic on the continent, with a thriving fintech, creator-economy and mobile-content ecosystem — has a direct stake in this. A regulatory environment in which a journalist can be arrested for a tweet is also one in which a startup founder, a satirist, or a whistleblowing employee can be. The chilling effect is general, not targeted.
A pragmatic path forward
A proportionate reform agenda would not require importing the DSA wholesale. Three concrete steps would meaningfully narrow the gap:
- Statutory transparency reporting by the Nigerian Communications Commission and the Office of the National Security Adviser, publishing aggregate numbers of orders issued under the Cybercrimes Act, broken down by provision invoked and outcome.
- A judicial pre-clearance requirement for arrests under speech-related provisions of the Act, so that a magistrate reviews the underlying material before a journalist is taken into custody — a basic due-process safeguard.
- A public statement-of-reasons database for platform takedowns executed in response to Nigerian government orders, modelled on the DSA Transparency Database but lightweight enough to be operable by smaller platforms.
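To make the third proposal concrete, here is a minimal sketch of what a single public record in such a database might contain. The field names are purely illustrative (they do not reflect the DSA Transparency Database's actual schema, any Nigerian statute, or an existing system), but they show how lightweight the disclosure could be:

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical minimal schema for one public statement-of-reasons
# record, loosely inspired by the kinds of fields the EU's DSA
# Transparency Database publishes. All names are illustrative.
@dataclass
class StatementOfReasons:
    order_id: str            # reference number of the government order
    issuing_authority: str   # e.g. "NCC" or "ONSA" (hypothetical values)
    legal_basis: str         # provision invoked, e.g. "Cybercrimes Act s.24"
    platform: str            # platform that executed the takedown
    content_type: str        # e.g. "post", "account", "video"
    action_taken: str        # e.g. "removal", "geo-restriction"
    date_of_order: str       # ISO 8601 date the order was issued
    date_of_action: str      # ISO 8601 date the platform acted
    summary_of_reasons: str  # non-identifying description of the grounds

# An illustrative record, serialised as the JSON a public registry
# might expose for journalists and researchers to query.
record = StatementOfReasons(
    order_id="NG-2026-000123",
    issuing_authority="NCC",
    legal_basis="Cybercrimes Act s.24 (as amended 2024)",
    platform="ExamplePlatform",
    content_type="post",
    action_taken="removal",
    date_of_order="2026-05-02",
    date_of_action="2026-05-04",
    summary_of_reasons="Order alleged the post met the amended s.24 threshold.",
)

print(json.dumps(asdict(record), indent=2))
```

A registry built from records this simple would already support the aggregate reporting described above: counts by provision invoked, by issuing authority, and by outcome fall out of a single group-by query.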
None of these reforms would weaken Nigeria's ability to prosecute genuine cybercrime — the fraud, hacking and child-protection provisions of the Act are not in dispute. What they would do is align Nigeria's framework with the transparency floor that is rapidly becoming the global norm, and give the country's vibrant online public sphere room to breathe. The Access Now workshop is a welcome step in keeping that conversation grounded in evidence rather than ideology.