Nigeria sits at a familiar crossroads. Africa's largest economy is also one of its most vibrant digital markets — home to a thriving creator class, a fintech sector that has minted multiple unicorns, and a political conversation that has migrated almost entirely onto social platforms. Yet the legal scaffolding that governs online speech in Nigeria remains structurally tilted toward enforcement, with the Cybercrimes (Prohibition, Prevention, etc.) Act, 2015 and a thicket of Nigerian Communications Commission (NCC) directives continuing to chill the very speech that gives the country's digital economy its energy.
A recent Access Now convening, Advancing Rights-Centered Reporting on Nigeria's Cybercrimes Act, is the latest reminder that civil society, lawyers, and journalists are still wrestling with a statute that — even after a 2024 amendment — leaves too much discretion in the hands of police and prosecutors. For a country that wants to be taken seriously as a digital hub, that is a problem worth fixing properly.
Section 24: the provision that won't die
The most contested provision in the Act is Section 24, which criminalises sending messages via a computer system that are, among other things, “grossly offensive” or that the sender knows to be false, “for the purpose of causing annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred, ill will or needless anxiety.” In practice, that language has been used to arrest bloggers, journalists, and ordinary social-media users for posts criticising public officials.
In March 2022, the ECOWAS Community Court of Justice ruled in Incorporated Trustees of Laws and Rights Awareness Initiative v. Federal Republic of Nigeria that Section 24 was inconsistent with Article 9 of the African Charter on Human and Peoples' Rights and ordered Nigeria to amend it. Abuja finally responded with the Cybercrimes (Amendment) Act, 2024, which narrowed Section 24's scope, focusing it on messages that are “pornographic” or that the sender knows to be false and that cause a “breakdown of law and order, posing a threat to life, or causing such persons to lose his/her properties.”
That is real progress. But human-rights groups, including Media Rights Agenda and Paradigm Initiative, have documented continued arrests under the revised provision, and the new language still hinges on subjective tests like “breakdown of law and order” that police can stretch to fit almost any critical post. A law that depends so heavily on prosecutorial restraint is not, in practice, a narrowly tailored law.
The NCC's expanding remit
Layered on top of the Cybercrimes Act is the NCC's increasingly assertive posture toward online platforms and over-the-top (OTT) services. The Commission's lawful interception regulations, SIM-NIN linkage enforcement, and the 2022 Code of Practice for Interactive Computer Service Platforms and Internet Intermediaries (jointly issued with NITDA) impose takedown timelines, local-representative requirements, and traceability obligations on platforms operating in Nigeria.
Nigeria's 2021 suspension of Twitter — which lasted roughly seven months and was lifted in January 2022 after the platform agreed to register locally and appoint a country representative — remains the cautionary tale. Independent estimates from NetBlocks and Top10VPN suggested the shutdown cost the Nigerian economy hundreds of millions of dollars, with small businesses and freelancers bearing the brunt. The lesson should have been that platform regulation works best when it is predictable, rule-based, and proportionate. Instead, the threat of executive action against platforms has become a recurring feature of Nigerian internet policy.
What proportionate regulation would look like
None of this is an argument for regulatory abstention. Nigeria faces genuine harms online — financial fraud, non-consensual intimate imagery, ethno-religious incitement, and election-period disinformation are all real. A modern cybercrime framework is necessary. But the design choices matter:
- Specificity over vagueness. Speech offences should be tied to concrete, demonstrable harms — incitement to imminent violence, fraud, or non-consensual disclosure — not to elastic terms like “annoyance” or “ill will.”
- Judicial oversight of takedowns. Content removal orders should require a court, not an agency directive, except in narrow emergency categories such as child sexual abuse material (CSAM) and incitement to imminent violence.
- Proportional penalties. Custodial sentences for pure speech offences invert the proportionality principle the African Charter requires; fines and civil remedies should be the default.
- Transparent intermediary rules. The 2022 Code of Practice should be put on a statutory footing through the National Assembly, with clear due-process rights for platforms and users, rather than enforced through informal pressure.
The growth case for getting this right
Nigeria's digital economy contributed roughly 18% of GDP in recent quarters, according to figures published by the National Bureau of Statistics, and the country's startup ecosystem has attracted billions of dollars in venture capital over the last five years. That growth depends on a regulatory environment that creators, investors, and platforms can plan around. Every high-profile arrest under Section 24, and every threat of an NCC-led shutdown, is a tax on that confidence.
The 2024 amendment showed that Nigeria's lawmakers can move when international courts and domestic civil society apply sustained pressure. The next step is to finish the job: re-draft Section 24 around objectively verifiable harms, codify intermediary obligations through the National Assembly rather than agency fiat, and commit publicly that platform shutdowns will not recur. That is the path to an internet policy worthy of Africa's largest digital market, and one that treats Nigerians as citizens to be protected, not suspects to be policed.