
Section 230 at 30: Why Community Moderation Still Needs Federal Cover

EFF's new analysis of Reddit's volunteer moderators is a timely reminder that gutting Section 230 would break the user-driven web Congress claims to want.

[Infographic: "Section 230 at 30: The Scale of User Moderation" (People of Internet Research). Panels: 30 years since Section 230 was enacted as part of the Communications Decency Act; roughly 100,000 active Reddit communities, each governed by its own volunteer moderators; the 2024 NetChoice ruling in which the Supreme Court confirmed moderation is protected editorial activity; GAO-21-385's finding that FOSTA has been used in few federal prosecutions.]

Key Takeaways

Thirty years after Congress passed 47 U.S.C. § 230 as part of the Communications Decency Act of 1996, the law is once again in the crosshairs. A bipartisan crop of bills — from sunset proposals that would let Section 230 expire entirely, to carve-outs targeting algorithmic ranking, AI-generated content, and alleged 'censorship' of lawful speech — has put the foundational liability shield for user-generated content back on the legislative agenda. Into this debate the Electronic Frontier Foundation has launched a refreshingly concrete series: The Internet Still Works. Its April 2026 installment, Reddit Empowers Community Moderation, makes a case that policymakers keep missing: Section 230 is not a corporate giveaway to Big Tech — it is the legal scaffolding that lets ordinary users govern their own online communities.

What EFF's Reddit case study actually shows

EFF's analysis focuses on how Reddit's roughly 100,000 subreddits are moderated almost entirely by unpaid volunteers who write their own rules, remove off-topic posts, and ban bad actors. That model — millions of independent moderation decisions made by people who care about a specific community — is only possible because Section 230 protects users as well as platforms: subsection (c)(1) says neither Reddit nor its volunteer moderators may be treated as the publisher of another user's post, and (c)(2) shields their good-faith decisions to remove content. Without that shield, every comment left up becomes a potential defamation suit against Reddit, and every takedown a potential claim from the poster whose words were removed. The predictable result, as EFF notes, would be the death of community-led moderation in favor of cautious, centralized, lawyer-driven takedowns.

The same logic applies far beyond Reddit. EFF's broader series highlights Wikipedia's volunteer editors, Yelp's reviewer community, and SmugMug's photographer-curated galleries — none of which look like the algorithmic feeds that dominate Section 230 hearings on Capitol Hill, but all of which depend on the same liability rule.

The reform proposals on the table

Several pending US proposals would meaningfully change how Section 230 operates:

  - Sunset bills that would let the statute lapse entirely on a fixed date unless Congress enacts a replacement.
  - Carve-outs that strip immunity for content a platform ranks or recommends algorithmically.
  - Proposals to place AI-generated material outside the definition of third-party content.
  - Bills that would condition immunity on not 'censoring' lawful user speech.

The common thread is that these reforms are designed for a mental model of the internet built around three or four large feeds. They map poorly onto the long tail of forums, wikis, hobbyist sites, fediverse instances, and small-business review platforms where most user-generated content actually lives.

Why proportionate reform is harder than it looks

People of Internet has consistently supported targeted, evidence-based intervention on specific harms — non-consensual intimate imagery, CSAM, fraud — and Section 230 already does not shield platforms from federal criminal liability or from intellectual property claims. The existing carve-out created by FOSTA-SESTA (2018), which removed immunity for content that facilitates sex trafficking, offers a cautionary tale: a Government Accountability Office review (GAO-21-385) found the law had been used in very few federal prosecutions, even as it pushed consensual sex workers off mainstream platforms onto less safe alternatives. The lesson is not that targeted reform is impossible, but that broad rewrites tend to produce collateral damage that falls hardest on smaller platforms and marginalized users.

The First Amendment is doing real work too

It is worth remembering that Section 230 sits on top of, not in place of, the First Amendment. In Moody v. NetChoice (2024), the Supreme Court made clear that platforms' content-moderation choices are protected editorial activity, even as it remanded the cases for a fuller facial-challenge analysis. That means even if Section 230 were repealed tomorrow, many of the lawsuits its critics envision — particularly 'must-carry' claims against private platforms — would still fail on constitutional grounds. But the procedural cost of litigating each case to that conclusion would be ruinous for anyone smaller than Meta or Google. Section 230's value is precisely that it lets cases get dismissed at the pleading stage, before discovery bankrupts a hobbyist forum or a regional news site's comments section.

What good Section 230 policy looks like

A proportionate path forward would:

  - Target specific, well-documented harms rather than rewriting the liability shield wholesale.
  - Study the real-world effects of existing carve-outs like FOSTA-SESTA before stacking new ones on top.
  - Preserve early dismissal at the pleading stage so small platforms and volunteer moderators are not litigated into silence.
  - Account for the long tail of forums, wikis, fediverse instances, and review sites, not just a handful of large algorithmic feeds.

EFF's Reddit piece is valuable because it grounds an abstract legal debate in the lived experience of a moderator deciding whether to remove a rule-breaking post. That decision happens millions of times a day on the open internet. Section 230 is what makes saying 'yes, remove it' and 'no, leave it up' both legally survivable. As Congress weighs its next move, lawmakers should ask not whether the law is perfect, but whether the alternative — a web where every moderation call is a potential lawsuit — is one they actually want to live in.

Sources & Citations

  1. EFF: The Internet Still Works — Reddit Empowers Community Moderation (April 2026)
  2. 47 U.S.C. § 230 — Cornell LII
  3. Moody v. NetChoice, LLC — Supreme Court opinion (2024)
  4. GAO-21-385: Sex Trafficking — Online Platforms and Federal Prosecutions