Indonesia is once again at the leading edge of a regulatory experiment that will shape platform liability across the Global South. Under Minister Meutya Hafid, the Ministry of Communication and Digital Affairs — restructured and rebranded from Kominfo to Komdigi in late 2024 — has been steadily tightening enforcement of Ministerial Regulation MR5/2020 (Permenkominfo 5/2020) and the newer child-protection instrument PP No. 17/2025. Meta, X, Telegram and TikTok have all been on the receiving end, with the ministry publicly pressing them to remove online gambling, pornography and content harmful to minors within tight deadlines, and openly threatening access blocking for non-compliant Private Scope Electronic System Operators (PSE Lingkup Privat).
This is not the first time Jakarta has wielded MR5. In 2022, Komdigi's predecessor briefly blocked PayPal, Yahoo, Steam, Epic Games and Origin for missing registration deadlines under the same regulation — a move that triggered domestic outrage and forced an embarrassing rollback within days. The new round of enforcement is more targeted, but the underlying legal architecture is the same, and it deserves close scrutiny from anyone who cares about a free, open and economically dynamic internet in Southeast Asia's largest economy.
What MR5 Actually Requires
MR5/2020 obliges any digital service offered to Indonesian users — whether Indonesian-incorporated or not — to register as a PSE, designate a local point of contact, and grant the government a privileged channel to demand content removal. The numbers that matter for platforms are the takedown clocks:
- Four hours for content the ministry classifies as urgent — typically terrorism, child sexual abuse material, or content deemed to disturb public order.
- Twenty-four hours for all other unlawful content, including online gambling and pornography, which Indonesian law treats far more strictly than most jurisdictions.
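The two clocks above amount to a simple triage rule. A minimal sketch, assuming hypothetical category labels (the real classifications are set by ministerial discretion and are considerably broader):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical category labels for illustration only; MR5's actual
# "urgent" classification is decided by the ministry case by case.
URGENT_CATEGORIES = {"terrorism", "csam", "public_order"}  # 4-hour clock
STANDARD_HOURS = 24  # all other content flagged as unlawful

def removal_deadline(category: str, notified_at: datetime) -> datetime:
    """Return the compliance deadline implied by a takedown notice."""
    hours = 4 if category in URGENT_CATEGORIES else STANDARD_HOURS
    return notified_at + timedelta(hours=hours)

notice_time = datetime(2025, 3, 1, 9, 0, tzinfo=timezone.utc)
print(removal_deadline("csam", notice_time))      # four hours after notice
print(removal_deadline("gambling", notice_time))  # twenty-four hours after notice
```

The point the sketch makes concrete is that the deadline depends only on the category label, not on how ambiguous the underlying content is; that uniformity is precisely what the tiering reform below targets.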
PP No. 17/2025, the implementing regulation on child protection in the digital ecosystem, layers additional duties on top: age-appropriate design, risk assessments, and stricter handling of content reaching minors. On paper, much of this aligns with the global drift toward duty-of-care regimes — the EU's Digital Services Act, the UK's Online Safety Act, Australia's online safety codes. In practice, MR5's combination of vague content categories, ministerial discretion and access-blocking as the default sanction is closer to a compliance hostage situation than a calibrated liability regime.
Why Short-Fuse Takedowns Are a Bad Default
There is a legitimate state interest in removing CSAM and incitement quickly, and platforms have built genuinely impressive abuse-handling pipelines for those categories. But the universe of "unlawful content" Komdigi can flag under MR5 is far broader, yet the same four-hour and 24-hour clocks apply regardless of how contestable the classification is. Several problems follow.
First, speed is the enemy of accuracy. The European Commission's own evaluation work on the Terrorist Content Online Regulation, the OECD's Transparency Reporting on Terrorist and Violent Extremist Content Online (2022), and successive Santa Clara Principles updates have all converged on the same finding: ultra-short removal windows produce systematic over-removal, with lawful political speech, journalism and counter-speech disproportionately swept up. Indonesia, a vibrant democracy with a contentious public sphere, has more to lose from that dynamic than most.
Second, ministerial blocking-as-sanction collapses the distinction between a single piece of unlawful content and an entire service. When the only credible threat is to cut off access to TikTok or X for 100+ million Indonesian users, every notice carries the implicit weight of a market-exit order. That asymmetry inevitably distorts platform decisions toward removal, and it gives the executive branch a tool that — in less restrained hands — could be turned against political dissent or journalism.
A Better Path: Proportionality, Process, Transparency
None of this means Indonesia should abandon platform regulation. The case for clearer rules around child safety, fraud and gambling is strong, and platforms benefit from predictable obligations. But Komdigi could meaningfully de-risk MR5 and PP 17/2025 with a few proportionality-minded reforms:
- Tier the deadlines: keep four-hour windows for narrowly defined CSAM and credible threats to life, but extend timelines for ambiguous categories like "public order" content to allow human review.
- Build in independent appeal: a fast administrative-court or ombudsman channel for contested takedowns, not just bilateral negotiation between the ministry and a platform's local rep.
- Reserve full-service blocking for genuine systemic non-compliance, not for individual content disputes — and require a published reasoned order before any block.
- Publish enforcement data: how many notices, against which categories, with what compliance rates. The DSA's transparency database is an imperfect but workable model.
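The last reform is the easiest to prototype. A minimal sketch of the kind of aggregate a published enforcement dataset could expose, using an invented record shape (MR5 notices have no public schema, so every field below is an assumption):

```python
from dataclasses import dataclass

# Illustrative record shape only; the ministry publishes no such schema.
@dataclass
class TakedownNotice:
    category: str   # e.g. "gambling", "pornography", "public_order"
    complied: bool  # was the content removed within the deadline?

def transparency_report(notices: list[TakedownNotice]) -> dict[str, dict[str, float]]:
    """Aggregate notices into per-category counts and compliance rates."""
    report: dict[str, dict[str, float]] = {}
    for cat in {n.category for n in notices}:
        subset = [n for n in notices if n.category == cat]
        report[cat] = {
            "notices": len(subset),
            "compliance_rate": sum(n.complied for n in subset) / len(subset),
        }
    return report

sample = [
    TakedownNotice("gambling", True),
    TakedownNotice("gambling", False),
    TakedownNotice("public_order", True),
]
print(transparency_report(sample))
```

Even this toy aggregation would answer the questions the DSA transparency database answers today: which categories dominate enforcement, and how often platforms actually comply.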
Why This Matters Beyond Indonesia
Indonesia is the world's fourth-most-populous country, ASEAN's largest digital economy, and a recurring template for other Southeast Asian regulators. Vietnam, Thailand, Malaysia and the Philippines all watch Jakarta's regulatory moves closely. If MR5-style short-fuse takedowns plus access blocking become the regional default, the cumulative compliance cost — and the cumulative speech cost — will be significant. A proportionate Indonesian model, by contrast, could anchor a more defensible Asian intermediary-liability paradigm.
Komdigi is right to push platforms harder on child safety and clearly criminal content. But the tools it is using were built for a different era and a different problem. Updating MR5 with proportionality, due process and transparency baked in would not weaken Indonesia's enforcement hand — it would make that hand far more credible, both to the platforms it regulates and to the 200+ million Indonesians whose digital lives now run through them.