With the signing of PL 2628/2022 — the so-called ECA Digital, or Digital Statute of the Child and Adolescent — Brazil has placed the second cornerstone of a sweeping reconstruction of its intermediary-liability regime. Coming on the heels of the Supreme Federal Court's (STF) reinterpretation of Article 19 of the Marco Civil da Internet, the law marks Brazil's decisive step away from the pure judicial-takedown model that has defined the country's internet governance since 2014, and toward a duty-of-care architecture closer to the United Kingdom's Online Safety Act and the European Union's Digital Services Act.
For Meta, Google/YouTube, TikTok, Kwai, Roblox, X and a long tail of smaller platforms, the practical consequence is the same: Brazil is no longer a notice-and-takedown jurisdiction for content involving minors. It is now a systems-and-processes jurisdiction.
What the law actually does
The ECA Digital introduces statutory obligations that operate ex ante rather than ex post. The headline duties include:
- Age assurance at the point of access for products likely to be used by children, with proportionate verification depending on the risk profile of the service.
- Design-by-default protections: the highest privacy and safety settings must be the default for accounts known or likely to belong to minors.
- Parental control tools made available natively, not as third-party add-ons.
- Transparency reporting on content moderation, recommender behaviour, and complaints affecting minors.
- Direct liability for systemic failure to act against content that is harmful to children — distinct from item-by-item judicial orders.
This duty-of-care logic is structurally similar to the obligations imposed on Very Large Online Platforms under Articles 28 and 35 of the EU DSA, and to the children's safety duties codified in Sections 11–13 of the UK Online Safety Act 2023.
Why it matters: the Article 19 context
Brazil's Article 19 has long served a function analogous to the United States' Section 230: a strong intermediary shield under which platforms faced civil liability for user-generated content only if they failed to comply with a specific judicial takedown order. The STF's recent re-reading of Article 19 carved out broad categories in which platforms can be held liable without a prior judicial decision, including content involving serious crimes against children. The ECA Digital codifies and operationalises that shift: where the STF created a principle, Congress has now created a process.
For policymakers and platforms alike, this is the more important story. The duty-of-care model is no longer a European peculiarity — it is becoming the global default for large-scale online services, with Brazil joining the UK, the EU, Australia, and (in narrower form) several US states.
The pro-innovation case for getting this right
People of Internet has consistently argued that protecting children online is a legitimate state interest, and that platforms have real obligations to address the unique risks faced by younger users. The question is never whether to act, but how.
Three principles should guide the regulatory implementation of ECA Digital:
1. Proportionality by service size and risk
The DSA's tiered approach — distinguishing VLOPs from smaller services — exists for good reason. Imposing identical age-assurance and transparency burdens on a Brazilian edtech start-up and on TikTok would entrench incumbents and choke domestic innovation. Brazil's implementing regulations should mirror the DSA's risk-based tiering.
2. Age assurance without mass identity collection
The UK Information Commissioner's Office and Ofcom have repeatedly warned that age verification done badly creates a new privacy harm in the name of solving an old safety harm. Methods such as on-device age estimation, tokenised attestations, and privacy-preserving age signals should be explicitly permitted. A regime that effectively requires every Brazilian to upload an ID to every platform would be a regulatory failure dressed as a child-protection win.
3. Clarity on "harmful content"
The greatest legal risk in any duty-of-care regime is over-removal — the rational platform response to vague liability is to delete first and ask questions later. Brazil's regulator must define harmful content with sufficient precision that platforms can build systems against it, and must protect lawful speech, including political and educational content involving young people.
What to watch next
The text of the law is only the opening move. The substantive obligations of the ECA Digital will be shaped almost entirely by the secondary regulation produced by the executive branch and by Brazil's data protection authority (ANPD).
Three open questions will determine whether ECA Digital becomes a model for the Global South or a cautionary tale:
- Will the implementing decree adopt risk-based tiering or apply flat obligations across all platforms?
- Will age-assurance rules permit privacy-preserving methods, or default to government-ID checks?
- How will the ECA Digital interact with the STF's Article 19 framework — will the two regimes converge into a coherent duty-of-care system, or generate parallel and conflicting liability paths?
Brazil has a genuine opportunity here. The country has the largest internet user base in Latin America, world-class civil society expertise from the Marco Civil era, and a regulator (ANPD) with credible technical capacity. If implementing rules are proportionate, technology-neutral, and clear, the ECA Digital could become a template for child-safety regulation that works for both users and innovators. If they are not, it risks becoming yet another well-intentioned statute that consolidates the biggest platforms while suppressing the next generation of Brazilian services.
The law has passed. The harder work — and the more consequential one — is about to begin.