In January 2025, the US Supreme Court unanimously upheld the Protecting Americans from Foreign Adversary Controlled Applications Act (PAFACA) in TikTok Inc. v. Garland, ruling that Congress's national-security justification for forcing ByteDance to divest TikTok survived First Amendment scrutiny. Sixteen months later, the statute remains on the books, the divestiture has not happened, and the ban has not been enforced. President Trump has issued a sequence of executive orders, beginning in January 2025 and continuing into 2026, that push the deadline back and instruct the Department of Justice not to take action against app stores or hosting providers that continue to carry TikTok.
Whatever one thinks of PAFACA on the merits — and we have long argued it is a blunt instrument — the current state of play is the worst of all possible worlds. A duly enacted federal statute, validated by a 9-0 Supreme Court, is being effectively suspended by executive non-enforcement. That is a precedent the US tech policy community should be deeply uncomfortable with, regardless of which app is at stake.
What PAFACA Actually Requires
PAFACA, signed into law in April 2024 as part of a broader foreign-aid package, gave ByteDance roughly 270 days to divest TikTok's US operations to a non-"foreign adversary controlled" buyer, with a one-time 90-day extension available at the President's discretion if a qualified divestiture was demonstrably in progress. Failure to divest would trigger civil penalties of up to $5,000 per US user against any "entity" — read: Apple, Google, Oracle, Akamai — that continued to distribute or host the app. The statutory deadline was January 19, 2025.
The Supreme Court's January 17 per curiam ruling found that Congress's data-security rationale (preventing a designated foreign adversary from collecting sensitive data on some 170 million Americans) was sufficient on its own; the Court expressly declined to rest its holding on the government's separate content-manipulation concern. The Court did not find that TikTok had done anything wrong — it found that the structural risk Congress identified was real enough to survive intermediate scrutiny.
The Non-Enforcement Saga
On January 20, 2025, hours after taking office, President Trump signed an executive order directing the Attorney General not to enforce PAFACA for 75 days while the administration explored a deal. Through 2025 and into 2026, that 75-day window has been extended several times via further executive orders, each accompanied by claims that a transaction is close. No closed divestiture has been announced; reporting indicates that the structural questions — who owns the recommendation algorithm, what "qualified divestiture" means, whether ByteDance retains any operational role — remain unresolved.
Meanwhile, Apple, Google, and Oracle have continued to distribute and host the app, relying on letters from the Justice Department promising non-enforcement. Those letters are not binding on future administrations and do not waive the $5,000-per-user statutory penalty, which could theoretically be invoked retroactively. The exposure is enormous and the legal basis for the comfort letters is, at best, contested.
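To make the scale of that exposure concrete, here is a back-of-envelope calculation using the statute's $5,000 per-user penalty ceiling and the roughly 170 million US users cited in the litigation. This is purely illustrative: actual liability would turn on how "users who accessed the app" is counted and on enforcement decisions.

```python
# Illustrative maximum statutory exposure under PAFACA for a single
# covered entity. Figures: $5,000 per-user civil penalty ceiling (from
# the statute); ~170M US users (the figure cited in TikTok v. Garland).
# Real liability would depend on user counting and prosecutorial choices.
PENALTY_PER_USER = 5_000         # dollars, statutory ceiling
US_USERS = 170_000_000           # approximate US user base

max_exposure = PENALTY_PER_USER * US_USERS
print(f"${max_exposure / 1e9:,.0f} billion")  # → $850 billion
```

Even discounted heavily for how a court might actually count users, the theoretical ceiling dwarfs the market capitalization of every company relying on the comfort letters.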
Why This Matters Beyond TikTok
We have three concerns, none of which depend on a view about TikTok itself.
First, selective non-enforcement is a rule-of-law problem. Article II's Take Care Clause obliges the executive to faithfully execute the laws. There is room for prosecutorial discretion at the margins, but indefinite, categorical non-enforcement of a statute that survived the highest court in the land tests that doctrine to its breaking point. If this is acceptable for PAFACA, it is acceptable for the next administration to decline to enforce any law it dislikes.
Second, the uncertainty is itself a tax on innovation. US-listed platforms, ad-tech firms, and the entire creator economy built atop TikTok are operating under a Damoclean sword. Investment decisions, ad commitments, and creator contracts are being made against a backdrop in which the legal status of the underlying app could flip overnight with a change of administration or a single court ruling. That uncertainty is a deadweight loss; clarity — even unwelcome clarity — would be better.
Third, app-store and hosting-provider mandates are a dangerous regulatory pattern. PAFACA outsourced enforcement to intermediaries by threatening Apple, Google, Oracle, and Akamai with ruinous penalties. That model is now being copied elsewhere — in age-verification statutes, in proposed AI-model registries, in foreign-adversary software bills. Each time the federal government uses chokepoints to regulate speech-carrying platforms, it lowers the cost of doing so the next time.
A Proportionate Path Forward
The pro-innovation answer is not "do nothing about foreign-adversary data access." It is to legislate the underlying concern directly, rather than through an app-specific ban. Congress could:
- Pass a comprehensive federal privacy law (the long-stalled APRA framework is a starting point) that restricts cross-border transfers of sensitive personal data regardless of the platform.
- Establish technology-neutral data-localization and audit requirements for any platform above a user threshold, applied to all foreign-controlled entities equally.
- Require transparency about recommendation-system design and foreign-government content requests, drawing on the EU's Digital Services Act risk-assessment template — but with narrower scope.
Any of these would address the legitimate national-security concern that animated PAFACA without singling out one company, without conscripting app stores as enforcement arms, and without leaving the US executive branch in the position of openly ignoring its own laws. Sixteen months of executive deferral is not a policy. It is a vacuum — and the longer it persists, the more damage it does to the credibility of American tech regulation.