Banning TikTok

Two days from now, TikTok may go dark in the US. Nine months ago, in April 2024, Congress passed the Protecting Americans from Foreign Adversary Controlled Applications Act, which bans TikTok unless its Chinese owner, ByteDance, divests itself of the platform by January 19, 2025.

Last Friday, January 10, the US Supreme Court heard three hours of arguments in consolidated challenges filed by TikTok and a group of TikTok users: TikTok, Inc. v. Garland and Firebaugh v. Garland. Too late?

As a precaution, Kinling Lo and Viola Zhou report at Rest of World, at least some of TikTok’s 170 million American users are building community arks elsewhere – on the platform Xiaohongshu (“RedNote”), for one. This is not the smartest choice; it, too, is Chinese and could become subject to the same national security concerns, as could the other Chinese apps that Makena Kelly reports at Wired are scooping up new US users. Ashley Belanger reports at Ars Technica that the Chinese are rumored to be thinking of segregating these American newcomers.

“The Internet interprets censorship as damage, and routes around it,” EFF founder and activist John Gilmore told Time Magazine in 1993. He meant Usenet, which could get messages past individual server bans, but it’s really more a statement about Internet *users*, who will rebel against bans. That for sure has not changed despite the more concentrated control of the app ecosystem. People will access each other by any means necessary. Even *voice* calls.

PAFACA bans apps from four “foreign adversaries to the United States” – China, Russia, North Korea, and Iran. That being the case, Xiaohongshu/RedNote is not a safe haven. The law just hasn’t caught up with this hitherto little-known platform yet.

The law’s passage in April 2024 was followed in early May by TikTok’s legal challenge. Because of the imminent sell-by deadline, the case was fast-tracked, landing in the US Court of Appeals for the District of Columbia Circuit in early December. That court upheld the law, rejecting both TikTok’s constitutional challenge and its request for an injunction staying enforcement until the constitutional claims could be fully reviewed by the Supreme Court. TikTok appealed, and so last week here we were. This case is separate from Free Speech Coalition v. Paxton, which SCOTUS heard *this* week and which challenges Texas’s 2023 age verification law (H.B. 1181); that one could have even further-reaching Internet effects.

Here it gets silly. Incoming president Donald Trump, who originally initiated the ban but was blocked by the courts on constitutional grounds, filed an amicus brief arguing that any ban should be delayed until after he’s taken office on Monday because he can negotiate a deal. NBC News reports that the outgoing Biden administration is *also* trying to stop the ban and, per Sky News, won’t enforce it if it takes effect.

Previously, both guys wanted a ban, but I guess now they’ve noticed that, as Mike Masnick says at Techdirt, it makes them look out of touch to nearly half the US population. In other words, they moved from “Oh my God! The kids are using *TikTok*!” to “Oh, my God! The kids are *using* TikTok!”

The court transcript shows that TikTok’s lawyers made three main arguments. One: TikTok is now incorporated in the US, and the law is “a burden on TikTok’s speech”. Two: PAFACA is content-based, in that it selects types of content to which it applies (user-generated) and ignores others (reviews). Three: the US government has “no valid interest in preventing foreign propaganda”. Therefore, the government could find less restrictive alternatives, such as banning the company from sharing sensitive data. In answer to questions, TikTok’s lawyers claimed that the US’s history of banning foreign ownership of broadcast media is not relevant because it was due to bandwidth scarcity. The government’s lawyers countered with national security: the Chinese government could manipulate TikTok’s content and use the data it collects for espionage.

Again: the Chinese can *buy* piles of US data just like anyone else. TikTok does what Silicon Valley does. Pass data privacy laws!

Experts try to read the court. Amy Howe at SCOTUSblog says the justices seemed divided, but overall likely to issue a swift decision. At This Week in Google and Techdirt, Cathy Gellis says the proceedings have left her “cautiously optimistic” that the court will not undermine the First Amendment, a feeling seemingly echoed by some of the panel of experts who liveblogged the proceedings.

The US government appears to have tied itself up in knots: SCOTUS may uphold a congressionally legislated ban that neither the outgoing nor the incoming administration now wants, that half the population resents, and that won’t solve the US’s pervasive privacy problems. Lost on most Americans is the irony that the rest of the world has complained for years that under the PATRIOT Act foreign subsidiaries of US companies are required to send data to US intelligence. This is why Max Schrems keeps winning cases under GDPR.

So, to wrap up: the ban doesn’t solve the problem it purports to solve, and it’s not the least restrictive possibility. On the other hand, national security? The only winner may be, as Jason Koebler writes at 404Media, Mark Zuckerberg.

Illustrations: Logo of Douyin, ByteDance’s Chinese version of TikTok.

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. She is a contributing editor for the Plutopia News Network podcast. Follow on Mastodon or Bluesky.

Sectioned

Social media seems to be having a late-1990s moment, raising flashbacks to the origins of platform liability and the passage of Section 230 of the Communications Decency Act (1996). It’s worth making clear at the outset: most of the people talking about S230 seem to have little understanding of what it is and does. It allows sites to moderate content without becoming liable for it. It is what enables all those trust and safety teams to implement sites’ restrictions on acceptable use. When someone wants to take an axe to it because there is vile content circulating, they have not understood this.

So, in one case this week a US appeals court is allowing a lawsuit to proceed that seeks to hold TikTok liable for users’ postings of the “blackout challenge”, the idea being to get an adrenaline rush by reviving from near-asphyxiation. Bloomberg reports that at least 20 children have died trying to accomplish this, at least 15 of them age 12 or younger (TikTok, like all social media, is supposed to be off-limits to under-13s). The people suing are the parents of one of those 20, a ten-year-old girl who died attempting the challenge.

The other case is that of Pavel Durov, CEO of the messaging service Telegram, who has been arrested in France as part of a criminal investigation. He has been formally charged with complicity in managing an online platform “in order to enable an illegal transaction, in an organized group”, and with refusal to cooperate with law enforcement authorities; he has been ordered not to leave France, with bail set at €5 million (is that enough to prevent the flight of a billionaire with four passports?).

While there have been many platform liability cases, there are relatively few examples of platform owners and operators being charged. The first was in 1997, back when “online” still had a hyphen; the German general manager of CompuServe, Felix Somm, was arrested in Bavaria on charges of “trafficking in pornography”. That is, German users of Columbus, Ohio-based CompuServe could access pornography and illegal material on the Internet through the service’s gateway. In 1998, Somm was convicted and given a two-year suspended sentence. In 1999 his conviction was overturned on appeal, partly, the judge wrote, because there was no technology at the time that would have enabled CompuServe to block the material.

The only other example I’m aware of came just this week, when an Arizona judge sentenced Michael Lacey, co-founder of the classified ads site Backpage.com, to five years in prison and fined him $3 million for money laundering. He still faces further charges for prostitution facilitation and money laundering; allegedly he profited from a scheme to promote prostitution on his site. Two other previously convicted Backpage executives were also sentenced this week to ten years in prison.

In Durov’s case, the key point appears to be his refusal to follow industry practice with respect to reporting child sexual abuse material or to cooperate with properly executed legal requests for information. You don’t have to be a criminal to want the social medium of your choice to protect your privacy from unwarranted government snooping – but equally, you don’t have to be innocent to be concerned if billionaire CEOs of large technology companies consider themselves above the law. (See also Elon Musk, whose X platform may be tossed out of Brazil right now.)

Some reports on the Durov case have focused on encryption, but the bigger issue appears to be failure to register its use of encryption, as Signal has done. More important, although Telegram is often talked about as encrypted, it’s really more like other social media: groups are publicly visible, and only direct one-on-one messages are encrypted. Even then, they’re only encrypted if users opt in. Given that users notoriously tend to stick with default settings, the percentage of users who turn that encryption on is probably tiny. So it’s not clear yet whether France is seeking to hold Durov responsible for the user-generated content on his platform (which S230 would protect in the US), or accusing him of being part of criminal activity relating to his platform (which it wouldn’t).

Returning to the blackout challenge case: in allowing the lawsuit to go ahead, the appeals court judgment says that S230 has “evolved away from its original intent”, and argues that because TikTok’s algorithm served up the challenge on the child’s “For You” page, the service can be held responsible. At Techdirt, Mike Masnick blasts this reasoning, saying that it overturns numerous other court rulings upholding S230, and uses the same reasoning as the 1995 decision in Stratton Oakmont v. Prodigy. That was the case that led directly to the passage of S230, introduced by then-Congressmen Christopher Cox (R-CA) and Ron Wyden (D-OR), who are still alive to answer questions about their intent. Rather than evolving away, we’ve evolved back full circle.

The rise of monopolistic Big Tech has tended to obscure the more important point about S230. As Cory Doctorow writes for EFF, killing S230 would kill the small federated communities (like Mastodon and Discord servers) and web boards that offer alternatives to increasing Big Tech’s power. While S230 doesn’t apply outside the US (some Americans have difficulty understanding that other countries have different laws), its ethos is pervasive and the companies it’s enabled are everywhere. In the end, it’s like democracy: the alternatives are worse.

Illustrations: Drunken parrot in Putney (by Simon Bisson).
