TikTok-Inspired Child Suicide Prompts a Sound Reading of Section 230


Last week, the Third Circuit Court of Appeals issued an opinion regarding Section 230 of the Communications Decency Act that may be the strongest affirmation to date that the statute does not provide a blanket liability shield for social platforms regardless of their conduct. Specifically, §230(c)(1) immunizes platforms only against liability arising from other parties’ speech, not from a platform’s own speech. And although the platforms have argued that their “recommendation” algorithms, which push content to users, do not constitute speech, the courts aren’t buying it.

In Anderson v. TikTok, the appeals court reversed the lower court’s finding that the platform was automatically immunized against a liability claim involving the death of a child who attempted one of the many dangerous “challenges” that circulate on social media. Nylah Anderson, age 10, died by accidental hanging when she tried the “Blackout Challenge,” which dared viewers to asphyxiate themselves until they passed out. At issue for TikTok is not the challenge itself, started by an unknown third party, but the “For You Page” algorithm that “recommended” the challenge to Anderson. Judge Matey, in a strident concurrence with the circuit court’s opinion, writes the following:

TikTok reads § 230…to permit casual indifference to the death of a ten-year-old girl. It is a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation, one that smuggles constitutional conceptions of a “free trade in ideas” into a digital “cauldron of illicit loves” that leap and boil with no oversight, no accountability, no remedy.

Though the reference to St. Augustine implies a religious moralizing I might omit, Judge Matey’s accusation that social platforms host a “cauldron” of dangerous, illegal, and depraved material behind a veil of social good and constitutional rhetoric is indisputable. As a legal matter, had Anderson discovered the video challenge on her own (e.g., via search), TikTok would likely be immunized by §230; but because a “recommendation” algorithm was a factor in the conduct that resulted in her death, the court draws an important distinction, one that could articulate a shift in judicial review of the statute and, we should hope, an overdue change in platform governance.

As Judge Matey further states in his concurrence, TikTok’s presumed immunity under §230 in this case is “…a view that has found support in a surprising number of judicial opinions dating from the early days of dial-up to the modern era of algorithms, advertising, and apps.” That view is properly dimming now, and by my reckoning, the Supreme Court will go where the Third Circuit went last week. In a pair of companion cases, Gonzalez v. Google and Twitter v. Taamneh (2023), the plaintiffs, on behalf of victims of two ISIS-related terror attacks, sought to hold the platforms accountable for “recommending” ISIS recruiting videos. But because those claims depended substantially on meeting the “aiding and abetting” standard of the Anti-Terrorism Act, the Court found that the plaintiffs had failed to state a plausible claim for relief and, therefore, declined to address the question of §230 immunity.

But if Anderson (or a similar case) reaches the Supreme Court, I believe the justices will have little difficulty finding that a “recommendation” algorithm promoting a video challenge that led to a child’s death provides a foundation for a liability claim to proceed. As the Court stated in Taamneh, “When there is a direct nexus between the defendant’s acts and the tort, courts may more easily infer such culpable assistance.” In Anderson, with no other party acting as the direct cause of the child’s death, the facts are even simpler, revealing a clear nexus between the video challenge “recommended” by the platform and the accidental suicide. Further, this July, the Court held in the unanimous Moody v. NetChoice decision that social platforms “shape other parties’ expression into their own curated speech products.”[1] Under that rule, the Third Circuit finds that TikTok’s “recommendation” of the Blackout Challenge to Nylah Anderson plausibly constitutes the platform’s own speech, for which it may be held liable.

The reason I keep putting “recommended” in quotes is that, at the time SCOTUS granted cert in Taamneh and Gonzalez, I wrote a post opining that courts, policymakers, et al. should take a jaundiced view of this too-friendly term for an insidious function of social media. It is no longer controversial to say that platform operators manipulate what users see and hear, or that this manipulation can lead to disastrous results, from disinformation campaigns in the political arena to drug-related deaths to the suicides of little girls.

It is a familiar refrain that it takes a tragedy, or many tragedies, to change policy, and with the story of Nylah Anderson, and the many young victims she represents, we may finally see Big Tech’s hypocrisy on speech collapse under the weight of its own absurdity. For nearly 20 years, the major platforms have played games with the First Amendment and §230, conflating their business interests with users’ speech rights, asserting their own speech rights when convenient, or insisting that nothing they do is their own speech, all depending on which potential liability the company seeks to avoid. Nor has that confusion been helped in recent years by certain politicians who misstate how the speech right operates in order to create political theater around allegations of bias.

Out of all that mess, it is notable that Justice Thomas, since at least 2020,[2] has repeatedly observed that online platforms avail themselves of constitutional protection to engage in conduct like algorithmic “recommendation,” then invert the argument to shroud themselves in the §230 shield, and the courts stop liability claims from even proceeding. As Congress, the Supreme Court, and now the Third Circuit have all reiterated, no other industry in the country enjoys that kind of immunity, and perhaps this claim against TikTok will be the case that finally ends this unfounded and unreasonable privilege for online platforms.


[1] On a side note, this is reminiscent of the “selection and arrangement” doctrine in copyright law, which finds protectable “expression” in an author’s choices about what to select and how to arrange it. All copyrightable expression is a form of speech.

[2] See Justice Thomas’s statement respecting the denial of certiorari in Malwarebytes v. Enigma Software Group (2020).

