At a panel hosted by The Reykjavik Dialogue, during a discussion about law enforcement, justice, and sex discrimination, Mary Anne Franks, co-founder of the Cyber Civil Rights Initiative, noted that when her organization asked perpetrators who had engaged in revenge porn what would have stopped them from doing it, the answer was almost universally, “If I thought I could go to jail for it.”
The act of distributing intimate, private images via the internet without the permission of the persons depicted is a crime—one that causes ongoing harm to victims, including harassment and violence, destruction of interpersonal relationships, loss of employment opportunities, psychological disorders, and suicide. And thanks substantially to the efforts of Franks and her colleague at CCRI, Danielle Keats Citron, nearly every state has criminalized the nonconsensual distribution of intimate images; and a federal bill to do likewise, the SHIELD Act, passed the House in March as part of H.R. 1620.
But while these laws pave the way for prosecution of the individuals who engage in this conduct, they do nothing about removing these violations of intimate privacy from the websites hosting the material. And to make matters more complicated, “deepfake” technologies make it relatively easy to depict just about anyone in intimate or sexually explicit material for which they were never actually photographed.
Citron Proposes Privacy Injunctions
In a new paper that addresses the nonconsensual distribution of both real and manufactured images, Citron proposes two interdependent legal mechanisms to overcome the hurdles to removing this kind of content from the internet, and she also discusses the First Amendment questions raised as both a constitutional and a cultural matter.
First, Citron argues that courts must be empowered with “clear legislative permission” to provide plaintiffs with injunctive relief by ordering sites “to remove, delete, or otherwise make unavailable intimate images, real or fake, hosted without written permission.” One might think this is common sense, or simply a matter of basic decency, but court orders to remove material of any kind have been assiduously opposed by internet platforms large and small, with considerable legal and PR support from “digital rights” activists like the Electronic Frontier Foundation. (See posts here and here about Google v. Equustek & Hassell v. Bird.)
The rationale usually argued in the blogosphere and the courts for refusing to remove any content is the First Amendment—a fallacy that now roils the public debate—but the legal foundation that has given the platforms the swagger to distort the speech and press rights has been the courts’ over-broad interpretation of Section 230 of the Communications Decency Act as a blanket immunity. Not only have platforms been shielded against being named parties to civil litigation, but Section 230 has been invoked as the reason to shield them even from injunctions that do nothing more than order the removal of harmful material. Naturally, when a web company cannot be held liable for anything, it is very easy for its operators to call all content “speech” and tell the public that all platforms are inherently engines of free expression.
Thus, in order for the above-mentioned legislative permission to be effective, Citron argues, as she and Franks have in earlier papers, that, “Congress should amend Section 230 to make clear that platforms and search engines can be sued in cases seeking injunctive relief and attorney’s fees related to the removal of intimate images hosted without written consent.”
Citron acknowledges that the solution is not perfect, particularly because litigation directed at one incident on one platform does not address the likelihood that intimate images will be distributed across multiple sites; but she writes, “Victims need to know that society recognizes the damage to the dignity and intimate privacy of victims, that law can help mitigate the damage, that sites are not law-free zones, and that lawyers will represent them.”
If that sounds like Citron’s proposed remedies are more symbolic than remedial, I will echo her comparison to civil rights legislation and argue that we should not underestimate the power of even symbolic law to effect widespread remedy by fostering cultural and behavioral change. Presumably, most people do believe the act of distributing intimate images without permission is wrong, whether for revenge or any other motive. So, it helps when the law says it’s wrong, too. But at the same time, Citron addresses a broader cultural phenomenon in which Americans in particular struggle with our brand of the speech right and the distinction between access to information and prurient curiosity.
As a constitutional question, when a law intersects rights like those enumerated in the First Amendment, it must be held to the standard known as strict scrutiny. This means that a statute must serve a compelling public interest and must achieve a narrow purpose that cannot be achieved through less restrictive means. Here, Citron notes that the state laws criminalizing the nonconsensual distribution of intimate images have already held up to constitutional challenges in Vermont, Illinois, and Minnesota, but she also discusses that gray area where the public’s right to know is often too easily conflated with general interest. “By my lights, there can be a vast difference between learning about a public official’s intimate information and seeing photographs or videos documenting it. That distinction is worth careful consideration,” Citron writes.
Agreed. Specifically, did the American public have a right to know that Rep. Katie Hill was intimately involved with a member of her staff and, allegedly, using marijuana? Yes. Even though I personally do not care much what an elected official does in her private life, unless it directly intersects with the official role, those allegations are certainly news that voters have a right to know. But I agree with Citron that there is a moral line—I would say a chasm—between a news report about Hill’s conduct and the publication of her intimate images (albeit semi-redacted) on the site RedState.
Hill sued RedState owner Salem Media, and the publisher was granted a motion to dismiss the complaint under California’s anti-SLAPP law, with the court finding, in Citron’s words, that “the photos shed light on Hill’s fitness for office.” The hell they did. How the information about Hill’s conduct sheds light on her fitness for office is up to the voters, but the leaked photos were nothing more than RedState’s opportunity to earn revenue by pandering to the worst impulses of the electorate, which increasingly cannot distinguish between political discourse and tribal brutality. RedState’s publication of the photos is barely distinguishable from revenge porn disguised as political reportage. And to add insult to injury, Hill had to pay $200,000 for Salem’s legal fees.
As Citron notes, “Most cases involving the nonconsensual disclosure of intimate images will not present close calls about the boundaries of the public’s legitimate interest.” And, of course, this is correct. Most individuals who engage in this kind of behavior are not even propaganda mongers, let alone journalists. But I do suspect that in the techbro culture of the internet, the blurry lines we see in cases like RedState’s treatment of Hill or Gawker’s of Hulk Hogan imply to those other bros who violate intimate privacy that what they are doing is not criminal. It is. And it is time for the laws to catch up to that reality.
 Strategic Lawsuit Against Public Participation.
 To be clear, I would say the same thing about the publication of similar photos of Reps. Boebert or Greene for whom I have nothing but contempt.