SHIELD Act Passes in the Senate

It’s been nearly ten years since I first heard the term “revenge porn” and wrote a speculative post inspired by then-Rep. Jackie Speier’s bill to make the act a federal crime. Much has transpired since then, including the obsolescence of the term “revenge porn” and the progress of generative artificial intelligence (GAI), which has already changed the nature of nonconsensual pornography. Legislation is in the works to address GAI used for this purpose, but in the meantime, the Senate on Wednesday finally passed the Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act.

If SHIELD becomes law, distributing intimate images without permission will be a federal crime with penalties that include fines and prison sentences. This is a game-changer, both pragmatically and culturally—fostering equitable remedies for victims and a reasonable deterrent to at least some who might engage in the conduct. Further, it signals a more mature relationship to digital life, leaving behind the rhetoric and handwringing insisting that new liabilities for new harms conducted through online platforms will lead to rampant censorship of protected speech.

A decade ago, the phenomenon called “revenge porn” was still relatively new, and there was little general understanding about its potential for causing harm—or why the term itself was a misnomer. Initially, the “revenge” part referred to mostly men lashing out at ex-girlfriends or ex-wives by disclosing intimate images which had originally been shared in private. Distribution included web platforms that solicit and display “revenge porn” where the perpetrator could find a virtual fraternity of anger bros adding degrading, threatening, and rape-themed comments to the unlawfully displayed images. But the term was problematic from a legal standpoint.

Thanks substantially to the work of Dr. Mary Anne Franks and Danielle Keats Citron, in their capacities as legal scholars and leaders of the Cyber Civil Rights Initiative, legislation at the state and federal levels focuses on the act of nonconsensual disclosure, not the motive per se. Because the motives for disclosing intimate images vary from immature “kicks” to sextortion, it was essential that the cause of action not be limited solely to an intent to cause harm.

SHIELD criminalizes nonconsensual disclosure, either with an intent to cause harm or if harm is caused unintentionally. This includes “…psychological, financial, or reputational harm, to the individual depicted.” As I say, a lot has changed over the last decade, and sadly, there is now a preponderance of evidence that nonconsensual distribution of intimate imagery (NDII) causes a spectrum of harmful results, including loss of professional opportunities and relationships, psychological trauma, harassment, threats, physical violence, and suicide. In fact, the Cyber Civil Rights Initiative has recently adopted the term Image-Based Sexual Abuse (IBSA) to properly frame the nature of so-called “revenge porn.”

A decade ago, legislation like Rep. Speier’s was met with the predictable criticism that it would sweep too broadly, cause undue censorship online, and chill protected speech. In fact, anti-IBSA legislation has survived First Amendment challenges in five of the now 49 states that have such laws. In 2022, when the Indiana State Supreme Court upheld that state’s law, Dr. Franks stated, “Indiana is the fifth state supreme court to uphold the constitutionality of criminal prohibitions of image-based sexual abuse. It should now be completely clear that there is no First Amendment right to disclose private, sexually explicit images of another person without consent.”

Since 2015, the theory that these laws were unconstitutional violations of the speech right has not only been tested at the state level, but the fervent belief that everything online is protected speech has waned considerably. Mitigating harm online, especially anything involving sexual abuse and minors, is one of the few subjects of bipartisan agreement these days. The fact that SHIELD passed the Senate this month suggests to me that it will become law by the end of the year. It will be an essential step in protecting the mostly women and girls who are targeted for IBSA.


Professor Citron Proposes Civil Remedies for Violations of Intimate Privacy

At a panel hosted by The Reykjavik Dialogue,[1] during a discussion about law enforcement, justice, and sex discrimination, Mary Anne Franks, co-founder of the Cyber Civil Rights Initiative, noted that when her organization asked perpetrators who had engaged in revenge porn what would have stopped them from doing it, the answer was almost universally, “If I thought I could go to jail for it.”

The act of distributing intimate, private images via the internet without permission of the persons depicted is a crime—one that causes ongoing harm to victims, including harassment and violence, destruction of interpersonal relationships, loss of employment opportunities, psychological disorder, and suicide. And thanks substantially to the efforts of Franks and her colleague at CCRI, Danielle Keats Citron, nearly every state has criminalized the act of nonconsensual distribution of intimate images; and a federal bill to do likewise, the SHIELD Act, passed the House in March as part of H.R. 1620.

But while these laws pave the way for prosecution of the individuals who engage in this conduct, they do nothing about removing these violations of intimate privacy from the websites hosting the material. And to make matters more complicated, “deepfake” technologies make it relatively easy to depict just about anyone in intimate or sexually explicit material for which they were never actually photographed.

Citron Proposes Privacy Injunctions

In a new paper that addresses the nonconsensual distribution of both real and manufactured images, Citron proposes two interdependent legal mechanisms to overcome the hurdles to removing this kind of content from the internet, and she also discusses the First Amendment questions raised as both a constitutional and a cultural matter.

First, Citron argues that courts must be empowered with “clear legislative permission” to provide plaintiffs with injunctive relief by ordering sites “to remove, delete, or otherwise make unavailable intimate images, real or fake, hosted without written permission.” One might think this is common sense, or simply a matter of basic decency, but court orders to remove material of any kind have been assiduously opposed by internet platforms large and small, and with considerable legal and PR support from “digital rights” activists like the Electronic Frontier Foundation. (See post here and here about Google v. Equustek & Hassell v. Bird.)

The rationale usually argued in the blogosphere and the courts for refusal to remove any content is the First Amendment—a fallacy that now roils the public debate—but the legal foundation that has given the platforms the swagger to distort the speech and press rights has been the courts’ over-broad interpretation of Section 230 of the Communications Decency Act as a blanket immunity. Not only have platforms been shielded against being named parties to civil litigation, but 230 has been invoked as the reason to shield them even from injunctions that do nothing more than order the removal of harmful material. Naturally, when a web company cannot be held liable for anything, it’s very easy for its operators to call all content “speech” and tell the public that all platforms are inherently engines of free expression.

Thus, in order for the above-mentioned legislative permission to be effective, Citron argues, as she and Franks have in earlier papers, that, “Congress should amend Section 230 to make clear that platforms and search engines can be sued in cases seeking injunctive relief and attorney’s fees related to the removal of intimate images hosted without written consent.”

Citron acknowledges that the solution is not perfect, particularly because litigation directed at one incident on one platform does not address the likelihood that intimate images will be distributed across multiple sites; but she writes, “Victims need to know that society recognizes the damage to the dignity and intimate privacy of victims, that law can help mitigate the damage, that sites are not law-free zones, and that lawyers will represent them.”

If that sounds like Citron’s proposed remedies are more symbolic than remedial, I will echo her comparison to civil rights legislation and argue that we should not underestimate even the symbolism of law to effect widespread remedies by fostering cultural and behavioral change. Presumably, most people do believe the act of distributing intimate images without permission is wrong, whether for revenge or any other motive. So, it helps when the law says it’s wrong, too. But at the same time, Citron addresses a broader cultural phenomenon in which Americans in particular struggle with our brand of the speech right and the distinction between access to information and prurient curiosity.

As a constitutional question, when a law intersects rights like those enumerated in the First Amendment, it must be held to the standard known as strict scrutiny. This means that a statute must serve a compelling public interest and must achieve a narrow purpose that cannot be achieved through less restrictive means. Here, Citron notes that the state laws criminalizing the nonconsensual distribution of intimate images have already held up to constitutional challenges in Vermont, Illinois, and Minnesota, but she also discusses that gray area where the public’s right to know is often too easily conflated with general interest. “By my lights, there can be a vast difference between learning about a public official’s intimate information and seeing photographs or videos documenting it. That distinction is worth careful consideration,” Citron writes.

Agreed. Specifically, did the American public have a right to know that Rep. Katie Hill was intimately involved with a member of her staff and, allegedly, using marijuana? Yes. Even though I personally do not care much what an elected official does in her private life unless it directly intersects with the official role, those allegations are certainly news that voters have a right to know. But I agree with Citron that there is a moral line—I would say a chasm—between a news report about Hill’s conduct and the publication of her intimate images (albeit semi-redacted) on the site RedState.

Hill sued RedState owner Salem Media,[2] and the publisher was granted a motion to dismiss the complaint under California’s anti-SLAPP law,[3] with the court finding, in Citron’s words, that “the photos shed light on Hill’s fitness for office.” The hell they did. How the information about Hill’s conduct sheds light on her fitness for office is up to the voters, but the leaked photos were nothing more than RedState’s opportunity to earn revenue by pandering to the worst impulses of the electorate, which increasingly cannot distinguish between political discourse and tribal brutality. RedState’s publication of the photos is barely distinguishable from revenge porn disguised as political reportage.[4] And to add insult to injury, Hill had to pay $200,000 for Salem’s legal fees.

As Citron notes, “Most cases involving the nonconsensual disclosure of intimate images will not present close calls about the boundaries of the public’s legitimate interest.” And, of course, this is correct. Most individuals who engage in this kind of behavior are not even propaganda mongers, let alone journalists. But I do suspect that in the techbro culture of the internet, the blurry lines we see in a RedState re. Hill or a Gawker re. Hulk Hogan imply to those other bros who violate intimate privacy that what they are doing is not criminal. It is. And it is time for the laws to catch up to that reality.


[1] Renewing Activism to End Violence Against Women www.rekjavikdialogue.is

[2] Hill’s counsel is Carrie Goldberg, leading specialist in this area.

[3] Strategic Lawsuit Against Public Participation.

[4] To be clear, I would say the same thing about the publication of similar photos of Reps. Boebert or Greene for whom I have nothing but contempt.

Is a Revenge Porn Bill Next?

When nude photos of celebrities were leaked and distributed all over the internet in 2014, Jennifer Lawrence, as one of the victims, called it a “sex crime.” Meanwhile, the idea that the platforms themselves bore much responsibility to remove the images was met with mixed responses. The leadership at Reddit was so high on the fumes of its own utopian bullshit that they compared governance of the site to that of a democratic nation which should not impose moral choices on its citizens. Into that bro-publica climate, Representative Jackie Speier (D-CA) introduced a bill in July 2016 that would make “revenge porn” a federal crime. The usual defenders of the web raised the same red flags, asserting that even a well-intended bill of this nature would lead to over-censorship online. Then, little was heard about this proposal, except perhaps inside the Beltway.

But suddenly, the landscape is very different, and I would not be surprised if we see movement on some type of “revenge porn” bill in 2018. In light of the head-spinning litany of sexual-assault allegations in the news, the general dilution of Silicon Valley’s political clout, and what seems like the inevitable passage of the SESTA bill, Rep. Speier’s bill might make relatively smooth progress toward enactment next year. If nothing else, it’s easy to imagine Congress passing this kind of legislation in a scramble to get on the right side of the historic shift we’re witnessing with regard to sexual harassment in every context.

Meanwhile, you might have missed the news that Facebook recently proposed an internal “solution” to combat revenge porn, which was appropriately scorned, if not outright mocked, because it requires trusting its “trained” team with your intimate photos so they can protect you. These are the same guys who couldn’t do the math on Russians buying American political ads with rubles. Activist and author Violet Blue wrote a great piece for Engadget describing why Facebook’s counter-revenge-porn proposal is not wearing any clothes. “The process presumes the victim has these photos in the first place, and cavalierly ignores that this person is living in a nightmarish hellscape trauma that is in no way re-experienced by handing the instrument of their terror to an anonymous, unaccountable, possibly grey alien Facebook employee,” she writes.

The Speier Bill

It’s actually a misnomer to call H.R. 5896 a “revenge porn” bill because revenge porn is a specific act, usually perpetrated by angry ex-boyfriends who get back at women who’ve broken up with them by distributing nude or sexually-explicit imagery they might have made together as a couple. Speier’s bill, titled the “Intimate Privacy Protection Act,” bypasses the issue of motive altogether and merely states that anyone who distributes intimate images—the language defines these explicitly—of adults with “reckless disregard for the lack of consent” of the subject could potentially face federal charges.

Often, the harm does not end with mere embarrassment. Instead, the images may serve as the predicate for a sustained, emotional assault by a male cyber-mob hounding a female victim, labeling her a “slut,” “bitch,” “whore,” and so on. Sites that trade in unauthorized intimate images may extort payments from victims for removal of their images, but there is little to stop the images from migrating virally once online. As such, remedies for removal are nearly impossible, and any effort on the part of the victim to extricate herself from the “hellscape,” as Violet Blue puts it, is more likely to exacerbate the emotional trauma than to ameliorate it.

As an aside, yes, every kind of sex education in the world ought to include a segment on the hazards of making intimate images with networked devices. It’s hard to believe that anyone is still naive enough to think that images created on smartphones and the like can be kept private without substantial risk. But that kind of personal awareness does not preclude criminalizing the decision by an individual or entity to distribute these images without permission.

Thirty-eight states plus the District of Columbia have some type of law criminalizing “revenge porn,” but given the geographical irrelevance of internet distribution, it seems only reasonable to proscribe the conduct as part of the federal criminal code. Assuming this bill does see any action in 2018, we can expect the usual suspects—EFF, Public Knowledge, Techdirt, et al.—to cry havoc and declare once again the danger that such proposals pose to free speech on the internet.

Whether this chorus will be joined by major platforms like Google, Facebook, and Twitter may not be as predictable as it would have been just a year ago. I suspect these companies are all recalibrating how to spend their political capital now that public sentiment is less inclined to give them carte blanche; and distributing intimate images without permission is not a “cause” most people are going to support. Regardless, when it comes to the various harms that can be caused via cyberspace, it seems the public is catching on to two realities: 1) that an internet policy doctrine based on the natural goodness of people is utter folly; and 2) the tech companies are in way over their heads.