Attorney Carrie Goldberg posted an interesting thread on Twitter last week in response to a September 4 New York Times article written by Taylor Lorenz. Before I go on, if you are unfamiliar with Goldberg’s work, her Brooklyn law firm defends victims of online harassment, and she has deservedly become one of the nation’s thought leaders on the subject of internet platform responsibility and liability. As I said in my praise for her book Nobody’s Victim, if anyone is going to help sensibly reform the over-broad application of Section 230 to the benefit of its unintended victims, it will be women like Carrie. But for the purposes of this post, I will stay away from legal liability and stick to questions of social conduct.
The headline of Lorenz’s article poses what Goldberg called the “million-dollar question”: Are Influencers Responsible for the Behavior of Their Followers? The article focuses on a 17-year-old TikToker named Chris (@Donelij) who had been making what users of that platform call “reaction” videos. These are created with the app’s “Duet” function, which enables a creator to capture himself in a split-screen alongside a video posted by any other TikToker. In Chris’s case, he “duets” with videos that reveal some shocking or surprising twist so that his viewers get to watch his reaction when he shifts from approving smile to bemused deadpan. This recurring schtick earned @Donelij over two million followers.
But because Chris performed his deadpan bit in response to a number of transgender TikTokers, whose videos reveal some variation on a surprising gender switch, Chris’s videos apparently inspired some of his viewers to antagonize or threaten the LGBTQ TikTokers featured in the Duets. “Even though he never says a word,” writes Lorenz, “thousands of people have called [his videos] out for being homophobic. Young gay and trans TikTokers who have been featured in Chris’s reactions report they have suffered vicious harassment from commenters and in messages.”
Lorenz reports that, last week, both of Chris’s accounts were banned by TikTok due to “multiple community guideline violations.” Assuming the reaction videos and their associated fallout with the LGBTQ TikTokers were the reason for these terminations, Lorenz poses her thesis question about responsibility. To this, Goldberg offered a tweet thread describing her views regarding both the moral and legal responsibilities of creators, followers, and platforms, placing most of the burden, when harm is done, on platforms.
So, responsibility: there’s legal responsibility and there’s moral responsibility.
And not just the creator & the platform. Also, there’s the responsibility of audience/users too!
— Carrie A. Goldberg (@cagoldberglaw) September 4, 2020
In general, I agree with Carrie. Many of us would like to reverse, or at least reassess, the longstanding assumption that online platforms are strictly neutral conduits, and that users are the only responsible parties for anything. At the same time, however, this particular example strikes me as difficult to reconcile with regard to questions of responsibility, though it does point to the fact that our frame of reference (i.e. old media) falls short when grappling with the true nature of social platforms. Let’s step back for a moment.
All speakers bear some measure of responsibility for the consequences of their speech. While I do not think the contours of that responsibility are easily generalized, it is morally incumbent upon any speaker to at least acknowledge that speech has the power to inspire conduct and to at least contemplate what that conduct might be. Regardless, there will always be jerks among listeners, who will infer, even from expressions of parody and satire, a coded endorsement of their intolerance or hatred for some individual or group.
Does that mean, for instance, that comedian Dave Chappelle, because he has LGBTQ material in his show, is responsible if someone interprets his jokes as permission to harass members of that community? If anyone is paying attention to Chappelle, he endorses no such thing. Quite the opposite. But because we are talking about individuals who would go out of their way to harass people in the first place, neither Chappelle, nor any other creator, can possibly account for every mental malfunction that transpires between their jibes and some listener’s behavior. (It isn’t J. D. Salinger’s fault that Catcher in the Rye was allegedly inspirational to at least two murders and one attempted murder.)
So, with regard to traditional, pre-social-media experiences (whence most of our views are derived), I suspect the reasonable answer is that the influencer is not responsible for the conduct of the follower. Returning to the example at hand, then, it seems difficult to cite Chris’s minimalist comedy—shifting from smile to deadpan in response to various surprise videos—as an inducement to harassing the gay and trans people featured in his Duets.
Given the way TikTok works (akin to Instagram in which “following” simply means the follower will see a creator’s videos from time to time while scrolling), it seems more likely that Chris’s followers unavoidably include at least some individuals who already spend their time hating on the LGBTQ community, and that his reaction pieces simply provided the names of some new targets. Any asshole who would go out of his way to mock, shame, or threaten someone in this way is not suddenly moved to engage in that behavior by one guy making a relatively anodyne joke.
At the same time, the extent to which Chris believes his schtick conveys a negative view of gay and trans people is a moral calculus he ought to consider. My GenZ, TikTok-using daughter showed me how a “duetter” can choose to hide the screen name of the person in the video to which they are responding. So, it is fair to ask why Chris did not employ this option (if he did not), which does not make the “duetted” party impossible to identify but does make them harder to locate without additional effort.
Social platforms will naturally generate Venn diagrams that are complex variations on the unremarkable fact that certain relationships between speakers and followers are often coincidental rather than causal. A video that made the rounds several weeks ago depicted a bevy of bouncing Boomers, festooned in their Trumpanalia and partying on a city street to the song “Y-M-C-A.” And because it would be absurd to interpret this dyspeptic dance as a celebration of gay pride, it is reasonable to assume that the relationship between the Village People and these village idiots was merely coincidental.
But what is unique about social platforms, of course, is not that they offer a forum for response, but that they insist upon a response. That blank space awaiting our input practically taps its foot in anticipation of some offering to the altar of “engagement.” And for many who create and/or comment, both praise and scorn, depending on who is doing which, can serve as encouragement, even of speech that is very ugly. When some clown writes “Die faggot” at the gay TikToker, and others pile on in kind, these users are all receiving instantaneous and 24/7 positive reinforcement for their terrible conduct. This is a psychological stimulus unique to social media. It cannot be compared to the relationship between influencers and followers in other contexts. And, yes, to Goldberg’s point, the platforms bear the ultimate responsibility for designing and monetizing what is essentially a digital narcotic.
Almost every conversation about platform responsibility—many of which focus on the role of Facebook in contemporary politics—is a variation on the theme of policy measures. What can and should the platform do to moderate the content on its site? What can and should Congress do to oversee the influence of these platforms? The paradox inherent to those intertwined questions is self-evident. It seems hopeless to expect that the divisions sown by Facebook will be adequately addressed by a house divided.
But very few of our conversations about responsibility begin with the premise that social media, from a psychological perspective, is basically crack in the schoolyard. More specifically, because TikTok is predominantly a GenZer’s platform, many of the harassers referred to in Lorenz’s article are very likely minors. That does not wholly relieve them of responsibility for their conduct, but it colors the concept of “community standards.” Give a platform like that to millions of adolescents and tell them to play nice? What do we think is going to happen?
I do not know that the example of Chris’s reaction videos is especially helpful in this conversation. His content seems a bit too removed from the corresponding harm that was done, and it seems tame relative to the degree of ugliness that often flows freely on the internet. Further, it would be difficult to proscribe his videos according to some rule that would not apply to a great deal of parody and satire (though there may be more than is reported in Lorenz’s article).
As the title of this post admits, I do not think there are easy answers to these questions. It does seem that the rules on social media are acutely distinct from our pre-internet experiences, but defining those rules is another matter. Social media is a narcotic, and we all bear a measure of responsibility for its use. Meanwhile, it doesn’t seem quite right that, of all the parties involved, the digital drug-makers are left to profit from the best and worst consequences without any obligations.
© 2020 – 2021, David Newhoff. All rights reserved.