Your Narrative About TikTok is Probably Wrong (but so what?)


One story that trended (e.g., on BlueSky) about TikTok’s day of shutdown and revival can be summarized thus: the intent to ban TikTok was a stunt cooked up by Republicans so that Trump could pretend to save it at the last minute. In this telling, it was never about national security but was yet another grab of a platform for hard-right ideologues—and an opportunity for Trump and/or his friends to benefit financially.

I get why that seems rational, but it doesn’t quite square with the facts. Before describing those facts, however, let’s acknowledge and set aside a few assumptions based on emotion.

First, it is only natural for TikTok account holders to feel angry at the thought of losing a platform they enjoy or use for business or core communications. When something we like might be taken away as an act of law—let alone a forum used to express oneself or make a living—it feels like an unwarranted attack on one’s interests and civil rights. This remains the unresolved paradox of all social media platforms: holding them accountable is perceived by one group or another as an abridgement of the rights of the platform users.

Second, Trump lying is a universal constant. He lies so often that it would be impossible for him not to contradict himself on a broad range of topics. Hence, the fact that he once said “ban TikTok,” then used TikTok for his own purposes, and then claimed to save TikTok is just his standard operating bullshit. “Biden tried to kill it, but I’m going to save it,” is one of a million sound bites or posts (amplified by asshats like Charlie Kirk) that have little to do with what happened or is likely about to happen.

Third, TikTok’s own messaging (thanking Trump, etc.) cannot be taken at face value. The company is acting in its own interests, as any business would. Of course, it was pure theater when the site popped back on and thanked Trump for the 90-day stay of execution while they work out a “deal.” And naturally, many TikTok users will simply be glad the platform is still running and will either not care why lawmakers acted in the first place or not believe the stated reasons.

The Real Story (most likely)

No later than early 2024, both Democrats and Republicans in Congress were provided security briefings on TikTok and its relationship to the Chinese Communist Party (CCP), an adversary of the United States. Members of both parties were deeply concerned about what they learned, and thus, in early March, the House passed H.R. 7521 with a vote of 362 to 55. The bill had 54 co-sponsors—22 Democrats and 32 Republicans. After the bill was signed by President Biden, the law set a roughly ten-month deadline for TikTok to be sold to an entity without ties to the CCP to avoid being banned in the U.S.

So, the first point worth making is that if Trump & Co. orchestrated the TikTok law as a stunt, they did it with the cooperation of a lot of Democrats, including President Biden. Instead, it is more reasonable to assume (though admittedly difficult these days) that the TikTok bill was the result of bipartisan cooperation on a matter of national security. Notwithstanding political rhetoric by individual Members—let alone sniping from the edges by Trump—the law itself was well founded, and it is worth noting that the Republicans who supported the law could not be certain that Trump would be re-elected and, therefore, have the opportunity to “rescue” TikTok, as alleged.

While I have no more inside information about those security briefings than any other observer, the most rational conclusion is that Congress had good reason to pass the TikTok law, which the Supreme Court—albeit at the 11th hour—unanimously held was not in fatal conflict with the First Amendment. This outcome, which I advocated in an earlier post, does not support the narrative that the ban was a stunt cooked up by Trump and loyal Republicans so that TikTok could be recruited along with Facebook, X, and Google as another social platform of the oligarchy.

Importantly, that narrative misses the point that just because people only lately discovered that Big Tech’s politics are oligarchical, that doesn’t make it news. The sight of Zuckerberg, Musk, Pichai, and Bezos on stage with a mad monarch like Trump was written into Silicon Valley’s Terms of Service a long time ago. TikTok is no different but for the fact that its other anti-democratic master happens to be the CCP.

What Now?

While I endorsed the rationale for the TikTok law, I am acutely aware that even if it were banned on the basis of adversarial foreign control, this would have been a remedy of closing the barn door long after the cows escaped, drowned in the lake, and the lake froze over. The adversarial effects of all social media on American democracy not only remain unaddressed, but Trump & Co. are direct beneficiaries of the kind of targeted propaganda social sites make possible. In other words, whether adversaries of American interests are foreign or domestic, mission accomplished. Chaos sown. You are here.

Just like the social platforms, Trump also disguises his personal interests as American interests, and whatever “deal” he makes to keep TikTok in the U.S. cannot be trusted. “America First” is an Orwellian slogan—used to animate mean-spiritedness while advocating policy that directly undermines American interests, including national security. Regarding TikTok, then, there is no reason to believe that Trump & Co. give a rat’s ass whether the CCP remains tied to the platform unless that relationship is damaging to Trump & Co. personally—a group that now includes the lately recognized tech oligarchy.

Within that morass, competing narratives will continue to flow based on ideology and emotion—all feeding the social media beast while pretending to tame it. Whatever becomes of TikTok in the next three months, public perception is unlikely to match reality, which paradoxically proves both the utility and futility of the law that was designed to force its sale.

Pass the TikTok Legislation. And then…


“At what point then is the approach of danger to be expected? I answer, if it ever reach us, it must spring up amongst us. It cannot come from abroad. If destruction be our lot, we must ourselves be its author and finisher. As a nation of freemen, we must live through all time, or die by suicide.” – Abraham Lincoln, The Lyceum Address, 1838 –

Lincoln’s famous observation that only Americans can truly destroy America speaks to the fragility of the Republic, which the founders knew could endure only so long as the people generally keep faith with certain core principles. Watching those principles assaulted by the far-right populism that has presently swallowed the Republican Party, one naturally reads Lincoln as prophetic, and it is hard to imagine any foreign influence being more dangerous. On the other hand, when Lincoln said, “It cannot come from abroad,” he could hardly have imagined a time when 170 million young Americans would carry a pocket surveillance device loaded with software under the control of a foreign adversary.

Following the 362-55 vote by the House to force TikTok to divest itself of all ties to the Chinese Communist Party (CCP), opinions about the bill have questioned both its necessity and its viability—though not with good reason. Although rashly described as a “ban,” H.R. 7521 would in effect force a sale of the platform by parent company ByteDance to an owner without ties to the CCP. To that end, I agree with independent musician Blake Morgan, who endorses the TikTok legislation both as a national security and an anti-piracy measure. In an editorial for IP Watchdog, Morgan writes:

The vast majority of music on TikTok generates virtually no revenue for the musicians who made it, and even more music on the platform is completely unlicensed (stolen), copied (stolen via AI), or pirated (stolen). Simply put, TikTok is trying to build a music-based business without paying music makers fair value for the music. That’s why Universal Music Group has already pulled out of TikTok. That’s why the National Music Publishers’ Association has already announced it won’t renew its license with the company. So, TikTok poses “a clear and present danger” to American music, too.

The music piracy alone is reason to force the platform to operate within the reach of U.S. law. But with regard to the national security threat, unless one is in the intelligence community or a Member of Congress receiving a security briefing, one is left to rely upon one of those core principles that social media in general has eroded: trust. I do not endorse the whataboutist’s view that because TikTok is not alone in causing havoc, this legislation is moot; but the story does highlight those hazards of social media that make it difficult to convince many Americans that TikTok is a threat of any kind.

Joseph V. Amodio, writing for Tanium, states that TikTok is distinguishable from other platforms thus:

TikTok stands out in its power to manipulate: While videos from any app can go viral, TikTok’s infection ability is unique, given the practice of “heating,” where TikTok staff can supercharge distribution of hand-picked videos. This has huge implications for fair competition and free trade. Just imagine how they can siphon profits by amplifying your competitors’ posts or cooling down your own viral campaigns.

Whether the goal of data manipulation is to pull the levers on enterprise, as Amodio indicates, or to influence young voters on policy matters, how does one convince nearly 200 million 18- to 29-year-olds that said manipulation is occurring and should be seen as an attack? If an act of cyberwarfare entails hacking the Pentagon or shutting down part of the power grid, enough Americans can probably recognize such events as attacks in a traditional sense. Likewise, the prospect of malicious software injected into millions of mobile devices might be understood as a threat.

But what if the weapon is an insidious propaganda tool used to manipulate the opinions of millions of citizens? Who is going to be trusted to identify that as a sustained attack on the United States? Some portion of the TikTok demographic will not believe that China (or Russia) is an adversary in the first place, which is arguably evidence itself of social media’s power to influence.

Even if the delivery platform is owned by Meta, serving “ads” purchased by foreign operatives with the same objective of sowing discord, no individual wants to believe he’s being manipulated. More complicated still, even if one tries to apply critical thinking, the effort itself is often countered by teams of data manipulators flooding the zone—i.e., the illusion of more “information” tilting bias in one direction or another. This was true before parties like China and Russia upped their cyber game, and before they could add artificial intelligence to the toolset.

As a practical example at the heart of the TikTok story, how does the moderate, who would rather not hyper-politicize national security, take the contemporary Republican seriously in his professed opposition to TikTok’s capacity to “manipulate” Americans? For instance, Rep. Ralph Norman of South Carolina writes, “…if you’ve spent 5 minutes exploring TikTok, you should have recognized the addictive nature of this platform. It is designed for one purpose: to control your attention. Their algorithm quickly figures out what kind of videos you’re likely to watch, and then feed you similar videos to keep you fixated.”

Fine. But one could swap “TikTok” for “Trump” and make the same general argument, including that his self-interested rhetoric about NATO, disrespect for the Constitution, etc. all comprise a threat to national security. What would Lincoln say to his legacy party about this tangled interplay between foreign and domestic forces, both hostile to American interests, and both weaponizing disinformation through addictive and manipulative platforms?

In this context, it is important to note that Trumpism is a symptom of populism—a trend that is no less prevalent on the left than on the right, perhaps especially among 18- to 29-year-olds. The difference, for the moment, is that the left has not found its own cult-like figure who might also undermine core principles, albeit in a different style than Trump. The rise in populism in the U.S. and other democracies is a direct result of social media’s tendency to factionalize hearts and minds, which is precisely what a foreign adversary wants to achieve. TikTok may be a shrewdly named time-bomb delivered to over half the U.S. population and, as such, should be defused. But assuming that task can be accomplished, the existential question remains as to whether we can quarantine the most virulent effects of all social platforms or “die by suicide.”

Influencer Responsibility Raises Many Questions With Few Answers

Attorney Carrie Goldberg posted an interesting thread on Twitter last week in response to a September 4 New York Times article written by Taylor Lorenz. Before I go on, if you are unfamiliar with Goldberg’s work, her Brooklyn law firm defends victims of online harassment, and she has deservedly become one of the nation’s thought leaders on the subject of internet platform responsibility and liability. As I said in my praise for her book Nobody’s Victim, if anyone is going to help sensibly reform the over-broad application of Section 230 to the benefit of its unintended victims, it will be women like Carrie. But for the purposes of this post, I will stay away from legal liability and stick to questions of social conduct.

The headline of Lorenz’s article poses what Goldberg called the “million-dollar question”: Are Influencers Responsible for the Behavior of Their Followers? The article focuses on a 17-year-old TikToker named Chris (@Donelij) who had been making what users of that platform call “reaction” videos. These are created with the app’s “Duet” function, which enables a creator to capture himself in a split-screen alongside a video posted by any other TikToker. In Chris’s case, he “duets” with videos that reveal some shocking or surprising twist so that his viewers get to watch his reaction when he shifts from approving smile to bemused deadpan. This recurring schtick earned @Donelij over two million followers.

But because Chris performed his deadpan bit in response to a number of transgender TikTokers, whose videos reveal some variation on a surprising gender switch, Chris’s videos apparently inspired some of his viewers to antagonize or threaten the LGBTQ TikTokers featured in the Duets. “Even though he never says a word,” writes Lorenz, “thousands of people have called [his videos] out for being homophobic. Young gay and trans TikTokers who have been featured in Chris’s reactions report they have suffered vicious harassment from commenters and in messages.”

Lorenz reports that, last week, both of Chris’s accounts were banned by TikTok due to “multiple community guideline violations.” Assuming the reaction videos and their associated fallout with the LGBTQ TikTokers were the reason for these terminations, Lorenz poses her thesis question about responsibility. To this, Goldberg offered a tweet thread describing her views regarding both the moral and legal responsibilities of creators, followers, and platforms, placing most of the burden, when harm is done, on platforms.

In general, I agree with Carrie. Many of us would like to reverse, or at least reassess, the longstanding assumption that online platforms are strictly neutral conduits, and that users are the only responsible parties for anything. At the same time, however, this particular example strikes me as difficult to reconcile with regard to questions of responsibility, though it does point to the fact that our frame of reference (i.e. old media) falls short when grappling with the true nature of social platforms. Let’s step back for a moment.

All speakers bear some measure of responsibility for the consequences of their speech. While I do not think the contours of that responsibility are easily generalized, it is morally incumbent upon any speaker to at least acknowledge that speech has the power to inspire conduct and to at least contemplate what that conduct might be. Regardless, there will always be jerks among listeners, who will infer, even from expressions of parody and satire, a coded endorsement of their intolerance or hatred for some individual or group.

Does that mean, for instance, that comedian Dave Chappelle, because he has LGBTQ material in his show, is responsible if someone interprets his jokes as permission to harass members of that community? Anyone paying attention to Chappelle knows he endorses no such thing. Quite the opposite. But because we are talking about individuals who would go out of their way to harass people in the first place, neither Chappelle, nor any other creator, can possibly account for every mental malfunction that transpires between their jibes and some listener’s behavior. (It isn’t J. D. Salinger’s fault that Catcher in the Rye allegedly served as inspiration for at least two murders and one attempted murder.)

So, with regard to traditional, pre-social-media experiences (whence most of our views are derived), I suspect the reasonable answer is that the influencer is not responsible for the conduct of the follower. Returning to the example at hand, then, it seems difficult to cite Chris’s minimalist comedy—shifting from smile to deadpan in response to various surprise videos—as an inducement to harassing the gay and trans people featured in his Duets.

Given the way TikTok works (akin to Instagram in which “following” simply means the follower will see a creator’s videos from time to time while scrolling), it seems more likely that Chris’s followers unavoidably include at least some individuals who already spend their time hating on the LGBTQ community, and that his reaction pieces simply provided the names of some new targets. Any asshole who would go out of his way to mock, shame, or threaten someone in this way is not suddenly moved to engage in that behavior by one guy making a relatively anodyne joke.

At the same time, the extent to which Chris believes his schtick conveys a negative view of gay and trans people is a moral calculus he ought to consider. My GenZ, TikTok-using daughter showed me how a “duetter” can choose to hide the screen name of the person in the video to which they are responding. So, it is fair to ask why Chris did not employ this option (if he did not), which does not make the “duetted” party impossible to identify but does make them harder to locate without additional effort.

Social platforms will naturally generate Venn diagrams that are complex variations on the unremarkable fact that certain relationships between speakers and followers are often coincidental rather than causal. A video that made the rounds several weeks ago depicted a bevy of bouncing Boomers, festooned in their Trumpanalia and partying on a city street to the song “Y-M-C-A.” And because it would be absurd to interpret this dyspeptic dance as a celebration of gay pride, it is reasonable to assume that the relationship between the Village People and these village idiots was merely coincidental.

But what is unique about social platforms, of course, is not that they offer a forum for response, but that they insist upon a response. That blank space awaiting our input practically taps its foot in anticipation of some offering to the altar of “engagement.” And for many who create and/or comment, both praise and scorn, depending on who is doing which, can serve as encouragement, even of speech that is very ugly. When some clown writes “Die faggot” at the gay TikToker, and others pile on in kind, these users are all receiving instantaneous and 24/7 positive reinforcement for their terrible conduct. This is a psychological stimulus unique to social media. It cannot be compared to the relationship between influencers and followers in other contexts. And, yes, to Goldberg’s point, the platforms bear the ultimate responsibility for designing and monetizing what is essentially a digital narcotic.

Almost every conversation about platform responsibility—many of which focus on the role of Facebook in contemporary politics—is a variation on the theme of policy measures. What can and should the platform do to moderate the content on its site? What can and should Congress do to oversee the influence of these platforms? The paradox inherent to those intertwined questions is self-evident. It seems hopeless to expect that the divisions sown by Facebook will be adequately addressed by a house divided.

But very few of our conversations about responsibility begin with the premise that social media, from a psychological perspective, is basically crack in the schoolyard. More specifically, because TikTok is predominantly a GenZer’s platform, many of the harassers referred to in Lorenz’s article are very likely minors. That does not wholly relieve them of responsibility for their conduct, but it colors the concept of “community standards.” Give a platform like that to millions of adolescents and tell them to play nice? What do we think is going to happen?

I do not know that the example of Chris’s reaction videos is especially helpful in this conversation. His content seems a bit too removed from the corresponding harm that was done, and it seems tame relative to the degree of ugliness that often flows freely on the internet. Further, it would be difficult to proscribe his videos according to some rule that would not apply to a great deal of parody and satire (though there may be more than is reported in Lorenz’s article).

As the title of this post admits, I do not think there are easy answers to these questions. It does seem that the rules on social media are acutely distinct from our pre-internet experiences, but defining those rules is another matter. Social media is a narcotic, and we all bear a measure of responsibility for its use. Meanwhile, it doesn’t seem quite right that, of all the parties involved, the digital drug-makers are left to profit from the best and worst consequences without any obligations.