Did Social Platforms Really Find a Moral Compass?

In 2012, I wrote a post called In Defense of (a little) Elitism, which was naturally criticized by some in the tech-utopian world for being, y’know, elitist.

The apparent good in this digital-age model — that it is populist — is also its own weakness when we look at results in various media.  Most obviously, it doesn’t take more than a glance at the effects of extreme populism on journalism to realize that we now have news tailored to every taste — conservative, liberal, alternative, user-generated, subversive, and just plain wacko. No one can argue that the consumer isn’t “getting what he wants, and for free,” but the democratization of journalism has broadened the concept to include literally anyone with a computer. 

At that time, the likes of Alex Jones, Richard Spencer, terrorist groups, channers, The Daily Stormer, et al were well into metastasizing narratives of hatred and conspiracy, but few in the mainstream were talking about that incipient disaster, failing to truly grasp how digital platforms were extending, rather than shrinking, the influence of these toxic forces. If anyone questioned the reasonableness of giving those voices free platforms, Big Tech and its network of well-funded cheerleaders insisted that banning, or even muffling, these incubators of hate would do more harm to “free speech” than whatever harm was being done by leaving them alone.  

That was before 2016, of course, when the tin-foil hats, racists, and misogynists were not merely invited into the mainstream by the Party of Trump but put front and center. Now, Silicon Valley had a problem. Battle lines were being drawn for the existential survival of the Republic (without which, by the way, there is no speech right). The longstanding official policy of “platform neutrality” would soon prove untenable. Nevertheless, until very recently, if a platform was criticized for hosting toxic content, the boilerplate answer was usually something like the following:

While we do not condone [vile content], we are reluctant to play the role of censors or arbiters of truth… [filler bullshit]… protecting free speech…[more filler bullshit]…and we believe democracy thrives from a robust exchange of ideas…[concluding bullshit]. (See Mark Zuckerberg’s speech at Georgetown University, October 17, 2019.)

Last week, that tone shifted, not altruistically mind you, but because the standard rhetoric was becoming a financial liability. ADWEEK announced that Reddit would be purging several hate-speech-laden subreddits, including fan pages named for Donald Trump. While this is welcome news to many, I would remind readers that when Steve Huffman, a co-founder of the platform, assumed the role of CEO in 2015, he announced plans to clean up Reddit’s act. So, I assume it is in response to the apparent sluggishness of said cleanup that he stated, “I have to admit I have struggled with balancing my values as an American and around free speech and free expression with my values and the company’s values around common human decency.”

Call me a cynic, but Huffman’s equivocation can only be read one way: that toxic content is, at last, bad for business. It was only pressure from some of the largest advertisers, either threatening to cancel or actually cancelling ad buys, that suddenly made it much more difficult for the big platforms to sweep all the Nazis and other assorted haters under the rug they liked to call the “exchange of ideas.” Not that that claim was ever anything but gibberish. If hate speech and incitements to violence are “ideas,” they were vetted long before we had the internet, and there is no principle whereby a social platform owes the KKK fresh digital soil in which to grow new roots.

Concurrent with Reddit dropping 2,000 hate-mongering subreddits, CNN also reported that YouTube finally jettisoned the channels of white supremacists Richard Spencer and David Duke, one year after promising to do so. The news channel states …

“Last year, CNN Business found that one Nazi channel YouTube had deleted before was back up and making no attempt to hide itself or its connection to its previously banned accounts. The channel was first taken down in April 2018 in the wake of a CNN investigation that found ads from over 300 companies and organizations running on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda.”

And finally, even the beleaguered Zuckerberg, whose relationship status with Donald Trump has been stuck on “It’s complicated,” caved (at least somewhat) to pressure from both major advertisers and his own employees. The Washington Post reported:

On Friday, Zuckerberg told employees in a live-streamed town hall that he was changing the company’s policy to label problematic newsworthy content that violated the company’s policies as Twitter does, a major concession amid the rising tide of criticism. He also said in the most explicit language ever that the company would remove posts by politicians that incite violence and suppress voting. Still, civil rights leaders said his assertions didn’t go far enough.

Facebook, Reddit, YouTube, and other platforms should have stopped providing aid and comfort to hate-mongers a long time ago, just because it was the right thing to do. But in the absence of actual principles, market pressure will suffice. In a broad sense, it is a hopeful sign that major corporations, despite some stumbling press releases, have recognized that there is no financial future when their brands are associated with the lingo of hatred and division. Especially because there is no sustainable nation in that agenda either.

This does not mean, of course, that the major internet platform managers have learned much of anything about the free speech folly they have perpetuated for the last two decades. Silicon Valley may appear to have located its moral compass last week (because it happened to be sitting on top of its wallet), but the rhetoric these companies maintain suggests they still do not understand how their platforms have profoundly blurred the lines between speech and conduct. Technology reporter Julia Carrie Wong, in an article for The Guardian published July 2, writes this about Facebook and Charlottesville:

“[Heather] Heyer’s killer has been convicted and sent to prison, but how does Facebook evaluate its role in the event? Does the calculation change at all when you consider just a few weeks before Charlottesville, I sent Facebook a spreadsheet with links to 175 neo-Nazi, white nationalist and neo-Confederate hate groups that were using its platform to recruit and organize? And that Facebook had declined to take any action against the vast majority of them until after Heyer’s murder, when it belatedly cleaned house?”

For her efforts as a journalist (remember journalists?), Wong was of course targeted on the same social platforms, which were weaponized by the same people she had exposed to Facebook. As she very courageously describes …

“The neo-Nazis and white nationalists I had written about published articles with my photograph that described me as a ‘racial molotov cocktail’ with ‘the cunning of the Jew and the meticulous mathematical mind of a Chink’. They encouraged their followers to go after me too, and I received a steady stream of racist vitriol on Twitter, on Facebook and by email. I tried to ignore it as much as I could. I tried not to ruin Thanksgiving. The worst were the messages that referenced my family, or imagined my rape.”

For as long as I have been writing about these issues (since 2011), descriptions of harassment like Wong’s have elicited either an eye-rolling mansplanation as to why we should not take these things so seriously or an insincere empathy that boils down to “That’s the price we pay for free speech.” Bullshit.

As long as the major platforms are being financially pressured to shed toxic material from their sites, they should take the opportunity to drop all the “conflicting values” rhetoric while they’re at it. Nobody asked these constitutional dilettantes to be stewards of the speech right. It was arrogant of them to presume to play the role of public guardians of civil liberties, especially while providing resources to opponents of those same liberties. They run advertising platforms. And they have no reason to equivocate about, or apologize for, taking out the garbage.

David Newhoff
David is an author, communications professional, and copyright advocate. After more than 20 years providing creative services and consulting in corporate communications, he shifted his attention to law and policy, beginning with advocacy of copyright and the value of creative professionals to America’s economy, core principles, and culture.