Facebook and Big Tech’s “Big Tobacco” Moment

Frances Haugen testifies before the Senate Committee on Commerce, Science, & Transportation’s Subcommittee on Consumer Protection, Product Safety, and Data Security, October 5, 2021.

In response to the breaking news on Sunday that Facebook’s latest, and perhaps most consequential, leaker had identified herself as former employee Frances Haugen, the familiar questions are being asked once again: How much do we blame Facebook, and for what shall it be blamed? For instance, in response to the allegation that the social platform played a role in the insurrection of January 6—both as an amplifier of disinformation and as a communications hub for some of the premeditated actions of that day—spokesperson Nick Clegg responded that it is “ludicrous” to blame Facebook. “The responsibility for the violence of Jan. 6 lies squarely with the people who inflicted the violence and those who encouraged them, including President Trump,” Clegg told CNN.

Clegg is dutifully knocking down a straw man, reframing the accusation as if Facebook were charged with direct responsibility for the assault on the Capitol. In reality, of course, the company is accused, most recently by Haugen, of either ignoring or obfuscating evidence that its operational decisions are conducive to terrible outcomes for both individuals and whole societies. The company has allegedly engaged in willful blindness with respect to its role in aggravating two kinds of suicidal tendency: the literal kind among teenagers harmed by Instagram, and the civic kind among adults so influenced by disinformation that they assaulted the constitutional order of the United States.

Haugen, who testified with tremendous poise on Tuesday before the Senate Commerce Committee, is a data scientist initially hired by Facebook as a member of the civic integrity team. She leaked tens of thousands of documents and stepped into the light, at considerable personal risk, with the intent to prove to legislators, federal agencies, and the public that when Facebook leadership is presented with evidence that its operational decisions cause harm, it will consistently choose profit over the mitigation of that harm. “Haugen has also detailed how she says Facebook quickly disbanded its civics integrity team—responsible for protecting the democratic process and tackling misinformation—after the 2020 U.S. election. Shortly afterward, came the Jan. 6 insurrection at the U.S. Capitol, in which organizers used Facebook to help plan,” writes Jaclyn Diaz for NPR.

That Facebook will behave like many other corporations (i.e., protect its bottom line) is not a revelation. At least, it shouldn’t be. Neither should there be any doubt that we are still wandering in uncharted territory when a private company needs a division to be “responsible for protecting the democratic process and tackling misinformation.” Haugen’s testimony that Facebook maintained such a unit for the shortest time possible is damning, but the fact that we have collectively and voluntarily ceded so much power to a social media company is the bigger problem. And many of the consequences of that transformation cannot wholly be fixed by “fixing” Facebook.

The committee members of both parties who questioned Haugen sounded unanimous in their intent to take legislative action soon, especially in response to evidence that Facebook is aggravating health risks to teens and tweens. Senators Blumenthal and Markey have already introduced the KIDS Act, which would proscribe the use of various “interface elements” that manipulate a minor’s experience on a given platform. In that sense of “fixing,” the Big Tobacco metaphor applies because we can associate Facebook’s lack of transparency with identifiable health risks like eating disorders and depression. Meanwhile, in terms of our collective mental health as a society, I am not sure why the same prohibitions should not exist for adult users, who likewise fail to recognize that social media is a narcotic—one that can produce good feelings even from very bad conduct.

Just yesterday, I saw that a woman whose work on constitutional issues I admire was harassed on Facebook by a stranger who did not engage her to debate the Second Amendment but merely to deploy his favorite sexist pejorative and tell her to kill herself. Even if the incident were reported, Facebook would be unlikely to cancel the guy’s account, especially when there are tens of millions of customers just like him. So, not only has the great “information revolution” failed to produce a more nuanced—let alone historically informed—discussion about 2A et al., but Facebook exacerbates the worst behaviors by providing users with the little dopamine hit that comes from self-righteous, remote-control harassment.

It was not very long ago that examples like this would elicit a big eyeroll from the bro-culture of what we used to call netizens—not only because the conduct was presumed to be anomalous, but because cyberspace was presumed to be innocuous. Just words rather than sticks and stones. That presumption was false. It was clear to many observers that the increase in anti-social and indecent conduct online was spilling over into the so-called real world. The boundary between clicks and sticks was steadily eroding and, as became clear on January 6, that boundary no longer exists at all for many of us.

Every time Zuckerberg or someone representing Google or Twitter or the EFF et al. has asserted free speech as the rationale for an unregulated, barely moderated internet, they have been making the argument, however unwittingly, that anarchy works. Let everything flow, and people will make rational choices, and the good will outweigh the bad. That was the prevailing argument before 2016 and the so-called techlash, and it is an argument that is still being revived despite all evidence that, as a social experiment, it has been a disaster.

Ms. Haugen’s testimony is compelling and will likely catalyze long-overdue change at Facebook and elsewhere in the industry. The most significant discussion to emerge this week may be the proposals, including one from former FCC Chairman Tom Wheeler, to create a new federal agency charged with oversight of major internet platforms. Whatever comes next, the era of laissez-faire for Big Tech appears to be over, and that is at least a step in the right direction.

David Newhoff
David is an author, communications professional, and copyright advocate. After more than 20 years providing creative services and consulting in corporate communications, he shifted his attention to law and policy, beginning with advocacy of copyright and the value of creative professionals to America’s economy, core principles, and culture.
