Really, DON’T Believe Anything You See on the Internet
When that cliché first entered our consciousness, it wasn’t really fair. The internet between the mid-90s and the mid-aughts wasn’t what it is today. It actually was just a dumb pipe through which content could be delivered from creator to consumer in a new way. It was silly to imply that one should not believe a news story published by the Washington Post just because it was on a screen instead of paper — and that principle still holds true for most professional journalism.
But now, every legitimate news source swims in the same stream with all the garbage—from raw clickbait to lazy aggregators to hackers purposely trying to exploit underlying divisions in democracies—and the tools of manipulation are so sophisticated that many of the manipulators themselves don’t have to be. With a little practice using software that anybody can steal, a kid can create a video that makes it look like Hillary Clinton said that “all veterans are pussies,” and…well, here we are.
“One of the things I did not understand was that these systems can be used to manipulate public opinion in ways that are quite inconsistent with what we think of as democracy.”
That’s what Alphabet (Google’s parent company) Executive Chairman Eric Schmidt said, as recently quoted in an article on Fast Company. And in keeping with the theme of this post, I don’t know what to believe. Were Schmidt and the rest of the leadership at Google honestly so drunk on their own utopian rhetoric about how wonderful their systems are that they failed to imagine — to say nothing of observe — how their products could be toxic for democracy? Or did they recognize it and not care until they were forced to care amid the fallout from the investigations into Russian meddling?
Facebook’s founding president Sean Parker — who was also the co-founder of Napster — told Mike Allen of Axios in a recent interview that Facebook was designed to “exploit a vulnerability in human psychology” in order to keep people on the site as much as possible. Parker told Allen that the creators of Facebook understood what they were doing and did it anyway, though perhaps they did not quite imagine what the results would be when a billion people voluntarily spend hours in Zuckerberg’s ant farm. “…it literally changes your relationship with society, with each other … It probably interferes with productivity in weird ways. God only knows what it’s doing to our children’s brains.”
How much has changed in such a very short time. It seems like only yesterday the cheerleaders of Silicon Valley, with all the confidence of Camel-smoking doctors, kept telling us just how good their products were for democracy and for society overall. All this goodness was packaged into a single generic word, “innovation,” and anything that stood in the way of innovation — like maybe the rule of law — was bad. Now, all of a sudden, we hear a lot of “Wow, we had no idea our systems could be used to totally fuck up the world! We’ll get some people on that right away!”
Of course, these companies either will not or cannot fully address the underlying reasons why their systems can be so toxic; and Parker put his finger on it when he admitted that Facebook was designed to take advantage of human folly. Facebook may clean up its act in certain regards—I actually believe Zuckerberg wants to—and Congress may enforce some effective regulations upon these platforms; but none of this will address the flaw in human nature that makes us more susceptible to emotional triggers than we are to reason and information. That’s why the underlying promise of the information age—that information can only have a moderating effect on discourse and interaction—is proving to be untrue.
There’s something fundamentally harmful about taking complex topics and issues and transforming them all into advertising, but that’s essentially what a platform like Facebook or Twitter does. “The sad truth is that Facebook and Alphabet have behaved irresponsibly in the pursuit of massive profits,” writes Roger McNamee for The Guardian. “They have consciously combined persuasive techniques developed by propagandists and the gambling industry with technology in ways that threaten public health and democracy. The issue, however, is not social networking or search. It is advertising business models.”
McNamee, who is identified as an early investor in Google and Facebook, describes how the advertising revenue models of these platforms drive, for instance, Facebook to deliver content based on user preferences, creating feedback loops called “filter bubbles.” People have been writing about the filter-bubble problem for several years now, but I suspect the problem is far too subtle to expect that the platforms themselves, with or without legislative mandates, will solve it.
Amid the recent flurry of allegations of sexual assault, satirical posts have appeared on Facebook with photos of Tom Hanks and leads saying, “Dozens of women come forward to…” And then, you click on the story, and it completes, “…say that Tom Hanks is a really nice guy.” Variations on this gag appear all the time, like the reports that Keith Richards is still alive. But you can bet the beer money that any number of people just scrolling through a feed on their phones, perhaps waiting in the supermarket line right next to the old-school tabloids, will come away with the impression that indeed Tom Hanks was implicated in some sexual abuse claim. Then, the rumor gets repeated to a friend, and that’s more or less the state of “information” in the digital age. It’s the National Enquirer at “Google scale.”
According to David Roberts, writing for Vox, America is in the middle of an epistemic crisis: many citizens are beyond the problem of separating fact from fiction and are instead living in a world in which facts simply don’t matter. It is a mindset he calls “tribal epistemology — the systemic conflation of what is true with what is good for the tribe.”
For the time being, analysis of the online media universe reveals this problem is more prevalent on the political right (see the continued support for Roy Moore even if he did assault a teenager), but the political left is hardly immune to this kind of tribalism. In fact, this blog was inspired five years ago when I witnessed this exact behavior among left-leaning friends, who were willing to share false information because it supported the outcome they believed to be right. So, although it is somewhat encouraging that this year marks the turning point when internet platforms will no longer be given a free pass — either by lawmakers or the public — to simply do what they want “for the greater good,” that hardly addresses how we individually and collectively will learn to cope with “God knows what’s happening to our brains,” as Parker puts it.
© 2017, David Newhoff. All rights reserved.