One Movie Settled the “Debate” About Climate Change

When I saw the theme of this year’s World IP Day, “Innovate for a Green Future,” I will admit that it was hard not to be cynical. In light of the reinvigorated political assault on science—let alone to be thinking about climate change in the middle of a pandemic—it is tempting to believe that the debate about global warming still rages—or has even been lost. But that’s not quite true. The debate was settled a very long time ago. Or to be more precise, there is no such thing as a debate about scientific evidence; there is only understanding, misunderstanding, willful ignorance, or malignant deception.

While it is stultifying to see that a truly vindictive brand of ignorance and deception is the cornerstone of the present administration, there remains one avenue of hope for at least mitigating—because it is almost certainly too late to reverse—the effects of global warming. Oddly enough, that avenue of hope has more to do with market dynamics than environmentalism per se, and I would assert that it was a single documentary film that opened the window to a market-based transformation, which, even now, represents a path forward. I am of course talking about An Inconvenient Truth.

It was an unlikely movie pitch: the centerpiece of the documentary is former Vice President Al Gore presenting his climate change “slide show,” which he had developed over several years after he was first introduced to the science in college in 1966. Not long after conceding the painfully contested presidential election in late 2000, Gore devoted himself fully to the climate issue, taking his laptop and talking points on the road, offering free admission to anyone willing to listen to him discuss the fate of the planet.

“The slides were originally black and white,” says the film’s co-producer Lawrence Bender, whom I interviewed for this article. “They weren’t visually appealing, but they were almost scarier, like something you’d see in a science lab, when we first saw Al’s presentation in Los Angeles.” Bender and others who would eventually join the production team were invited by producer Laurie David (now Lennard), who had arranged for Gore to come to L.A. after she found herself captivated by his lecture in New York in May 2004.

“Gore’s show left us with a sense of urgency about the issue,” says Bender. “We knew we had to make what we had seen into a movie, but it was not easy to convince many people in the business that it was a movie. Try telling someone you’ve got former VP Al Gore, who lost the election, doing a slide show about science, and that you need a million dollars.” Enter Jeff Skoll, who founded Participant Media in 2004 with the fortune he had made as eBay’s first employee and first president. “Jeff financed the whole production without blinking an eye,” Bender tells me.

Less than a year after that initial presentation in Los Angeles, An Inconvenient Truth was ready for the screen. It became an international blockbuster (for a doc), earning two Academy Awards, one for Best Documentary Feature, the other for Best Original Song, “I Need to Wake Up” by Melissa Etheridge. And for any cynics who may be tempted to criticize the movie as a vanity project—Hollywood glamor with little substantive effect—I would direct your attention back to the 1990s and early 2000s.

Waking Up Tens of Millions

Hurricane Katrina. August 28, 2005. NASA

When the Kyoto Protocol was adopted in 1997, calling for a modest 5% reduction in greenhouse gases by developed nations, global warming was not an especially bright blip on the public radar screen. General perception, such as it was, loosely divided along the left/right political lines that are usually drawn through environmental issues; but overall, the average citizen (and quite a few politicians in both parties) could be described as somewhere between ambivalent and unsure about the alleged causes or effects of a warming climate.*

It probably did not help that this was the same period when we all first logged onto the internet, which would prove to be a wonderful tool for obtaining information and disinformation at the same time. And to be sure, the extractive industries, and other vested interests bound to fossil fuels, were eager to provide erudite-sounding counter-narratives to the mountain of evidence proving that human activity was in fact changing the climate in dangerous ways. Then, on January 24, 2006, An Inconvenient Truth debuted at the Sundance Film Festival.

Directed by Davis Guggenheim, the film’s most effective quality, in my view, was that it reintroduced the purportedly “wooden” politician Al Gore as a relatable, flesh-and-blood human being, whose humor and humility rescue the didactic lecture from becoming either dry or a ninety-minute scold. Upgrading Gore’s visual aids to high-resolution slides using Apple Keynote certainly provided enough color and scope to fill the big screen, but the critical element was Gore’s humanity.

“Davis was adamant that the film had to work emotionally,” says Bender. “It’s a deceptively simple movie, but we spent a lot of energy in post-production trying to find the right balance between this man’s personal journey and the science.” By interweaving Gore’s presentation with glimpses into his life story—anecdotes in which he admits his own frailties and errors—the overall result of the film was that it turned carbon dioxide into a kitchen-table issue. And that was the significance of An Inconvenient Truth.

Seemingly overnight, as a direct result of the movie’s success, concepts like “carbon footprint” entered mainstream conversation and classroom curricula across the U.S. and abroad. While the opposition was by no means silenced, the film awakened enough public consciousness that multiple business segments suddenly needed to respond to a new consumer demand to “go green.” 

Consumer Change Leads to Corporate Change

To be sure, not all business initiatives were substantive, but by and large, the mandate to promote green led to tangible and lasting changes in corporate culture and governance. Sustainability went from a crunchy, esoteric notion to a boardroom best practice, and this, in turn, spawned new investment in the development of alternative and more efficient energy solutions. “Practically every Fortune 500 company has a sustainability officer or sustainability program today, and that was not true fifteen years ago,” says my longtime friend Jeff Turrentine, a writer and editor for onEarth, the publication of the Natural Resources Defense Council.

An Inconvenient Truth was not the first conversation about the economics of sustainability, and Gore was hardly alone in asserting that carbon reduction, aside from being existentially mandatory, is compatible with economic growth. Many environmental experts, technology innovators, and political leaders (even bipartisan ones) had a solid grasp of the two uncontroversial facts about carbon mitigation: 1) that burning less fuel saves money and is, therefore, profitable; and 2) that green innovation represents a whole new sector of untapped economic opportunity.

That conversation had been taking place in various pockets of both the public and private sectors for at least a decade before An Inconvenient Truth was released. But the film gets credit for igniting those latent sensibilities in the minds of the general public and for spawning the aforementioned consumer demand for change. The movie was catalytic in fostering market conditions in which multiple industries and municipalities discovered what many environmentalists had tried to explain for years—that working to reduce greenhouse gas emissions happens to be good for business.

So, while the Trump administration has arrogantly stumbled backwards on environmental policy—evangelizing climate science denialism out of sheer spite—the green investments made by both the private and public sectors over the last decade and a half are unlikely to be reversed—especially when those investments are yielding positive returns. It is still not enough, but it is most likely where the best hope still remains. And perhaps there is no better example of this paradigm than the city of Georgetown, Texas, featured in An Inconvenient Sequel: Truth to Power (2017).

Mayor Dale Ross proudly tells Gore, on camera, that his city is powered by 90% renewable energy (at the time of filming), despite being “the reddest city in the reddest county in Texas.” Why? Because, to paraphrase Ross, it saves his constituents money, and because you don’t need to be a scientist to understand that less pollution in the air is a good thing. This is why I will argue that An Inconvenient Truth went beyond merely “raising awareness.” It directly created a public mandate that led to the kind of common-sense approach taken by Ross, who reminds us that there is nothing “conservative” about waste or higher prices.

The countless market effects that can be attributed to a single film—in which the information was neither new nor hard to grasp—remind us that creative expression is essential. In a time when IP deniers argue that copyright functions solely as a barrier to information, the story of An Inconvenient Truth belies the naïve, tech-utopian assumption that access to information alone is sufficient—not least when utter nonsense gallops across digital platforms like a fifth horse of the apocalypse. Facts alone do not speak meaningfully to people. Invariably, it takes creativity to inspire us, even when it comes to saving our own lives.


*It must be acknowledged that the climate issue had Republican champions in those days, and there is an extent to which Gore, as the most prominent messenger, became a more attractive political target after the 2008 election, when the GOP became more dependent on the fossil fuel industries.


Photos: “Al Gore” Lisbon, 2017. By G Holland.

“Earthrise” Apollo 8, December 24, 1968. NASA.

The Precarious Politics of Reining In Silicon Valley

As our attention turned to concerns about disinformation, hate speech, and data security after the 2016 election, it became clear that the big cyber policy on deck was going to be a fight about Section 230 of the Communications Decency Act (1996).  For some detailed discussion about this legislation, see posts here, here, and here; but in a nutshell, Section 230 shields online platforms against liability for potential harm that may result from the conduct of their users.  It is occasionally and improperly associated with copyright infringement, from which platforms are largely shielded by Section 512 of the DMCA (1998).

Although 230 was never intended to provide blanket immunity for all sites hosting any kind of user-generated content, most courts in the 24 years since the law was adopted have interpreted it as exactly that.  This includes content that may be posted for the express purpose of causing harm, like harassment, defamation, revenge porn, fraud, or disinformation.  230 is the statutory reason why site owners respond with a shrug or, at best, a feeble explanation for hosting material that goes beyond mere offense—material whose power to alter truth itself we have now seen.  If you were mystified, for instance, by Zuckerberg’s sphinx-like reasoning that Facebook would maintain Holocaust denial pages because they are merely “misinformation and opinion” rather than “hate speech,” that was just one manifestation of the ideological flaw that helped write Section 230 two decades ago.

“We were naïve. We were naïve in a way that is even hard to recapture. We all thought that for people to be able to publish what they want would so enhance democracy and so inspire humanity, that it would lead to a kind of flowering of creativity and the emergence of a kind of collective discovery of truth.”

Those are the words of former FCC Chairman Reed Hundt, lately expressing regret for the adoption of Section 230 and clearly identifying the erroneous underlying premise, which many critics now refer to as tech-utopianism.  And while it is somewhat encouraging to finally see a greater appetite for holding platforms accountable for some of their ill effects, this mood change is anything but clearly definable.  Instead, we hear a cacophony of disparate—even competing—rationales for reining in Big Tech, and if this chaos cannot manifest as rational policy, Big Tech may win the status quo, which it spares no expense trying to maintain.

For example, voices as incompatible as former Vice President Joe Biden and Senator Ted Cruz have both raised the specter of abolishing Section 230, but for very different reasons.  Biden and others see the liability shield as encouraging a platform like Facebook to continue hosting false information (e.g., Holocaust denial), while Cruz and other Republicans complain that social platforms are biased against conservatives.  But good luck trying to reckon with the devil in those details.

Would Biden include headlines or stories from left-leaning organizations that are inaccurate?  Would Cruz consider social media platforms removing Alex Jones, or the hosting providers dropping The Daily Stormer as examples of anti-conservative bias these days?  It becomes easy to imagine how a pragmatic and sober debate about Section 230 can get lost amid the inherent tribalism implied by just those two voices alone.

From a very different sector, David McCabe reports for the New York Times that a “motley” group of corporations, including Disney, IBM, and Marriott, are gunning for Section 230. “The companies’ motivations vary somewhat,” writes McCabe.  “Hollywood is concerned about copyright abuse, especially abroad, while Marriott would like to make it harder for Airbnb to fight local hotel laws. IBM wants consumer online services to be more responsible for the content on their sites.”

As prefaced above, note that even The New York Times will erroneously include copyright in a conversation about Section 230, though in fairness, the underlying principle—namely that no platform should ever be responsible for material published by users—is fundamentally the same in 230 as in the DMCA’s 512.  Still, it is safe to assume that, especially because the Times used “Mickey Mouse” in the headline, this story will be interpreted by many as “Copyright maximalist Walt Disney Company wants to break the internet again,” or something to that effect.  And voilà!  We are no longer having a conversation about platform responsibility.

In a similar vein, the Center for Democracy and Technology published an article on its site criticizing a proposal introduced by Sen. Graham to combat child sexual abuse material online; and the article and associated tweet exploit distrust of both Graham and Attorney General Barr as reasons to fear the proposal itself.  Sure, I personally think Sen. Graham is the most prominent wuss in America today; and Bill Barr is batshit crazy, spluttering his views that people without religion lack moral judgment, but …

I don’t trust the folks at CDT either, because they are ideologues too—OG tech-utopians who just happen to receive significant funding from Google.  (That, and I am very much opposed to child sexual abuse material.)  So, whether the harm that needs addressing is child exploitation, revenge porn, online harassment, or mass disinformation campaigns, if we want to cope with any of these still somewhat novel challenges, we just might have to entertain the possibility that a sound policy proposal will come from some party we do not like in a different political context.

The subtle irony in this last example, of course, is that the folks at CDT would probably never entertain the notion that blanket platform immunity has been a major catalyst in creating the alternate realities that people like Graham and Barr now occupy.  That’s not a partisan view—Senator Wyden is probably Big Tech’s greatest ally in Congress, and I unequivocally called him a liar with regard to the CASE Act—it’s the view of someone who, like many Americans, is weary of policy discussions in which outright bullshit is given equal weight to evidence-based theory and practice.  And with respect to Reed Hundt’s observation, this was an inevitable consequence of giving every citizen a megaphone; but platform immunity like Section 230 is the reason Zuckerberg will call outright bullshit like Holocaust denial an “opinion.”

Promoting Progress in the Digital Age


Over the past three years since the internet industry first had to respond to the so-called “Techlash,” various comments on the theme that “the internet didn’t turn out like we expected” have generally shared one common flaw—a failure to acknowledge that the expectation itself was folly.  Whether parties are debating the amount of moderation that should or should not be done by a platform like Facebook; or whether breaking up the internet giants to foster competition would ameliorate the negative effects; or whether curtailing liability shields and treating platforms like publishers would do the trick, the big lightbulb that has not dimmed nearly enough is the original assumption that more people expressing, sharing, posting more stuff could only benefit the world.  All evidence points to the contrary.

When I started this blog in the Summer of 2012, I was partly motivated to advocate artists’ rights (copyrights) against the agenda of Silicon Valley, but I was also skeptical that the underlying assumption justifying the abrogation of those rights—that the information age was fulfilling its promise—was true in any meaningful way.  I asked at the outset whether the internet, as it was shaped since the 90s, was in fact empowering our better angels and ushering in a second Enlightenment grounded in science; or whether it was more effectively aggravating our worst instincts and undermining the pillars of republican democracy.  

In this context, I use the word science in its broadest sense to encompass the principle of a politics rooted in knowledge and reason, and this expansive reading is roughly how we have interpreted Madison and Pinckney’s use of the word science in writing the constitutional clause that gave Congress the authority to adopt copyright law.  This is why the tech-utopian assumption that the internet would bring about the aforementioned second Enlightenment is directly tied to the anti-copyright agenda. 

What authors of works see as the protection of their rights, the digital-age copyright critics characterize as barriers to access, rent-seeking mechanisms, and corporate gatekeeping, all of which result in what they call “artificial scarcity” of expressive and informative works.  Hence the critics’ logic that “free” digital distribution inherently abridges—if it does not simply obliterate—the original purpose of adopting copyright as an incentive to produce and distribute works of science.

Bizarrely, this utopian narrative persists despite the fact that the United States has now arrived at an existential crossroads.  Mired in what some observers have gravely termed a “cold civil war,” we are officially a nation divided and sub-divided into separate realities; and relatedly, our so-called “age of information” is witnessing an unprecedented volume of brain-drain at the highest levels of government and public service.  While the owners of the major platforms double down on their idealistic talking points, the real world increasingly resembles the worst corners of cyberspace, complete with mob-like assaults on expertise, professionalism, and patriotism for the sake of what can only be described as the cult of Trump.  

In the space of two years, the Republican Party has abandoned its own core principles, sloughing off actual conservatives, and even going so far as to faithlessly attack the characters of career service professionals who have risked their lives for American interests.  And all because they are afraid of being the targets of a presidential tweet.  “We shall nobly save, or meanly lose, the last best hope of earth,” Lincoln wrote to Congress in 1862.  So, is it really conceivable that a century and a half since the Civil War, the party that used to call itself “the party of Lincoln” will allow the Republic to falter because an illiterate mean-girl wearing a tinfoil crown has a Twitter account?  Talk about going out with a whimper.

It is unavoidable at present to blame the GOP for this particular moment of history-altering fecklessness, but it is also worth remembering that, thanks in no small part to social media, my friends on the left helped loosen the bolts on many of the same girders this administration is now dismantling.  It may be shocking to watch Members of Congress disrespect public servants like Lt. Col. Vindman, Dr. Hill, or Ambassador Taylor, but it was not very long ago (2014) that, for example, Naval War College professor Tom Nichols wrote for the decidedly conservative Federalist, “I fear we are witnessing the ‘death of expertise’: a Google-fueled, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all.”

To a great extent, Nichols seemed to be addressing a progressive-leaning constituency of netizens who, just like many latent Trump supporters, dismissed authority, expertise, and experience as “elitist.”  And they still do.  So let’s not pretend the GOP is alone in amplifying and weaponizing internet conspiracy theories like the “deep state.”  Mainstream media, the intelligence community, the military—even the U.S. Copyright Office!—have all been generically maligned as “the government” by disparate constituencies—as if the government did not already comprise thousands of people just like Vindman, Hill, and Taylor. 

Meanwhile, all that ebullient swooning a few years ago over data-dumpers like Assange, spraying their cans of sunlight, was naïvely perceived as leaking truth to power.  But what that illusion of access really achieved was an erosion of faith in the same professionals now having their patriotism questioned for political gain.  Likewise, bloviators like Reps. Jordan and Nunes may be the most prominent figures calling the mainstream media “puppets” and “enemies,” but let’s be real: the word mainstream as a pejorative has been used across the political spectrum to justify dismissing any career journalist who reports something that some constituency doesn’t want to hear.

Suffice it to say, the battlefield was well softened for armies of disinformation trolls to start what former State Department official Richard Stengel calls a full-scale information war we are not winning:

“Governments, nonstate actors and terrorists are creating their own narratives that have nothing to do with reality,” Stengel writes. “These false narratives undermine our democracy and the ability of free people to make intelligent choices. The disinformationists are aided by the big-platform companies who benefit as much from the sharing of the false as from the true. The bad guys use all the same behavioral and information tools supplied by Facebook, Google and Twitter. Just as Nike buys your information to sell you sneakers, the Russians bought your information to persuade you that America is a mess.”

We have dutifully fulfilled the trolls’ prophecy—because America is certainly a mess now—and it is a pretty harsh referendum on the information age to watch the GOP respond to clear evidence that the President of the United States abused his office with a combination of internet conspiracy theory and the eccentric proposal that Trump is too incompetent to break the law (see Sen. Graham’s comments).  That’s one hell of a rationale to pitch to the American people about their president, but it is astoundingly effective thanks to the “democratization of information.”

So, no, the second Enlightenment did not happen. Science is now a choose-your-own-adventure game you can play on your mobile device, and the “illusion of agency”* provided by social media is being moderated by some over-caffeinated, professional rat-fucker in St. Petersburg.  All that being the case, perhaps the tech-industry activists who still insist that copyright is a gremlin sabotaging the promise of the internet, might find some better targets for their censure than the authors and artists of the world.


*All credit to Neil Turkewitz for this expression.

Unicorn illustration by julos.