Photo by orlaimagine
One of the first articles I ever published (for a magazine that no longer exists) was about cogeneration: the process whereby the waste heat produced by a power plant is captured and used to heat the same structures to which it supplies electricity. That was in 1997, just as the Kyoto Protocol was adopted and President Clinton prepared to sign it, to the consternation of most of the GOP. In fact, the Senate had already signaled, by unanimous resolution, that it would not ratify the treaty, on the usual grounds that emissions restrictions would harm the American economy. This was also nearly a decade before An Inconvenient Truth helped bring the subject of climate change into mainstream consciousness for many Americans.
Based on what I had learned about the efficiencies of cogenerating power plants and other technological means of reducing carbon output, I speculated in that article that emissions caps like those called for at Kyoto, while they serve an important purpose, might ultimately be surpassed by industry. Once enough companies realized that using less energy saves money, and that investment in new energy innovation could be profitable in itself, the private sector should, in theory, outperform the regulatory mandates of a negotiated treaty. And to an extent, this is what happened.
Many American corporations have realized the benefits of low-carbon investments, and they’re not going to reverse course just because the current president doesn’t believe in science. Reporting on this very topic has been among the silver-lining responses to Trump’s naive announcement that the U.S. would pull out of the Paris Agreement. But even if the domestic private sector continues to develop energy alternatives in spite of federal policy, one major problem with not having government as a partner in innovation is that the people then have little say about where technology leads or how it’s used. In the energy sector, this may not be a major concern; but when it comes to the effects of our data-driven society overall, that’s another story.
Among the fears that keep me up at night—and there are many of late—is that the current administration will create such a gaping policy vacuum, leaving so many Americans wanting, that we will then react by turning to the technologists to save us. And many of these folks are indeed brilliant in their own ways, but they don’t necessarily get democracy either. Perhaps the murmurings about Mark Zuckerberg running for president are just murmurings. Perhaps the exaggerated proposal in 2014 to anoint Eric Schmidt as “CEO of America” was meant as a provocative joke. But I find it more than plausible to imagine how we might slingshot around the gravity of this black hole administration to accelerate an already-latent desire to breed a technocracy.
After all, as much as I bash Silicon Valley for various reasons (especially how the internet giants have treated creative workers), there are certainly investors and geniuses out there solving tangible problems whether the nation’s political leaders think they should or not. They’re investing in renewable energy solutions and medical research and advancing real innovations beyond the side-show marketing platforms we generically call the internet. It is hard to ignore the Ayn Rand-like contrast between the valley of brilliant minds and the regressive whimsy of Covfefe, which I propose become the official name for the Trump doctrine.
The problem, as we know from the world of science fiction, occurs when the wizards, in the spirit of Rand, assume that we cannot live without them. And then they turn out to be right! When oligarchs own the machine of the world and we destroy the intermediary force of representative government, we get feudalism, albeit in technological form this time.
One of the biggest challenges we currently face is how we are going to address the progress of automation and the probability of a workless future for perhaps as much as 40% of the population within a decade or two. And while we are understandably distracted, either by supporting or denouncing efforts to rekindle “the greatness” of the United States of 1955, the AI challenges—economic, social, and moral—are “not even on our radar screen,” says Treasury Secretary Mnuchin, as reported in this article by Jamie Bartlett for The Guardian.
Theorizing that the current dominance of right-wing populism through data manipulation (i.e., propaganda) is merely the story of the moment, Bartlett writes, “Digital technology has helped the populist right for now, but it will soon swallow them up, along with many other political movements unable or unwilling to see how the world is changing.” What he’s referring to is crypto-anarchy, an ideology based on the premise that networked technology will obviate the need for governments or states; and this view is only slightly divergent from the brand of libertarianism espoused by many of Silicon Valley’s leading executives. The same theme can also be heard in the political views of more than a few progressives who seem to feel that government itself is an obsolete construct. Bartlett writes …
“It’s not a direct path, but digital technology tends to empower the individual at the expense of the state. Police forces complain they can’t keep up with new forms of online crime, partly because of the spread of freely available encryption tools. Information of all types – secrets, copyright, creative content, illegal images – is becoming increasingly difficult to contain and control. The rash of ransomware is certainly going to get worse, exposing the fragility of our always connected systems. (It’s easily available to buy on the dark net, a network of hidden websites that are difficult to censor and accessed with an anonymous web browser.) Who knows where this might end.”
That may sound like good news to the anarchic idealist, but there is not a single lesson in history where we find the collapse of government resulting in good times for most people. In fact, the benign anarchists would probably be among the first slaughtered in a world of marauding survivalists reacting to the breakdown of basic systems. Perhaps it wouldn’t go that way, but the proposal that technology alone can sustain billions of people, leaving us all at our leisure to write poetry and share selfies, seems to overlook one or two qualities of human nature and the post-Enlightenment rationale for constructing democratic states.
To me, a crypto-anarchist is a guy who insists on paying for a Red Bull with Bitcoin because he has no idea how the Red Bull got to the vendor in the first place. No question technology is a major part of that supply chain, but we forget the human element at our peril. As Bartlett notes in that article, the efficiencies gained by “Uberizing” multiple sectors of the economy come at the cost of labor rights, owing to the lack of accountability for the virtual “employer.” And when it comes down to brass tacks, civil rights are profoundly intertwined with labor rights.
One of the dangers of the present, divisive climate, driven by so much false information, is that Americans in particular will forget how fragile the Republic actually is. It’s just words on paper that we try to live up to, and that effort has produced some incredible results, particularly in the arts and sciences. The inherent brittleness of the American contract has historically been mitigated by the stability and economic security of a large middle class. So, if enough things break and our “nation turns its lonely eyes” to Google, what follows is hard to say, but I don’t think it will be democracy.