“I don’t know how that bong got in my sock drawer! And what’s a bong anyway?”
When was the last time you used the I Didn’t Know defense? Or, if you’re a parent, when was the last time you were confronted with it? Did it work? Not so much, right? But lately it often feels as though the soul of the IDK defense is gaining social clout as some of the darker realities of Web 2.0 collide with perfectly good laws written in the era of Web 1.0. Does the technology we expect to provide transparency simultaneously diminish the value of accountability?
Provisions in U.S. laws such as the Digital Millennium Copyright Act of 1998 and the Communications Decency Act of 1996 provide important safe harbors that protect site owners against civil litigation* for actions performed by third parties while using their sites. But when a site is a large enterprise with millions of pages and tens of millions of users around the world, and when it is ad-supported so that all traffic carries a profit motive, safe harbor becomes a rather complicated point of contention between site owners and any party harmed by activity on a site. The scale of the site, the number of users, the volume of UGC (user-generated content), and the safe harbor provisions all give a site owner considerable leverage when applying the IDK defense, even in instances where we might reasonably intuit that the owner knows precisely what’s happening on his site and more or less what certain activity is worth to him financially. While safe harbors are vital protections — I wouldn’t want to publish without them — the practical reality is that an owner can theoretically have it both ways: he can profit from illicit traffic by doing nothing to stop it even if he knows it’s there, and also claim to be “shocked gambling is going on in his establishment” if he is named in a suit or indictment.
The fundamental argument of ignorance is consistently the basis of defense for site owners ranging from piracy sites to Google to salacious gossip sites to the now-busted dark web drug-trafficking site Silk Road, whose alleged operator is on trial in New York. Prosecutors are charging founder Ross Ulbricht with being both mastermind and manager of a $1.2 billion criminal enterprise, and Ulbricht’s primary defenses appear to rest on I didn’t know and It wasn’t me, meaning that he denies being the man behind the avatar Dread Pirate Roberts, known to be the operator of the site. I won’t presume to comment on the particulars of the case other than to guess that with an alleged $80 million in Ulbricht’s account at the time of his arrest, if he’s not the guy, he’s probably got some ’splainin’ to do.
But it was actually the comments I read below one story about the Silk Road trial that spawned this post. Unconcerned with the gravity of unchecked criminal activity occurring on that anonymous marketplace — activity that reportedly included murder-for-hire schemes — the comment that got my attention was one that stated the only reason Silk Road was shut down and Ulbricht indicted is that the government didn’t like that it was a market over which it had no control. Maybe this is the rant of one naive kid who represents a tiny population of naive kids, but I have to wonder, because the spirit of that comment is just a slight variation on the themes of the Web as Wild West or the entrepreneurial zeitgeist of the industry with its imperative to “disrupt everything.” It all smacks just a bit of embracing anarchy, which is often confused with freedom, though it is in fact the fertile ground of feudalism or totalitarianism.
“The mob is the mother of tyrants.” – Diogenes
Again, I strongly believe safe harbors are essential, and I am not qualified to suggest any workable revisions, but I do think the larger notion of accountability may be losing value the more we live in a hybrid society between the real and the digital. As we connect to one another, we also diffuse responsibility for certain actions, which has the opposite of the intended effect of building those connections, because that diffusion actually places greater distance between our actions and those who may be harmed by them. Because the Web spreads responsibility for actions across thousands or millions of people, it often feeds exactly the opposite of the behaviors presumed to manifest in a self-governed environment.
We saw this with Reddit’s initial refusal and then reluctant agreement to shut down threads devoted to exploiting leaked nude photos of celebrities. The apparent logic among Reddit’s management and users was that because they did not steal the photos and the photos are now “out there,” nobody is really responsible for their distribution or exploitation; and since the images are tied to a news story, the photos are kinda like freedom of the press, right? In a non-web context, like TV news, we’d probably say that showing the photos would be a non-journalistic exploitation of these individuals in order to drive ratings. I’m not saying TV never does this, only that we seem to recognize it for what it is via that medium. Yet, when the medium is a web platform like the boards of Reddit, and the editorial decision to “broadcast” certain material is crowdsourced, somehow the moral assessment of that exploitative decision is skewed because the mob is now responsible, which means nobody is responsible.
And one question we should ask is what happens when illegal or harmful activities become more automated, when accountability is removed even further from the individuals to whom laws and judgments may apply? Think about it. Illegal or tortious actions can be committed by bots the same way junk email is delivered. Or maybe you go on vacation for a few days only to find out that your “smart devices” ordered up a few bottles of OxyContin, some assault rifles, and a fake ID. Sound absurd? Maybe, but . . .
Check out this story about a pair of Swiss programmers who created an art installation called The Darknet: From Memes to Onionland, which offered a display of items purchased by a bot shopping autonomously on a dark web marketplace akin to Silk Road. The concept was to give the bot a weekly allowance, see what it purchased of its own accord, and then display the items in a gallery setting. Interpret the statement as you will. I would personally defend the actions of these programmers as artistic expression, although the article cited does raise interesting questions as to who might be responsible for illegal items, such as the ecstasy and counterfeit passport purchased by the bot. In fact, the artist/programmers stated that they take full responsibility for this contraband, which is refreshing, and I certainly don’t think they should face any criminal penalties for possession. But their experiment suggests to me that even before we answer some of the tricky challenges posed by safe harbor provisions, the IDK defense is about to gain a new phrase: “Wasn’t me. The bot did it!”
*Changed from original publication, which erroneously referred to liability and criminal activity. Thanks to a friend for correcting the mistake.
© 2015, David Newhoff. All rights reserved.