Cybercrime and Terrorism Sponsored by Your Candidate
If you were watching TV and a show came on called How to Hack Computers and Commit Credit Card Fraud with a lead commercial from Bank of America, you might think there’s something amiss. Like, where does the network get off airing a show specifically teaching people how to commit crimes? And did BofA really mean to be the sponsor? If not, they must be pretty pissed off at the network. And if they did mean to be the sponsor, we consumers should be pretty pissed off at the network and the sponsor, right? That’s how the world of media and advertising works. Except on YouTube.
Digital Citizens Alliance released a new report last month covering a familiar theme with an election-year twist. As the organization has reported in the past, advertisers who spend money to place ads on YouTube are essentially cheated out of some portion of their media buy when their ads appear in conjunction with videos selling or promoting criminal or terrorist activity. I and others have cited examples of mainstream American brands unwittingly sponsoring ISIS recruiting videos or clips teaching people how to deliver malware to steal identities and data. But this new report by DCA, called Fear, Loathing, and Jihad, calls attention to the fact that all of the current presidential campaigns are in one way or another sponsoring these criminal- or terrorist-produced videos. From the report:
“How does the Kasich campaign, whose credibility is based on fiscal aptitude and efficiency, feel about their ads showing up next to a video by those actively committing financial fraud?”
“Support from young voters is the main reason why Senator Bernie Sanders is able to challenge Hillary Clinton. Why would he want a campaign ad showing up next to a video demonstrating how to ‘slave’ the computer of a young male victim?”
Political ads are a variation on the larger theme of poor-quality placement that affects all advertisers in the digital market, but DCA is not wrong to point out the uniqueness of these dichotomous pairings when we see American presidential candidates effectively hosting videos calling for jihad or selling fake IDs and other contraband. Moreover, in several cases the candidate’s ad buy may actually be putting money into the pockets of the criminal video makers. So, it’s not farfetched to say that you can donate twenty bucks to your candidate and that money can end up in the pocket of some homegrown, would-be jihadist by way of Google AdSense and the YouTube Partner program. Unfortunately, it seems that Google is about as diligent in vetting YouTube Partners to participate in ad revenue sharing as it is in mitigating copyright infringement on its platforms.
According to Google’s own Terms and Conditions, a prospective Partner must upload “advertiser friendly content”, and here’s what the company says might be considered unfriendly:
Content includes, but is not limited to:
• Sexually suggestive content, including partial nudity and sexual humor
• Violence, including display of serious injury and events related to violent extremism
• Inappropriate language, including harassment, profanity and vulgar language
• Promotion of drugs and regulated substances, including selling, use and abuse of such items
• Controversial or sensitive subjects and events, including subjects related to war, political conflicts, natural disasters and tragedies, even if graphic imagery is not shown
Now, my own read of those conditions would want to see them applied with considerable latitude, given that plenty of high-quality satire, news reporting, and entertainment is likely to implicate any number of those descriptions. But if Google is not able to, for instance, distinguish the combat-related humor in videos made by the veterans group Ranger Up from an ISIL recruiting video—or from a video made by some jerk showing people how to invade a girl’s privacy through her computer—then maybe those conditions are really not conditions so much as they’re just a bunch of words Google universally ignores.
DCA states that when their reports and the news media have brought attention in the past to this same issue, YouTube has made an effort to remove ads from many offending videos, but the report also implies that this type of action is a band-aid in response to momentary pressure. Just like infringing material is restored as fast as it is taken down, ads continue to be linked to videos that no brand—let alone any political candidate—would choose to sponsor.
Although advertisers do have a measure of control in setting parameters to properly target their ads, the automated nature of the system is nothing like the control advertisers have with traditional media buys. As the report states, “Let’s be clear: Google is not giving advertisers the opportunity to veto undesirable videos, but to opt-in and minimize the possibilities of ads showing up in undesirable places.” As we see in the context of rights holders and the DMCA, Google’s own financial incentive gives it grounds to play ignorant and incapable and to shift the burden to everyone else. Again, to quote the report, “Right now, the best thing you [campaign operative] can do is report the videos to YouTube, which may pull these videos down. Google has deputized all of us to do the work it can’t…or won’t.”
Speaking of incentive, why the leadership of Google does not display the basic human decency or corporate responsibility to delete these videos as clear abuses of their service is inexplicable beyond basic greed. Because let’s be grown-ups: free speech doesn’t even enter this conversation. Free speech does not protect criminal activity, incitement to violence, or training in the commission of crimes; and it sure as hell does not protect the video productions of violent extremists whose agenda fundamentally betrays the natural rights philosophy upon which free speech is predicated. And more prosaically, any private company is within its rights to provide or not provide content based on its own internal judgments without violating free speech. But there’s the rub.
It seems that YouTube is in sort of a logical pickle, trapped between its safe harbor status from liabilities like copyright infringement and what could become a growing demand to guarantee quality impressions to the advertisers who pay all of the company’s bills. In order to avoid liability for the millions of user-caused copyright infringements on the platform, YouTube has to maintain that it is blind to the content on its servers prior to a specific notification. Meanwhile, the advertisers (and frankly the public) would be better served if YouTube were to make a serious effort to remove videos that are clearly dedicated to promoting or abetting the commission of crimes and acts of terrorism. But the more YouTube exerts this kind of editorial control, the thinner their veil of ignorance becomes, which can then expose the company to liability for copyright infringement and other abuses of its platform. Meanwhile, as the monopolistic YouTube hovers in this limbo raking in millions, the advertisers, rights holders, and public are not well served.
The DCA report states that this year the presidential campaigns will spend $1 billion in digital advertising, with Google, Facebook, and Twitter receiving most of that revenue. For perspective, the report explains that if Google takes the same percentage of that billion as it made from all digital US advertising in 2015, it will earn $387 million from campaign spending alone. Meanwhile, the company that claims to provide the tools of political transparency to the public is anything but transparent on this matter according to the report. “We have no idea how much Google and YouTube make from videos marketing illegal or illicit activities,” the report states. “Google has fought back against elected officials and regulators who’ve asked questions about the money. So far, the company has been successful at keeping its numbers a secret.” Maybe the point at which political campaign dollars are being split 45/55 between Google and terrorists is the moment when federal regulators decide to get serious.
© 2016, David Newhoff. All rights reserved.