In a recent post on Techdirt, Mike Masnick calls columnist Nicholas Kristof a hypocrite based on a narrative Masnick just plain made up. On December 12, Kristof published a brief column in The New York Times with a picture of a 12-year-old girl who is starving to death as a victim of the U.S.-backed Saudi war in Yemen. The girl is naked but for a diaper and a bandage on her foot, and the image of her skeletal, wasting body is truly humbling, which is why Kristof says he devoted so much of the page to the image itself.
After the story was posted on Facebook, the social platform apparently kept deleting the photograph, which prompted the following tweet from Kristof on December 16:
Facebook seems to have repeatedly blocked the photo of Abrar that went with my column: Come on, Facebook! If you want to end these horrifying images of starving children in Yemen, then help end the U.S.-backed Saudi war that causes the starvation.
Kristof’s complaint was then seized upon by Masnick, who concocted a typically sarcastic “gotcha” on the premise that because Kristof backed the anti-sex-trafficking legislation known as FOSTA (Fight Online Sex Trafficking Act), he has no right to “whine” about Facebook removing this photo for its “sexual content.” True to form, Masnick smugly alleges that Kristof knew nothing about how FOSTA works, even as Masnick himself grotesquely misrepresents both the law and the nature of Facebook moderation in his post.
Without even getting into FOSTA, anyone who has been on Facebook for the past decade or so knows that the platform has often removed images (even fine art) that some moderator believed violated its “community standards.” Facebook has been making these often-laughable mistakes since long before anyone introduced the legislation that became FOSTA, which passed into law in April of this year. In fact, Masnick’s recent post cites one of his own posts from 2016 criticizing Facebook for censoring the iconic, Pulitzer Prize-winning photo of the naked Vietnamese girl running from a napalm strike.
Notably, the removal of that famous photograph was actually mentioned in the documentary The Cleaners, which I wrote about in November, and which profiles the Philippines-based moderators to whom Facebook has outsourced most, if not all, of its “community standards” oversight. The documentary reveals a melange of human fallibility in the decision-making behind content moderation, and Kristof’s photo might have been repeatedly removed for being “disturbing” rather than “sexual.”
Regardless, the broader point is that these young moderators, who are required to meet quotas, process millions of images a day. Their culture is not grounded in American principles of speech, press, and the like, and it is almost impossible to generalize about their motivations and judgment calls.
At the same time, even if, in the most depraved imagination, someone could identify Kristof’s photo of this poor child as “sexual,” then it would simply violate child pornography laws, which predate FOSTA, predate Facebook, and even predate the birth of Mark Zuckerberg. Yet somehow The New York Times published the image, and nobody seems to have confused it with pornographic exploitation. All of which is to say that neither the Facebook moderation regime nor Kristof’s specific complaint about the photo, which no sane person could mistake for “sexual,” has anything to do with FOSTA.
As explained in several posts, what FOSTA does is affirm that no internet service provider is automatically immunized against criminal or civil allegations of contributing to sex-trafficking. FOSTA does not mean that a plaintiff who brings a claim has any less burden to prove a platform’s culpability in that crime. (Y’know, the way the law works.) In fact, all one needs to do is look at the volume and nature of the evidence gathered against Backpage to see that proving a contributory role in sex-trafficking takes a hell of a lot more than hosting some “nudity.”
While it is possible that, in an abundance of caution after passage of FOSTA, attorneys at Facebook have recommended simply removing anything that can even remotely be deemed “sexual,” it is also evident that the platform was generally doing this long before FOSTA. Next, the platform will, and should, remove material that is patently child pornography. And finally, the attorneys at Facebook are well aware that proving the platform “contributed to sex-trafficking” by hosting a given piece of content is a distinct and high bar for a would-be plaintiff to meet.
So, it is a leap and a half to allege that platforms are now over-censoring as a result of FOSTA, to say nothing of the fact that Facebook has far bigger content moderation problems right now. In this regard, I think the folks at Techdirt, and everyone else, ought to be more concerned that Facebook cannot seem to distinguish between a third party like The New York Times and just some other account holder.
It ought to be a simple enough internal practice to determine that if a mainstream news company, which is also not immunized against allegations of illegal conduct, can publish an image without legal jeopardy, then Facebook can safely host the same image. Why this does not appear to be the case has everything to do with the platform’s overall management and nothing to do with FOSTA.
I’ll leave it to the reader’s judgment to consider Masnick’s labeling of Kristof as having a “savior complex” for his interest in starving children and trafficking victims. But given the choice between a guy who wants to save kids and a guy who wants to save legal liability shields for mega-corporations, well, let’s just say Mike may not make the Nice list this Christmas.