NYT tech editor Jeong sticking copyright criticism where it doesn’t belong.

Holy whiplash segues, Batman.  There I was reading a perfectly interesting article by Sarah Jeong on the potential hazards of selling one’s personal data, when she took an incomprehensible—if mercifully brief—detour into the realm of copyright law.  She presents a reasonable enough case that the companies now offering to help us “broker” our private data (e.g. health information) may be counting on the fact that “There’s no legal property right to personal data.  Once personal data is gathered, it’s out there for anyone to buy and sell. At the moment, there are no legal grounds to demand compensation for use,” Jeong writes.

Fair enough.  It is certainly true that the whole prospect of selling private data, even if it were a good idea, does implicate a relatively novel legal framework.  And while I am personally inclined to agree with Jeong that the whole notion is fraught with hazards, I am at a loss to understand where she is going with this interjection …

“In any case, we already know what happens when property rights get slapped on information, because we’ve already done it, to some degree, in copyright law. 

Giving people ownership of their creative expressions means they can buy and sell them on the open market. The risk is that an artist will wind up, like Taylor Swift, alienated from her own work because she no longer possesses the masters of some of her earlier recordings.”

Swift in late June stated publicly that she was very disappointed to learn that mega-star manager Scooter Braun will be acquiring Big Machine Label Group, which still owns her master recordings dating back to the start of her career.  Swift calls the prospect of being under contract to Braun her “worst nightmare,” and for the sake of this post, we will take her word that he is an “incessant manipulative bully” because digging into that backstory could not matter less to Jeong’s ham-fisted allusion to the supposed problem with copyright.  

Even more bizarrely, Jeong happened to pick an artist who has adamantly defended both her own rights and those of much smaller artists, and who told Rolling Stone in 2014, “Important, rare things are valuable. Valuable things should be paid for. It’s my opinion that music should not be free, and my prediction is that individual artists and their labels will someday decide what an album’s price point is. I hope they don’t underestimate themselves or undervalue their art.”  So, I’m just spitballing here, but maybe Swift did not recently do an about-face on the purpose of copyright, or even abandon all prospect of working with labels, so much as she was just saying she really does not like Scooter Braun.  

Turning to Jeong’s implications about the nature of copyright, it is clear that she should refrain from the topic altogether.  For one thing, copyright does not “slap property rights onto information.”  Quite the contrary.  There is in fact a long history of statutory development and case law making it very clear that information is not the subject of copyright.  Expression is the subject of copyright, but the way Jeong slaps these two sentences together makes it seem as though information and expression are the same thing—especially in the context of an editorial that is all about data, which bears no resemblance to expression.  

At that point, I guess what Jeong is trying to say is that if we can own and sell our data, then, like Taylor Swift and her masters, we could wind up very unhappy about the party that buys the data.  I think that disappointment is almost a guarantee and that we should be shoring up statutes against privacy invasion rather than looking for ways to market our DNA profiles and whatnot.  But, that said, what in blazes does the unprecedented challenge of mass data collection and its privacy implications have to do with roughly three centuries (though I would argue more) of constructing a legal framework for authorial rights?  Not a damn thing.

Interestingly enough, the paper written by Samuel Warren and Louis Brandeis in 1890, which is widely considered the seminal American work articulating a right of privacy, actually turns to copyright law as a starting point.  Because there is no constitutional declaration of a right to privacy, Warren and Brandeis begin with the already long pedigree of copyright in unpublished works when they write, “From corporeal property arose the incorporeal rights issuing out of it; and then there opened the wide realm of intangible property, in the products and processes of the mind.”  

Not only do most people, and certainly most creators, still feel that the products of the mind are a form of personal property, but this was the exact point of reference chosen by a pair of legal lions to make the case that a right of privacy actually exists.  Consequently, Jeong might want to consider the possibility that copyright law provides guidance for the protection of our personal data rather than a warning of what can happen if we become the “owners” of that data.  Or, if we’re looking for warning signs in historic property-rights regimes, my friend Neil Turkewitz observes:

“If property rights are the model, then Silicon Valley’s dismal track record on intellectual property rights is a giant red flag that simply vesting property rights is of little consequence to the extent that such property rights are essentially unenforceable — particularly for individuals. Since the dawn of the internet, notwithstanding their legal rights, creators and innovators have had to endure an avalanche of illegally available copies of their works online.”

So, maybe, as Warren and Brandeis noted, copyright does have something to teach us about privacy that is quite different from Jeong’s misguided assumptions. But what do I know?  I’m just spitballing.