Once again, the question arises: is there any hope of addressing mass online copyright infringement on otherwise legal platforms? It’s an exhausting problem, more than two decades old, and it isn’t getting better. A recent article by Annie Levin for Observer describes a new campaign by the Music Workers Alliance (MWA), and Levin sums up the heart of the problem thus:
Because of sites like YouTube, where all music can be accessed for free, streaming services like Spotify can get away with paying musicians a starvation average wage of $0.0038 per stream. A musician must have their song streamed almost half a million times a month to make minimum wage. Far from making a profit, musicians often end up in debt after making an album.
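For illustration, a rough back-of-envelope bears out that figure, assuming (my assumption, not Levin’s) a full-time month at roughly a $10-per-hour minimum wage; the applicable rate varies by state:

$$
\text{streams needed} \;\approx\; \frac{\$10/\text{hr} \times 40\ \text{hrs/week} \times 4.33\ \text{weeks}}{\$0.0038\ \text{per stream}} \;\approx\; \frac{\$1{,}733}{\$0.0038} \;\approx\; 456{,}000\ \text{streams per month}
$$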
As the article describes, MWA is focused on lobbying for changes to Section 512 of the DMCA, which shields online platforms from liability for copyright infringement committed by their users. But with regard to major platforms like YouTube, Meta’s properties, and Twitter, it is hard not to wonder whether we are past the point of seeking meaningful amendment of the DMCA to better support smaller creators. As noted in my recent post during Fair Use Week, there is significant evidence suggesting that the major platforms are not in compliance with DMCA §512 anyway.
If that is the case, what can legislative reform accomplish? §512 is a voluntary provision under which compliance provides a “safe harbor” liability shield against litigation. In order to prove non-compliance, some entity would have to sue, for instance, YouTube, and spend the next decade or more in a fight with Google’s tobacco-industry-scale legal counsel. And ain’t nobody got the resources for that. Thus, if the actual litigation shield today is that YouTube et al. are simply too big to sue, then §512 is little more than enforcement theater, and amending it could be little more than legislative theater—at least as a means to address chronic infringement on the biggest platforms.
Then There’s the Standard Technical Measures Debate
On a related topic, the U.S. Copyright Office, in late December, sought public comments regarding the agency’s future consultation for the development of more robust Standard Technical Measures (STM) to identify and mitigate infringing uses of works on legal platforms. Under DMCA §512(i), Congress intended that online service providers (OSPs) and copyright owners would collaborate to develop STM, and at the time the DMCA was written, it was the OSPs who promoted the efficacy of STM as a basis for establishing the “safe harbor” provisions in the first place. But as all creators are painfully aware, in the 24 years since the law was passed, collaboration to achieve the Standard in STM has never happened.
Instead, a hodgepodge of technical measures is used, several of them developed by the OSPs themselves to address infringement only insofar as it serves the platform’s interests. For instance, as the members of MWA are well aware, YouTube’s ContentID system is provided to the major labels and to the owners of sound recordings made by megastars, but it is not available to many thousands of other owners of music rights—even works produced by fairly well-known composers and songwriters. As the STM comments submitted to the USCO by the Copyright Alliance state:
… these technologies are usually not voluntarily made available to all types of copyright owners and OSPs have refused to come to the table with other stakeholders to have them formally adopted as widely recognized standards under section 512(i). This has led to a lack of uniformity among and access to existing technical measures that makes it difficult for those copyright owners who do not have access to combat infringement. On the other hand, OSPs prefer the status quo because it allows them to avoid adopting and implementing standard technologies.
Copyright owners would like to see the Copyright Office play a more integrated, regulatory role to achieve standardization of technical measures to better protect the works of a much broader spectrum of copyright owners. Meanwhile, the Electronic Frontier Foundation and similar organizations cite flaws in existing technical measures as a basis to argue against any expansion of these technologies. “Despite years of financial and technical investment, filtering technologies continue to do a poor job of sorting legal expression from infringement,” state the comments submitted by EFF to the Copyright Office. More specifically, the comments state the following:
The core problem is this: distinguishing lawful from unlawful uses often requires context. For example, the “amount and substantiality” factor in fair use analysis depends on the purpose of the use. So while the use may be a few seconds, as for some kind of music criticism, it can also be the whole piece, such as in a music parody. Humans can flag these differences, automated systems cannot.
There is a measure of truth in this refrain, which is played every time the topic of STM is on the table. But its relevance should be considered in terms of evidence rather than theoretical debates. At internet scale, with infringements occurring in the tens of millions every month, and Big Tech still saying, “We can’t police it all,” I think the fair question is why creators like those in MWA should continue to bear the cost of allowing the perfect to be the enemy of the good.
If the argument is that the good (i.e., better STM) can never be achieved, then the EFF and their fellowship should be required to cite more than anecdotal evidence of potentially harmful error. Because after twenty years, the system in place has a 100% failure rate for many stakeholders on the enforcement side of the copyright equation. That we should continue to allow smaller creators to watch their careers dissolve simply because STM will produce some error is an immoral argument for the status quo. And when that argument is paired with the fact that the major OSPs are now barely incentivized by the DMCA liability shield, the resulting “wage theft” described in Levin’s article is downright criminal.
There is a solution previously thought impossible ... at least technologically. What if every video, song (in fact, any digital content) accessed through YouTube were “un-downloadable” and “un-shareable”? This is doable when every item of digital content is stored and protected on its own blockchain. Accessing content on YouTube would mean that YouTube gives the viewer a unique and temporary key that allows access to the requested content but does not enable downloading and sharing. Sounds a little out there, but it is real ... and coming very soon.
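Setting the blockchain storage claim aside, the “unique and temporary key” part of the suggestion above resembles a familiar pattern: a short-lived, viewer- and item-specific access token issued by the platform. The sketch below is only a hypothetical illustration of that expiring-key handshake under my own assumptions (the names and parameters are not an actual YouTube or blockchain API), and a token by itself does not make streamed content truly un-downloadable.

```python
# A minimal sketch of the "unique and temporary key" idea described above:
# the platform issues each viewer a short-lived, signed access token for one
# specific content item, and refuses to serve the item without a valid token.
# All names here are hypothetical; this does not implement per-item blockchain
# storage, and a token alone cannot prevent a determined viewer from capturing
# the stream once it is delivered.
import hashlib
import hmac
import time

SERVER_SECRET = b"replace-with-a-real-secret"   # held only by the platform
TOKEN_LIFETIME_SECONDS = 300                    # key expires after 5 minutes

def issue_access_token(content_id: str, viewer_id: str) -> str:
    """Create a temporary, viewer- and item-specific access key."""
    expires_at = int(time.time()) + TOKEN_LIFETIME_SECONDS
    payload = f"{content_id}:{viewer_id}:{expires_at}"
    signature = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return f"{payload}:{signature}"

def verify_access_token(token: str, content_id: str, viewer_id: str) -> bool:
    """Allow access only if the token matches this viewer/item and has not expired."""
    try:
        tok_content, tok_viewer, expires_at, signature = token.rsplit(":", 3)
    except ValueError:
        return False
    payload = f"{tok_content}:{tok_viewer}:{expires_at}"
    expected = hmac.new(SERVER_SECRET, payload.encode(), hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(signature, expected)
        and tok_content == content_id
        and tok_viewer == viewer_id
        and int(expires_at) > time.time()
    )

# Example: the platform issues a key when a viewer requests a song,
# then re-checks it on every chunk of the stream it serves.
token = issue_access_token(content_id="song-123", viewer_id="viewer-456")
assert verify_access_token(token, "song-123", "viewer-456")
assert not verify_access_token(token, "song-123", "someone-else")
```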
The suggestion by Andrew, about using blockchain technology to prevent downloading and sharing, is interesting. Or any technology! Visual artists have the same problem, especially those on print-on-demand (POD) sites. PODs like Redbubble, Zazzle, etc., are constantly scraped by thieves, and the images are slapped up for sale all over. Sending DMCA takedowns in that volume is impractical and burdensome. Many of the scraper sites are just using the images as a lure, to infect viewers’ computers and devices with malware or to run phishing scams. Social media is loaded with accounts selling visual art as quilts, diamond-pattern reproductions, paint-by-numbers kits, etc.; some are poorly copied products, others outright scams. Making images un-scrapeable would also be needed.