No FAKES Act Introduced: A Big Deal for Performing Artists and Everyone Else


Ever since the generative artificial intelligence (GAI) controversy began heating up, I’ve had several conversations with friends and colleagues who are voice actors and have had to disappoint them by repeating the fact that copyright law does not protect a person’s “likeness,” which includes one’s voice. And I’ve had similar conversations with colleagues focused on replication of likeness for the production of nonconsensual pornography. Nevertheless, the instinct makes sense—that the same human-centric principles that protect “authorship” might apply to the human’s likeness as well. Now, that basic sense of justice is articulated in a new bill introduced in the Senate.

Historically, the protection of likeness has been the subject of a relatively narrow area of law called the right of publicity (ROP), a common-law right with statutory provisions in 25 states—and narrow because ROP typically applies to the unauthorized use of celebrity likeness for commercial advertising purposes. But with the introduction of the No FAKES Act, Congress proposes to substantially change the protection of individual likeness in direct response to the capacity of GAI to conjure just about anything from fake news to fake performances by actors and musicians.

Introduced by Senators Chris Coons (D-DE), Marsha Blackburn (R-TN), Amy Klobuchar (D-MN), and Thom Tillis (R-NC), the bill's acronym stands for the Nurture Originals, Foster Art, and Keep Entertainment Safe Act. The heart of the bill establishes a property right in the likeness of any person, living or dead, and prohibits digital replication without permission. Similar to copyright rights, the "digital replication right" is vested in every individual regardless of whether one commercially exploits one's own likeness, and the right is licensable and transferrable to heirs and assigns after death. Post-mortem rights would last 10 years but may be extended through a renewal and registration process administered by the U.S. Copyright Office if the right holder can show active and authorized public use of the voice or visual likeness.

The bill anticipates legitimate creative and newsworthy uses of unlicensed replication and exempts a broad range of uses for purposes like news, documentary, parody, etc. For a purpose to be “newsworthy,” the replicated individual must be the subject of the material created—e.g., a story about Hugh Jackman, not merely a replication of him “cast” for free in your film or commercial. Further, the bill explicitly states that creating a false impression that a given replication is an “authentic” recording of the individual will still trigger liability under the new law. Thus, the documentarian who uses a replication in a scene that looks like real surveillance or cellphone footage will probably need to identify that material as AI generated to avoid liability.

Remedies for violation of the digital replication right include statutory damage awards of $5,000 per depiction for individuals or online providers, and $25,000 per depiction for corporate entities other than online providers. Plaintiffs may also seek actual damages and attorney fees, and courts may award punitive damages where unlawful replications entail malice, fraud, or willful ignorance that the use violated the law.

Finally, taking a page from the Copyright Act, No FAKES contains a DMCA-like takedown provision for removal of content alleged to be an unlawful replication, and this provision includes maintenance by the Copyright Office of a database of “agents” to whom such complaints must be submitted. Likewise, familiar safe harbor provisions apply to both product developers and platforms that may, without the knowledge of these providers, be used to produce or distribute unlicensed replications.

Given Silicon Valley's poor record of DMCA compliance for copyright owners, the takedown provisions in No FAKES naturally raise questions about everyday removal of material, which is often the first, if not the main, remedy non-performers will care about. Regardless, from my perspective, the bill both recognizes a wide range of abuses of GAI replication and exempts or limits liability for an appropriate range of legitimate, First Amendment protected uses of the technology.

More than a good start, No FAKES appears to draw from many lessons learned over the past 20+ years pitting human and creative rights against the predatory “progress” of Big Tech. I join the Human Artistry Campaign in endorsing this bill and encourage the full Senate to pass it as soon as possible.



David Newhoff
David is an author, communications professional, and copyright advocate. After more than 20 years providing creative services and consulting in corporate communications, he shifted his attention to law and policy, beginning with advocacy of copyright and the value of creative professionals to America’s economy, core principles, and culture.
