A class-action suit was filed last week by voice actors Paul Lehrman and Linnea Sage against AI developer LOVO, Inc. According to the complaint, LOVO induced the actors to provide recorded material under false pretenses—material which was then used to produce synthetic replicas of their voices to become part of a catalog offered to paying customers. The complaint also alleges that LOVO defrauds its customers who believe they are using voices that have been legally obtained.
Both Lehrman and Sage were contacted via the freelance hub Fiverr and solicited for voice work. Both asked about the ultimate uses of the recordings—a standard question that affects the price an actor will charge—and both were lied to, according to the complaint. Lehrman was told that the recordings would be used exclusively for academic research, and Sage was told that hers would be used to produce “test scripts” for radio spots. In fact, both of the anonymous parties who contacted the actors were LOVO employees (co-founder Tom Lee allegedly contacted Sage), and the sound recordings were used to train the company’s AI to create replicas of Lehrman’s and Sage’s voices.
Not only were the actors’ voices added to LOVO’s catalog, but the company also used the replicas for its own marketing and capital-raising purposes, renaming Lehrman and Sage “Kyle Snow” and “Sally Coleman,” respectively. For instance, the complaint alleges that Sage’s voice was used in demonstrations to raise millions of dollars in venture capital, and Lehrman discovered LOVO promoting the narration attributes of “Kyle Snow” in its own article entitled “The 5 Best Male Voices for Text to Speech.”
The causes of action in the complaint include violations of New York’s right of publicity (ROP) law, deceptive practices, false advertising, violations of the Lanham Act, unjust enrichment, and fraud. Further, the complaint alleges harm not only to the actors but also to LOVO’s customers, who are misled into believing that the voices used in their commercial projects have been licensed or otherwise obtained with permission. Plaintiffs seek millions in damages and legal fees.
Before saying anything else, and speaking as a guy who is not very spiritual by nature, I would say that LOVO has engaged in the practice of soul stealing. Specifically, for Paul Lehrman to read a description of his own performance attributes, knowing that his talent has literally been stolen and bottled for sale, is a chilling thought. “With his upbeat tone and slightly faster talking speed, Kyle Snow has the perfect voice for conveying enthusiasm and youthfulness,” the description begins. The prospects of AI replacement in the workforce are problematic enough, but imagine reading your own resume and discovering that it’s actually promoting an AI replica of you made without your permission.
The proceedings of this case may prove instructive to many parties with an interest in public policy related to artificial intelligence. The deceptions, if proven, should be damning to LOVO itself, but this case entails considerations that will be worth watching, even where AI development is conducted without lying to obtain training material. Specifically, many interests are looking at state ROP law as a basis for expanding related protections for all individuals. This complaint cites NYS civil rights law, which “imposes liability on a party for misappropriating an actor’s voice ‘for advertising purposes or for the purposes of trade without … written consent[.]’”
The use of a performer’s likeness for advertising purposes is central to many ROP laws in the states that have such statutes, but this case suggests a more universal approach to proscribing the reproduction of anyone’s likeness for almost any purpose without permission. In fact, the sci-fi thriller quality of transferring not just the technical sound of a voice, but the personality of that voice, reaches beyond the concept of “likeness” as applied to date. It adds a new layer of meaning to the crime of “identity theft.”
Creative work is always a combination of natural talent and hard work to develop certain skills, but one need not be an actor, or any kind of artist, for the same principles to apply. If identity comprises thought, emotion, likeness, and movement, which of these attributes must be copied—and with what degree of precision—before “soul stealing” occurs? I don’t know the answer to that, but I will be very curious to see what precedents are set by a case like this one.