Celebrity “Faces Off” Against Deep Fake AI App Over Right of Publicity
Thursday, April 13, 2023

Generative AI (GAI) applications have raised numerous copyright issues. These include whether the training of GAI models constitutes infringement or is permitted under fair use, who is liable if the output infringes (the tool provider or the user), and whether the output is copyrightable. Copyright, however, is not the only legal issue that can arise. Another issue that has arisen with various GAI applications involves the right of publicity. A recently filed class action provides one example.

Kyland Young filed a class action lawsuit against NeoCortext, Inc. (“NeoCortext”), alleging that it commercially exploited his name, voice, photograph, and likeness, along with those of thousands of other actors, musicians, athletes, celebrities, and other well-known individuals, to sell paid subscriptions to its smartphone application, Reface, without their permission. Reface is a deep-fake application that allows users to swap their faces with individuals they admire or desire in scenes from popular shows, movies, and other viral short-form internet media.

The complaint alleges violations of the class members’ rights under the California Right of Publicity statute, which states that, “[a]ny person who knowingly uses another’s name, voice, signature, photograph, or likeness, in any manner … for purposes of advertising or selling, or soliciting purchases of … services, without such person’s prior consent … shall be liable for any damages sustained by the person or persons injured as a result thereof.” Cal. Civ. Code § 3344(a).

GAI is a powerful tool with many applications. Many uses will be fine, but many will cross a legal line. Some will do so by intentional design; others will do so inadvertently. The Reface app appears to make conscious use of celebrity images. With some other GAI apps, the models are trained on huge quantities of images, including some images of celebrities. This creates the possibility that certain user prompts will cause the output to include the name, image, or likeness (NIL) of a celebrity, even if the GAI tool is not specifically designed to output celebrity images.

Responsible companies are taking proactive steps to minimize the likelihood that their GAI tools inadvertently violate the right of publicity. These steps include attempting to filter celebrity images out of the data used to train GAI models and filtering prompts to prevent users from requesting outputs directed at celebrity-based NIL.
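By way of illustration only, the Python sketch below shows the second of those steps, a naive prompt filter, in its simplest possible form. The blocklist entries and function name are hypothetical and are not drawn from NeoCortext's or any other vendor's actual implementation; real systems would rely on far more sophisticated name-recognition and image-matching techniques.

```python
# Hypothetical, minimal sketch of a prompt filter; not any vendor's real code.
# The blocklist entries below are placeholder examples only.
CELEBRITY_BLOCKLIST = {"example celebrity a", "example celebrity b"}

def is_prompt_allowed(prompt: str) -> bool:
    """Return False if the prompt names anyone on the blocklist (naive substring match)."""
    lowered = prompt.lower()
    return not any(name in lowered for name in CELEBRITY_BLOCKLIST)

if __name__ == "__main__":
    print(is_prompt_allowed("a portrait of my dog in a spacesuit"))        # True
    print(is_prompt_allowed("Example Celebrity A riding a unicorn"))       # False
```

Even this toy version shows why filtering is imperfect: misspellings, nicknames, or descriptive references to a well-known person would slip past a simple blocklist, which is one reason inadvertent celebrity NIL output remains a risk.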
