November 21, 2023
This week, I’m taking a break from talking about court cases and instead focusing on a draft bill aimed at creating a federal right of publicity that was introduced in October by a bipartisan group of Senators. A quick refresher: the right of publicity allows an individual to control the commercial use of their name, image, likeness, and voice, and laws or cases governing this right exist in about two-thirds of the states.
Now, with generative AI and “deepfake” technology, celebrities and entertainment companies are pushing for greater protection against the creation of unauthorized digital replicas of a person’s image, voice, or visual likeness. And the Senate, it appears, is responding, raising concerns among digital rights groups and others about First Amendment rights and limits on creative freedom.
Before diving into the specifics of the bill and its potential implications, I want to step back and talk about the underlying reasons for intellectual property laws. These laws are the subject of entire law school classes (I took several of them), but I can quickly summarize two fundamental reasons why they exist. The first is to encourage artistic works and inventions, an idea that can be found in the U.S. Constitution. The idea is that allowing creators (in the case of copyright law) and inventors (in the case of patent law) to exclusively reap the economic benefits of their work will incentivize people to make art and invent useful things. Notably, both copyrights and patents are in effect for a limited amount of time: for patents, 20 years from the date of the application, while copyrights run for the life of the creator plus 70 years (note that length; it’s going to come up again).
The second reason is to prevent consumer confusion. This is the central concern of trademark and unfair competition laws, which are intended to ensure that no one other than the company associated with a particular good or service is selling that good or service.
The idea behind the right of publicity (you can read more about it in the context of generative AI here), includes a dash of both of these rationales. It ensures that individuals can profit from their investment in their persona by preventing others from using their name, likeness, voice, etc., without their permission. It also prevents brands from claiming someone endorsed a product without that person’s consent.
With generative AI and the ease with which anyone can now create a digital replica of a celebrity to endorse a product or perform a song, artists and entertainment companies are worried that the current patchwork of state laws isn’t enough. Hence, the Nurture Originals, Foster Art, and Keep Entertainment Safe Act of 2023 or the NO FAKES Act of 2023, which, if enacted, would create a federal right of publicity. (A side question: in hiring staff, do Members of Congress test job applicants’ ability to come up with wacky bill titles that can be made into acronyms? Because this one certainly took some legitimate skill.)
The bill protects against the creation of an unauthorized “digital replica,” which the NO FAKES Act describes as: “a newly created, computer-generated, electronic representation of the image, voice, or visual likeness of an individual that is [nearly indistinguishable] from the actual image, voice, or visual likeness of an individual; and is fixed in a sound recording or an audiovisual work in which that individual did not actually perform or appear.”
In other words, NO FAKES bars using a computer to create an audiovisual work or a recording that looks or sounds very much like a real person when that person has not consented. The proposed right would last for a person’s lifetime and for 70 years after death (the same as existing copyright laws). In the case of a dead person, the person or entity that owns the deceased’s publicity rights (often, the deceased’s heirs) would have to consent to the creation of a digital replica.
If NO FAKES is passed, anyone who creates an unauthorized digital replica can be sued by the person who controls the rights; the rights holder can also sue anyone, like a website or streaming platform, who knowingly publishes, distributes, or transmits a digital replica without consent. This is true even if the work includes a disclaimer stating the work is unauthorized.
That said, the Act as currently drafted does include some exceptions intended to protect the First Amendment. For example, NO FAKES states that it is not a violation of the Act to create a digital replica that is used as part of a news broadcast or documentary or for purposes of “comment, criticism, scholarship, satire, or parody.”
Some other things to note:
- The right to control the creation of a digital replica does not extend to images that are unaccompanied by audio.
- The draft bill states that the right to control digital replicas “shall be considered to be a law pertaining to intellectual property for the purposes of section 230(e)(2) of the Communications Act of 1934.” This means that Internet service providers cannot rely on Section 230 to avoid liability.
Now, the draft is likely to undergo significant amendments and revisions before it is passed, if it is passed at all. As mentioned above, digital rights groups and others worry that the right of publicity can be used to litigate against speech protected by the First Amendment, as public figures in the past have tried when they don’t like something that has been said about them in the media.
To me, the Act seems a bit suspicious. You may notice I’ve stressed how the Act extends protection against digital replicas to 70 years post-mortem, exactly the same length as copyright protection. Isn’t this expansiveness a bit much considering the current state of play is no federal right of publicity at all? The extreme length of the proposed protection, coupled with the Act eliminating the use of disclaimers as a shield from liability, suggests NO FAKES is less about protecting the public and more designed to prolong celebrities’ and entertainment companies’ abilities to profit. After all, the right of publicity created in the NO FAKES Act can be sold by an actor or their heirs to a company like, say, a movie studio… that could then, in theory, continue to feature digital replicas of the aged or deceased actor in its films unchallenged for seven decades after death. Thelma and Louise 4: Back From the Abyss is coming, and Brad Pitt won’t look a day over 30.
Good, perhaps, for Brad Pitt. The rest of us, maybe not.
November 7, 2023
On October 30, 2023, a judge in the Northern District of California ruled in one of the first lawsuits between artists and generative AI art platforms for copyright infringement. While the judge quickly dismissed some of the Plaintiffs’ claims, the case is still very much alive as he is allowing them to address some of the problems in their case and file amended complaints.
So what’s it all about? Three artists are suing Stability AI Ltd. and Stability AI, Inc. (collectively, “Stability”), whose platform, Stable Diffusion, generates photorealistic images from text input. To teach Stable Diffusion how to generate images, Stability’s programmers scrape (i.e., take or steal, depending on how charitable you’re feeling) the Internet for billions of existing copyrighted images — among them, allegedly, images created by the Plaintiffs. End users (i.e., people like you and me) can then use Stability’s platform to create images in the style of the artists on whose work the AI has been trained.
In addition to Stability, the proposed class action suit on behalf of other artists also names as defendants Midjourney, another art generation AI that incorporates Stable Diffusion, and DeviantArt, Inc., an online community for digital artists, which Stability scraped to train Stable Diffusion, and which also offers a platform called DreamUp that is built on Stable Diffusion.
The Plaintiffs — Sarah Andersen, Kelly McKernan, and Karla Ortiz — allege, among other things, that Defendants infringed on their copyrights, violated the Digital Millennium Copyright Act, and engaged in unfair competition.
In ruling on Defendants’ motion to dismiss, U.S. District Judge William Orrick quickly dismissed the copyright claims brought by McKernan and Ortiz against Stability because they hadn’t registered copyrights in their artworks — oops.
Andersen, however, had registered copyrights. Nonetheless, Stability argued her claim of copyright infringement should be dismissed because she couldn’t point to specific works that Stability used as training images. The Court rejected that argument. It concluded that the fact she could show that some of her registered works were used for training Stable Diffusion was enough at this stage to allege a violation of the Copyright Act.
The judge, however, dismissed Andersen’s direct infringement claim against DeviantArt and Midjourney. With DeviantArt, he found that Plaintiffs hadn’t alleged that DeviantArt had any affirmative role in copying Andersen’s images. For Midjourney, the judge found that Plaintiffs needed to clarify whether the direct infringement claim was based on Midjourney’s use of Stable Diffusion and/or whether Midjourney independently scraped images from the web and used them to train its product. Judge Orrick is allowing them to amend their complaint to do so.
Because Orrick dismissed the direct infringement claims against DeviantArt and Midjourney, he also dismissed the claims for vicarious infringement against them. (By way of background, vicarious infringement is where a defendant has the “right and ability” to supervise infringing conduct and has a financial interest in that conduct.) Again, however, the Court allowed Plaintiffs to amend their complaint to state claims for direct infringement against DeviantArt and Midjourney, and also to amend their complaint to allege vicarious infringement against Stability for the use of Stable Diffusion by third parties.
Orrick warned the Plaintiffs (and their lawyers) that he would “not be as generous with leave to amend on the next, expected rounds of motions to dismiss and I will expect a greater level of specificity as to each claim alleged and the conduct of each defendant to support each claim.”
Plaintiffs also alleged that Defendants violated their right of publicity, claiming that Defendants used their names to promote their AI products. However, the Court dismissed these claims because the complaint didn’t actually allege that the Defendants advertised their products using Plaintiffs’ names. Again, he allowed the Plaintiffs leave to amend. (The Plaintiffs originally tried to base a right of publicity claim on the fact that Defendants’ platforms allowed users to produce AI-generated works “in the style of” their artistic identities. An interesting idea, but Plaintiffs abandoned it.)
In addition, DeviantArt moved to dismiss Plaintiffs’ right of publicity claim on grounds that DeviantArt’s AI platform generated expressive content. Therefore, according to DeviantArt, the Court needed to balance the Plaintiffs’ rights of publicity against DeviantArt’s interest in free expression by considering whether the output was transformative. (Under California law, “transformative use” is a defense to a right of publicity claim.) The Court found that this was an issue that couldn’t be decided on a motion to dismiss and would have to wait.
What are the key takeaways here? For starters, it is fair to say that the judge thought that Plaintiffs’ complaint was not a paragon of clarity. It also seems like the judge thought that Plaintiffs would have a hard time alleging that images created by AI platforms in response to user text input were infringing. However, he seemed to indicate that he was more likely to allow copyright infringement claims based on Stability’s use of images to train Stable Diffusion to proceed.
October 24, 2023
It’s long been known that one of the pitfalls of being in the public eye is you don’t control your own image. Paparazzi can take photos of you that can be published anywhere, with the photographer getting paid, the media outlet generating revenue from ad sales and subscriptions, and the subject themselves neither seeing a dime nor having any control over how they look. That’s because traditionally, photographers have full copyright when they capture an image of a celebrity, particularly in public. Now, a bunch of new lawsuits are taking ownership even further out of celebrity hands, with photographers and their agencies suing stars who dare to post paparazzi photos of themselves on their social media accounts without licensing them first.
There are plenty of celebs under fire at the moment, including LeBron James, Bella Hadid, and Dua Lipa. A few examples: Melrose Place and Real Housewives star Lisa Rinna posted on Instagram photos of herself that were taken by a paparazzo represented by the Backgrid agency; Backgrid is suing Rinna for copyright infringement. Rinna accuses Backgrid of “weaponizing” copyright law, while Backgrid retorts that once one of its paparazzi photos is posted without permission, magazines like People will be less likely to buy it because fans will have already seen it. Another case: model Gigi Hadid, who is being sued for copyright infringement by agency Xclusive-Lee over posting one of its images to Instagram. Hadid’s legal team asserts her post constitutes fair use because Hadid “creatively directed” the photo by choosing her outfit, posing and smiling, thus contributing “many of the elements that the copyright law seeks to protect.” Hadid also cropped the image when she posted it, which she says refocuses the photo on her pose and smile, rather than the photographer’s composition.
Model Emily Ratajkowski recently settled a suit brought by a photographer over a photo he took of her walking outside of a flower shop, her face completely obscured by a bouquet she was carrying. Ratajkowski posted the photo on an Instagram story with the text “MOOD FOREVER,” intending to convey how she feels like hiding from paparazzi. While the case settled, the judge indicated her text served as a commentary on the celebrity/paparazzi dynamic that may have amounted to transformative use, protecting her from a copyright claim.
This wasn’t Ratajkowski’s first battle with copyright law. She wrote a long essay on how it feels to be unable to control her image after a photographer took hundreds of nude photos of her early in her career, supposedly for a magazine editorial, and later published them as several books and showed them in a gallery exhibit — all without asking her permission or paying her. Ratajkowski also had photos she posted to her Instagram account turned into “paintings” by renowned appropriation artist Richard Prince and sold for $80,000 each. She writes, “I have learned that my image, my reflection, is not my own.”
It’s easy to sympathize with the celebrities’ position. While mere mortals often scorn celebrity complaints about their lack of privacy and the invasiveness of paparazzi — “hey, it comes with the territory!” — it seems like adding insult to injury to allow paparazzi to take photos of celebrities against their will and then demand the celebs pay to use the photos themselves.
Also, it’s not hard to see why Ratajkowski or others might feel victimized by someone in a position of relative power profiting from images without sharing those profits. (For what it’s worth, a number of states do have laws against revenge porn, but that’s not what we’re talking about here.)
In that vein, in the wake of #metoo, the celebrities’ position is also appealing because it’s not hard to see it as trying to subvert the male gaze by allowing the (mostly) female celebrity subjects to at least profit from or assert some element of control over the pictures they appear in.
However, from an intellectual property law point of view, this is not how it works.
For starters, copyright law is really clear. The copyright for a photo rests with the person who took it. Posing for a picture is not subject to copyright protection, and copyright law doesn’t give the subject of a photo rights to the copyright. This is because a copyright comes into existence when the work is “fixed,” meaning recorded on a piece of film or a memory card — and those are owned by the photographer, not the subject.
Moreover, copyrights trump any publicity rights that celebrities have. Article I, Section 8, Clause 8 of the U.S. Constitution says that Congress has the power to enact laws to “promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.” This is because we as a society benefit from encouraging creators to create by allowing them to profit from their work. Celebrities and their lawyers would say that they too should be able to profit because they provided a service by appearing in the photograph and/or by being famous, and thus photoworthy. While the law isn’t supposed to get into judging the relative value of different artistic contributions, let’s be real: there is a difference between the creation of even a bad novel or artwork and smiling for a second into a camera lens on a step-and-repeat.
What’s more, in contrast to copyright law, the right of publicity is — at least for now — a product of state law, and federal law generally preempts conflicting state law. This means that under established law, if there’s a conflict between the rights of a copyright holder and the rights of a celebrity to control his or her image under the applicable right of publicity, the copyright holder’s interests come first.
This isn’t to say that this is the only policy balance that could be struck between the rights of the copyright holder and the rights of the subject of a photo, but it’s the one, for better or worse, that we currently have. So yes, the law is clear: if you’re a celeb, not only do you not profit from photos taken of you in public, if you want to use them yourself, you have to pay.
Also, look at it this way: none of us own everything about ourselves anymore (think about your personal data), nor do we profit from it. There’s no reason for the famous and the sort-of-famous to be different from everyone else.