Legal Aspects of AI-Created Movies: Right of Publicity

Hello everyone,

Today we will talk about the legal implications of films made with the use of AI.

The integration of artificial intelligence (AI) into the filmmaking process is no longer a speculative trend—it is a present reality. AI is now capable of generating scripts, creating synthetic voices and faces, and even rendering entire scenes or performances that resemble real individuals. While these tools offer groundbreaking creative possibilities, they also bring about complex legal questions—especially when it comes to the unauthorized use of a person’s likeness.

This article focuses on one of the most significant legal challenges in AI-generated film content: the right of publicity. This right allows individuals to control and profit from the commercial use of their name, likeness, voice, image, and persona. As AI tools mimic these features with increasing realism, creators may face significant legal exposure across multiple jurisdictions.


1. California: The Nation’s Broadest Protection of Publicity Rights

California law provides the most extensive statutory and common law protections for publicity rights in the United States—making it a central jurisdiction in AI-related disputes.

1.1 Statutory Protection for the Living – Civil Code § 3344

Under Cal. Civ. Code § 3344, it is unlawful to use another person’s name, voice, signature, photograph, or likeness for commercial purposes without their prior consent. The statute applies whether the use is direct (e.g., a photograph) or indirect (e.g., a digitally altered replica resembling the person).

AI-generated characters, avatars, or voices that intentionally resemble real individuals—especially celebrities—can trigger liability under this statute. Penalties include:

– $750 in statutory damages or actual damages (whichever is greater),

– Profits attributable to the unauthorized use,

– Punitive damages and attorney’s fees in some cases.

1.2 Postmortem Protection – Civil Code § 3344.1

California also protects the publicity rights of deceased individuals for 70 years after death. These rights are descendible and enforceable by heirs or rights-holding entities.

For example, if a filmmaker uses AI to create a performance by a deceased actor (such as Bruce Lee or Robin Williams) without authorization from the estate, doing so may violate § 3344.1. Publicly performing or monetizing such a character (e.g., via streaming or merchandise) further increases the potential liability.

1.3 Common Law Publicity Right

In addition to the statute, California recognizes a common law right of publicity, which can apply even in cases where a likeness is not directly used. Courts have recognized liability where the totality of a character’s features—voice, appearance, dress, mannerisms—evokes a real person.

📌 Key Case: White v. Samsung Electronics America, Inc., 971 F.2d 1395 (9th Cir. 1992)
The court ruled that a robot dressed like Vanna White could still be considered a misappropriation of her identity.


2. New York: Modernizing the Right of Publicity

Historically, New York was viewed as more restrictive than California. That changed when Civil Rights Law § 50-f took effect in 2021, significantly expanding protections to include digital replicas and postmortem rights.

2.1 Protection for the Living – Civil Rights Law §§ 50–51

New York continues to protect the unauthorized use of a living person’s name, portrait, picture, or voice for purposes of advertising or trade. Violations can result in:

– Injunctive relief,

– Monetary damages,

– Criminal liability in extreme cases.

Unlike California, New York’s statutory language historically excluded postmortem protection—until § 50-f was enacted.

2.2 Digital Replicas & Deceased Individuals – Civil Rights Law § 50-f

This statute protects deceased personalities whose name, voice, or likeness had commercial value at the time of their death. Key features:

– Rights last 40 years after death.

– Unauthorized use of a digital replica (defined to include simulations via computer-generated or AI-based technology) can result in civil penalties.

– Consent is required to depict a deceased performer in a “simulated performance” in an audiovisual work, unless the work is protected by exceptions such as news, satire, or commentary.

📌 Practical Effect: Filmmakers and AI developers must now obtain permission from the estate of any deceased person whose likeness, image, or simulated voice is used commercially in New York.


3. European Union: Data Protection and Personality Rights

The EU does not recognize a standalone “right of publicity.” Instead, protections are derived from data protection law, privacy law, and personality rights guaranteed by national constitutions and civil codes.

3.1 GDPR and Personal Data

The General Data Protection Regulation (GDPR) classifies any data that identifies a person—including biometric and facial data—as personal data.

AI-generated content that imitates or reconstructs a person’s image, voice, or mannerisms may constitute the processing of personal data, particularly where the underlying training sets are drawn from real images or recordings.

To lawfully process this data, Article 6 of the GDPR requires a legal basis, typically consent. Without consent, the use of an individual’s likeness—especially in a commercial context—could lead to:

– Administrative fines (up to €20 million or 4% of global annual turnover, whichever is higher),

– Civil liability for damages.

3.2 National Laws Protecting Personality

Countries like Germany, France, and Italy go further, recognizing personality rights as part of constitutional privacy or dignity rights.

– Germany: The right to control one’s image is protected under the Kunsturhebergesetz (KUG) and Articles 1 and 2 of the Basic Law (Grundgesetz).

– France: Article 9 of the Civil Code provides for protection against unauthorized use of image and voice.

📌 Example: If an AI model generates a synthetic voice that imitates a French singer and uses it in a commercial song without consent, this could trigger GDPR liability and a civil privacy claim in France.


4. United Kingdom: No Formal Publicity Right, but Creative Remedies Exist

Unlike the U.S. and many EU states, the UK does not recognize a free-standing right of publicity. However, a range of common law and statutory remedies may still provide protection.

4.1 Passing Off

The tort of passing off may apply where an AI-generated image or voice falsely suggests that a celebrity has endorsed or participated in a work. Plaintiffs must show:

– Goodwill in their image or brand,

– Misrepresentation,

– Damage caused by the misrepresentation.

This has been successfully used in cases involving look-alikes or imitators in advertising.

4.2 Misuse of Private Information

Where an AI-generated portrayal intrudes into private life—e.g., showing a person in a fictional sexual scenario or misrepresenting a political stance—UK courts may find a violation of the right to privacy.

4.3 UK GDPR and Biometric Data

The UK version of GDPR, retained post-Brexit, continues to protect biometric and identity-related data. AI-generated representations trained on or mimicking real individuals may be regulated if personal data is involved.


5. When There Is No Publicity Right: Other Legal Theories

Even in jurisdictions without express publicity protections, claimants may pursue alternative causes of action:

– Defamation – if the AI-generated portrayal harms the person’s reputation.

– False Light – in U.S. states where this tort is recognized.

– Unjust Enrichment – if profits are made using another’s identity.

– Copyright Infringement – if a synthetic likeness copies a protectable performance.

– Consumer Protection Claims – especially if the audience is misled about endorsement.


6. Best Practices for Filmmakers and AI Developers

To avoid litigation and regulatory penalties, content creators should:

– Obtain clear, written consent from any identifiable person portrayed using AI.

– Secure estate licensing for deceased personalities.

– Use disclaimers when appropriate, but do not rely on them to avoid liability.

– Consult legal counsel before distributing AI-generated media internationally.

– Avoid creating composite characters that might suggest a real person’s identity.


⚠️ Disclaimer by Ernest Goodman, Esq.

This article is intended for informational purposes only and does not constitute legal advice. Reading or relying on this content does not establish an attorney-client relationship. Because laws differ by jurisdiction and continue to evolve in response to AI technologies, readers are encouraged to consult a qualified attorney licensed in the relevant jurisdiction for advice tailored to specific circumstances.
