Introduction
AI makes it trivially easy to clone a celebrity's voice. But easy does not mean legal. Celebrity voice cloning sits at the intersection of right of publicity, copyright law, fair use, and emerging AI regulation.
This guide explains the legal reality for creators considering using celebrity-like AI voices.
Right of Publicity
Right of publicity protects a person's identity — including their voice — from unauthorized commercial use. In the US:
- California: Strong protections. Midler v. Ford Motor Co. (9th Cir. 1988) held that deliberately imitating a distinctive, widely known voice for commercial purposes creates liability, even when no actual recording of the person is used.
- Tennessee (ELVIS Act, 2024): The first state law to explicitly extend the right of publicity to AI-generated voice replicas.
- New York: Civil Rights Law §§ 50–51 protect a person's name, portrait, picture, and voice from unauthorized use in advertising or trade.
Key principle: if a reasonable person could identify the celebrity from the AI voice, you are likely infringing their right of publicity.
The Scarlett Johansson / OpenAI Case
In 2024, Scarlett Johansson publicly accused OpenAI of using a voice that sounded suspiciously similar to hers for ChatGPT's "Sky" voice, after she had declined OpenAI's request to voice the assistant. OpenAI denied cloning her voice, saying Sky was recorded by a different professional actress, but pulled the voice from the product.
This case highlighted:
- Even a similar-sounding (not cloned) voice can trigger legal action
- Celebrity voice rights are taken seriously by major companies
- The legal test turns on whether the voice is identifiable as the celebrity's, not on whether it is an exact clone
Fair Use Defense
Fair use is a copyright doctrine, but courts apply similar First Amendment and transformative-use reasoning to right of publicity claims. These defenses may protect some uses of celebrity voice cloning:
Likely protected:
- Clearly labeled parody and satire (parody, which targets the celebrity themselves, generally receives stronger protection than satire)
- Commentary and criticism
- Educational analysis of the technology
- News reporting
Likely NOT protected:
- Commercial content (ads, sponsored posts)
- Entertainment content presented as if the celebrity is involved
- Products or services implying celebrity endorsement
- Content that could mislead viewers about celebrity participation
Fair use is decided case by case; there is no bright-line rule.
What Creators Should Do
Do not: Clone a celebrity's voice for commercial content without their (or their agent's) explicit permission.
Do not: Create content that implies a celebrity endorses your product or message.
Consider: Creating "inspired by" voices that capture a style without being identifiable as a specific person.
If in doubt: Consult an entertainment lawyer. The potential damages (right of publicity violations can reach millions) make legal consultation a smart investment.
Frequently Asked Questions
Can I make AI covers of celebrity songs?
The voice conversion is risky (right of publicity). Reproducing the underlying song also implicates copyright: an audio-only release requires a mechanical license for the composition, and a video (such as a YouTube upload) requires a synchronization license. The combination puts you at double legal risk. Many AI cover videos exist on YouTube, but they face potential takedown at any time.
What about deceased celebrities?
Right of publicity extends beyond death in many states. California protects it for 70 years after death; Tennessee grants an initial 10 years, extendable indefinitely as long as the likeness remains in commercial use. Estates actively protect deceased celebrities' likenesses: using Elvis's AI-cloned voice commercially would violate Tennessee law.
Are there any safe ways to use celebrity-style voices?
Use stock AI voices that are not recognizably any specific person, or work with celebrity voice licensing agencies that can negotiate legitimate use. Some celebrities are beginning to license their AI voice rights directly.
For the broader legal landscape, see voice cloning legal issues. For country-by-country legality, read is voice cloning legal.