Introduction

Voice cloning technology has outpaced legislation. While using your own cloned voice is unambiguously legal, cloning others' voices enters complex legal territory that varies by country, state, and context.

This guide summarizes the current legal landscape. It is not legal advice — consult a lawyer for your specific situation.

United States

Federal level: No comprehensive federal law specifically addressing AI voice cloning as of 2026. However, several existing laws apply:

  • Right of Publicity: Most states protect individuals' likeness and voice from unauthorized commercial use
  • FTC regulations: Prohibit deceptive practices, including using fake voices to mislead consumers
  • Copyright law: Voice recordings are copyrightable, but a voice itself cannot be copyrighted

State laws (growing rapidly):

  • Tennessee (ELVIS Act, 2024): Specifically protects voice from AI replication without consent
  • California: Strong right of publicity protections, proposed AI voice bills in 2025-2026
  • New York: Protects voice likeness under civil rights law
  • Several other states have proposed or enacted AI-specific voice protection bills

European Union

EU AI Act: Classifies deepfakes (including voice cloning) as requiring transparency. If AI-generated voice content could be mistaken for a real person, disclosure is mandatory.

GDPR: A person's voice can constitute biometric data when processed to uniquely identify them. Processing voice data for cloning generally requires explicit consent or another lawful basis.

Right of personality: Individual EU member states protect voice as part of personal identity.

United Kingdom

Online Safety Act: Includes provisions addressing harmful synthetic media, and platform duties can extend to AI-generated content such as non-consensual deepfakes.

Common law: Passing off (misrepresenting AI voice as a real person) can be actionable.

Consent Requirements

When consent is required:

  • Cloning someone else's voice (always)
  • Using a cloned voice commercially (check platform terms)
  • Creating content that could be mistaken for the real person speaking

When consent is NOT required:

  • Cloning your own voice
  • Using stock AI voices (licensed by the platform with the original speaker's consent)
  • Clearly satirical or obviously fictional use (varies by jurisdiction)

Best practice: Always get written consent before cloning any voice that is not your own. Include:

  • What the voice will be used for
  • How long the clone will be retained
  • Whether the voice owner can revoke consent
  • Compensation terms (if applicable)

Deepfake Regulations

Voice cloning for deceptive purposes — making someone appear to say something they did not — is illegal or regulated in most jurisdictions:

  • US: Actionable or criminal when used for fraud, defamation, or election interference
  • EU: Must be labeled as AI-generated under the AI Act
  • China: Comprehensive deepfake regulations requiring consent and labeling

Practical Guidelines

Safe:

  • Clone your own voice
  • Use stock voices from licensed platforms
  • Get written consent before cloning anyone else
  • Disclose AI voice usage when context requires it
  • Use for legitimate purposes (content creation, accessibility, entertainment)

Risky:

  • Cloning a public figure's voice (even for parody — laws vary)
  • Using cloned voices in advertising without disclosure
  • Creating audio that could be mistaken for real statements

Illegal:

  • Voice cloning for fraud or scams
  • Impersonating someone to cause harm
  • Creating non-consensual intimate content
  • Election interference using cloned political voices

Frequently Asked Questions

Is it legal to clone a celebrity's voice?

For commercial use without consent, generally no: right of publicity laws protect celebrities' voices. Clearly labeled parody or commentary may qualify for fair use or free-speech protection, but this remains legally contested.

Do I need to disclose when using my own cloned voice?

No legal requirement in most jurisdictions. However, some platforms (YouTube, podcasts) recommend disclosure for transparency, and the EU AI Act may require disclosure if the content could be mistaken for a real person speaking.

Can I clone a deceased person's voice?

Laws vary. Some states extend right of publicity protections beyond death (Tennessee: 10 years, California: 70 years). In many cases, the estate controls the rights.

What happens if I am caught cloning a voice without consent?

Consequences vary by jurisdiction: civil lawsuits for damages, platform bans, and potentially criminal charges if the clone was used for fraud or to cause harm.

For the technical cloning process, see the voice cloning tutorial. For ethical use, read the guide on voice cloning for accessibility.