Voice cloning, the ability to faithfully reproduce a human voice from an audio sample, has moved in eighteen months from laboratory curiosity to accessible commercial tool. With only 30 seconds of reference audio, current models can generate a synthetic voice that 78% of human listeners cannot distinguish from the original. For businesses, legitimate uses are numerous. So are the risks. This analysis covers both.

Legitimate uses in business

Proprietary brand voice

Creating a unique synthetic voice for all of a company's AI agents is the most widespread and least controversial use. The company records a voice actor (under an explicit rights-transfer agreement), builds a voice model from the recording, and ends up with a fully proprietary voice for its agents, voice servers, and audio advertisements. Cost: €2,000 to €8,000 depending on recording duration. Advantage: total brand consistency and no legal exposure over voice rights.

Accessibility and multilingual content

A publishing group can clone an author's voice (with their contractual consent) to narrate their audiobooks in 40 languages, without the author having to record in each language. A trainer can create multilingual versions of their e-learning courses with their own cloned voice. These documented and consented uses are legally sound.

Customer voice personalization

Some companies are experimenting with advanced personalization: the AI agent subtly adapts its regional accent or language register to the customer's profile. This is not voice cloning in the strict sense, but a fine-tuning of speech synthesis parameters that produces a similar effect of closeness.
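As a rough illustration of this kind of parameter-level personalization, the mapping from customer profile to synthesis settings might look like the sketch below. All parameter names (accent, register, speaking_rate) are hypothetical, not a real TTS API:

```python
# Hypothetical sketch: choosing speech-synthesis settings per customer profile.
# Parameter and profile field names are illustrative, not a real vendor API.

def select_voice_params(profile: dict) -> dict:
    """Map a customer profile to speech-synthesis parameters."""
    params = {"accent": "standard", "register": "neutral", "speaking_rate": 1.0}
    if profile.get("region") == "south":
        params["accent"] = "regional_south"   # subtle regional coloring
    if profile.get("age_group") == "senior":
        params["speaking_rate"] = 0.9         # slightly slower delivery
    if profile.get("segment") == "premium":
        params["register"] = "formal"
    return params

print(select_voice_params({"region": "south", "segment": "premium"}))
```

The point of keeping this as parameter selection rather than per-customer voice models is precisely that no biometric data is cloned: the same proprietary voice is rendered with different settings.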

Warning: Cloning a person's voice without their explicit consent is illegal in most European jurisdictions, notably under the GDPR (voice as biometric data), personality rights over one's voice and image, and the AI Act (Article 50 transparency obligations). Under the GDPR, penalties can reach €20 million or 4% of global annual revenue.

Poorly managed risks

Internal vocal deepfake

Several documented incidents in 2025 involve cybercriminals using cloned voices of executives to authorize fraudulent transfers via phone calls. A cloned voice of the CEO ordering an "urgent confidential transfer" is convincing enough to deceive an unprepared employee. Companies must implement out-of-band verification protocols for any urgent financial requests received by phone.
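The out-of-band rule above reduces to something simple: a payment instruction received by phone is never executed on the strength of the call alone, but only after confirmation through an independent, pre-registered channel. A minimal sketch, with a hypothetical threshold and a placeholder callback directory:

```python
# Hypothetical sketch of an out-of-band verification rule for phone-initiated
# transfer requests. The threshold and callback directory are illustrative.

PRE_REGISTERED_CALLBACKS = {
    "ceo": "+33-REGISTERED-NUMBER",  # number on file, NEVER the inbound caller's number
}

def requires_out_of_band(amount_eur: float, channel: str, threshold: float = 1000.0) -> bool:
    """Any phone-initiated transfer at or above the threshold needs a second channel."""
    return channel == "phone" and amount_eur >= threshold

def approve_transfer(amount_eur: float, channel: str, confirmed_via_callback: bool) -> bool:
    """Block the transfer until it is confirmed on the pre-registered channel."""
    if requires_out_of_band(amount_eur, channel):
        return confirmed_via_callback  # call back the registered number, not the caller
    return True

# An "urgent" €50,000 request by phone stays blocked until independently confirmed.
print(approve_transfer(50_000, "phone", confirmed_via_callback=False))  # False
```

The critical design choice is that the confirmation channel is chosen from records the company already holds, so a convincing cloned voice on the inbound call gains nothing.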

Liability in case of misuse

If you deploy a cloned voice for your customer service and a customer is misled about the artificial nature of the conversation, you may be held liable. Since January 2026, the EU AI Act requires that AI-generated content be clearly identified as such in interactions with consumers.

The legal framework in 2026

In Europe, three texts govern voice cloning: the GDPR, which treats voice recordings used to identify a person as biometric personal data; national personality rights protecting a person's voice and image; and the AI Act, with its transparency obligations for AI-generated content.

What contracts should include

If you use the voice of an actor or collaborator to create a voice model, sign a rights-transfer contract specifying the authorized uses (AI agents, advertising, e-learning), the duration (limited or perpetual), the territory, and the conditions for revocation. Without such a contract, the person can demand removal of the model, plus damages, at any time.
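The contract clauses above amount to a checklist, and it can be useful to enforce that checklist in tooling before any voice model is trained. A minimal sketch, with hypothetical field names:

```python
# Hypothetical sketch: validating that a voice rights-transfer record covers
# the clauses listed above (uses, duration, territory, revocation) before
# a voice model is allowed to be trained. Field names are illustrative.

from dataclasses import dataclass

ALLOWED_USES = {"ai_agents", "advertising", "e_learning"}

@dataclass
class VoiceRightsTransfer:
    authorized_uses: set    # subset of ALLOWED_USES
    duration: str           # e.g. "5 years" or "perpetual"
    territory: str          # e.g. "EU"
    revocation_terms: str   # conditions under which consent can be withdrawn

    def is_complete(self) -> bool:
        """True only if every clause is present and the uses are recognized."""
        return (
            bool(self.authorized_uses)
            and self.authorized_uses <= ALLOWED_USES
            and bool(self.duration)
            and bool(self.territory)
            and bool(self.revocation_terms)
        )

contract = VoiceRightsTransfer({"ai_agents"}, "5 years", "EU", "30-day written notice")
print(contract.is_complete())  # True
```

Gating model training on such a check is cheap insurance: an incomplete record fails loudly before the company has invested in a model it may later be forced to delete.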

"The voice is identity. Companies that treat voice cloning as a mere technical asset without legal dimension take considerable risks." — Lawyer specializing in digital law, Parisian firm

Best practices for responsible deployment