
Digital replicas at work: protecting performers in the age of AI cloning

November 28, 2025

For decades, performers and their unions have fought to secure recognition and protection for their creative work. Today, a new frontier of risks and opportunities is opening up for performers in the form of digital replicas. Digital replicas are computer-generated imitations of real people created with artificial intelligence systems or equivalent digital technologies. In 2023, the US actors’ union SAG-AFTRA negotiated new contract clauses for the use of replicas in film and television, confirming that replicas have become part of mainstream production.

Digital replicas are most commonly used to enhance or replace the work of actors, especially voice actors, in film, gaming, publishing and advertising, but early adoption is also underway in fashion and music. For many performers worldwide, replicas may soon determine whether work remains empowering and rewarding or becomes exploitative and precarious.

Digital replicas are also referred to as digital doubles, avatars, voice clones, digital twins, or deepfakes, each expression carrying different connotations. Industry and policymakers increasingly prefer “digital replica”, which suggests professional and consensual use, while “deepfake” remains strongly associated with online abuse and deception. The lack of consistent language reflects both the novelty of the technology and unsettled industry practices.

Whether a digital replica is a risk or an opportunity for a performer depends on the degree of control the artist retains over its creation, and over this new class of “assets” capable of reproducing their likeness and skills at scale without their direct physical participation. To understand how such control can be preserved, we must examine how replicas are created and what legal and business frameworks currently govern their use.

The creation of a digital replica, more commonly referred to as “digital cloning”, usually follows four key steps (a code sketch after the list illustrates steps 2 to 4):

  1. Selecting a cloning tool: Cloning tools are advanced computer programs designed to recreate a person’s likeness and movement. A performer or their engager can adopt an off-the-shelf cloning tool, typically provided through a subscription platform, or commission a tailored system built in-house or by an external developer. In most cases, the provider’s terms of service determine who holds control over the materials used and outputs generated during the cloning process.
  2. Collecting and processing likeness data: Voice recordings, images, video, or motion-capture scans of the performer are gathered. These materials often need to be converted into a format compatible with the cloning tool, creating further assets such as biometric templates and other technical artefacts. This new ‘likeness data’ will later be used for fine-tuning.
  3. Fine-tuning the tool to the performer’s likeness: The AI tool is calibrated on the performer’s previously collected data, resulting in a fine-tuned model. A fine-tuned model is a persistent digital asset capable of generating high-fidelity imitations of a specific performer, because their likeness and skills have been encoded into the program through fine-tuning.
  4. Generating outputs in the likeness of the performer: The fine-tuned model produces images, videos, or voice recordings in the likeness of the performer when prompted. Often those outputs will need further editing or processing before their integration into new media, products or services, such as films, training content, or chatbots.
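
To make the workflow concrete, here is a minimal sketch of steps 2 to 4 in Python. It assumes a hypothetical provider SDK, `cloning_sdk`, standing in for the platform chosen in step 1; real provider APIs differ, but the overall shape (upload likeness data, fine-tune the model, generate on demand) is broadly similar.

```python
# Illustrative sketch only. `cloning_sdk` is a hypothetical client standing in
# for the off-the-shelf platform chosen in step 1; it is not a real package,
# and real provider APIs will differ.
from pathlib import Path

import cloning_sdk  # hypothetical provider SDK (assumption, not a real library)

# Step 1 (contract point): signing up to the platform usually means accepting
# the provider's terms of service, which govern the assets created below.
client = cloning_sdk.Client(api_key="YOUR_API_KEY")

# Step 2: collect and process likeness data. Raw session recordings are
# uploaded and converted by the provider into training-ready assets
# (e.g. biometric templates and other technical artefacts).
samples = [p.read_bytes() for p in Path("session_recordings").glob("*.wav")]
dataset = client.create_dataset(name="performer-voice-v1", samples=samples)

# Step 3: fine-tune the provider's base model on the performer's data.
# The result is a persistent, re-usable digital asset tied to one person.
model = client.fine_tune(base_model="voice-base-v2", dataset_id=dataset.id)

# Step 4: generate outputs in the performer's likeness on demand. Nothing in
# the code limits how often this runs; any limits must come from the contract.
audio = client.generate(model_id=model.id, text="Any line of dialogue, any time.")
Path("output.wav").write_bytes(audio)
```

Note that the performer appears in this flow only at step 2, as the source of the recordings; every other decision point sits with whoever holds the account, which is the imbalance discussed below.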

In practice, the digital cloning process is rarely initiated or shaped by the performer. Employers, commissioners, or clients usually decide whether a replica will be created, select the technology provider, and define the specifications of use. Performers are typically only involved later, often once likeness data needs to be captured. As a result, digital replicas are more often wrapped into existing work arrangements under terms set by others, rather than being tools that performers choose and integrate into their own practice. This imbalance in decision-making power is a major reason why many of the risks associated with replicas arise in the first place.

The moment when performers are most at risk of losing control over their digital selves comes with the creation and use of the fine-tuned model. Unlike a recording, which is fixed, a model is a re-usable asset capable of generating an unlimited number of near like-for-like performances in a person’s voice or physical appearance. Whoever owns and controls this model effectively decides how a performer’s likeness and skills are used digitally in the future.

Determining the ownership of fine-tuned models is not straightforward, as they combine several assets subject to different rights, held by different parties.

  • The fine-tuned model is effectively a piece of software, protected by copyright and generally owned by the technology provider. In legal terms, a fine-tuned model is often treated as a derivative copyright work of the provider’s tool.
  • The performer’s recorded performances are also encoded into the fine-tuned model. This may attract performers’ intellectual property rights in certain limited contexts, as performers’ rights do not typically extend to imitations, digital or otherwise. Whether developing or fine-tuning cloning tools using protected performances infringes performers’ economic or moral rights remains legally unclear and untested in court (at the time of writing).
  • The performer’s likeness and personal data encoded into the fine-tuned model, including sensitive or biometric information, can be protected by personal data regulations such as the GDPR and, in some jurisdictions, by likeness rights. Note that neither set of rights is subject to international harmonisation, so their enforcement in contracts involving cross-border collaborations can be a challenge.

Because statutory rights over fine-tuned models and their outputs are fragmented and uncertain, contracts often determine who controls replicas in practice. This contract-based approach typically disadvantages performers.

Common patterns include:

  • Perpetual buy-outs requiring performers to assign intellectual property, likeness, and data rights indefinitely;
  • Non-negotiable terms granting commissioners broad and long-term control over the fine-tuned models and the commercialisation of their outputs, with little possibility of termination or renegotiation.

Performers are often presented with these terms on a “take it or leave it” basis, sometimes as a condition of employment. This creates a tension between the commercial rights of the commissioner and the privacy and dignity of the performer.

In this context, performers and unions should push for agreements that (see the illustrative schema after this list):

  1. Treat contracts for digital replicas as distinct from standard performance contracts, recognising the unique risks and long-term implications of replica assets.
  2. Distinguish clearly between rights to likeness data, rights to the fine-tuned model, and rights to outputs.
  3. Establish clear rules on access, storage, and security of fine-tuned models, including restrictions on third-party use.
  4. Provide clear termination rights and time limits to prevent perpetual exploitation of fine-tuned models without opportunities to withdraw consent or renegotiate.
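
To make these four principles concrete, the sketch below records them as a simple data structure. This is an illustrative schema only, not contract language; every field name and example value is an assumption introduced for illustration.

```python
# Illustrative schema only: a machine-readable summary of the terms a replica
# agreement should address. All field names and example values are assumptions
# for illustration; this is not legal advice or contract wording.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ReplicaAgreementTerms:
    # Principle 1: a replica agreement distinct from (but linked to)
    # the standard performance contract.
    performance_contract_ref: str

    # Principle 2: rights to likeness data, the fine-tuned model, and its
    # outputs are dealt with separately, not swept into a single buy-out.
    likeness_data_licence: str   # e.g. "non-exclusive, this project only"
    model_rights: str            # e.g. "provider-hosted, performer-approved uses"
    output_licence: str          # e.g. "this production and its trailers"

    # Principle 4: a hard time limit and exit rights rather than a perpetual buy-out.
    expiry: date
    performer_may_terminate: bool = True
    renegotiation_notice_days: int = 90

    # Principle 3: access, storage, and security rules for the model itself.
    authorised_users: list[str] = field(default_factory=list)
    third_party_use_allowed: bool = False
    storage_policy: str = "encrypted at rest, access-logged, no export"

# Hypothetical example of recording one performer's terms:
terms = ReplicaAgreementTerms(
    performance_contract_ref="PC-2025-0142",
    likeness_data_licence="non-exclusive, this project only",
    model_rights="provider-hosted; each new use needs written approval",
    output_licence="this production and its trailers",
    expiry=date(2028, 12, 31),
    authorised_users=["studio-post-production"],
)
```

Even this toy structure makes the central point visible: the likeness data, the fine-tuned model, and its outputs are three different assets, each needing its own terms and its own end date.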

Looking ahead, the sustainable integration of digital replicas in the creative sectors will require moving towards a performer-centred model, where actors are firmly in control of their own replicas and the technology choices that underpin them. In this vision, a performer’s replica becomes part of their professional practice and the services they offer, rather than a tool imposed through external decisions. Industry practices, business models, and technology access will need to evolve to accommodate this shift. Such a framework would strike the best balance: allowing engagers and commissioners to access the efficiency and creative potential of the technology, while ensuring performers remain protected from the risks of losing control over their digital selves.

#AI   #deepfake   #digital cloning   #digital replica   #performers  

About the author

Mathilde Pavis

Dr Mathilde Pavis is an expert in intellectual property law, ethics and new technologies. Mathilde has 10+ years’ experience in research and training. Her work has been cited by the UK Parliament to inform new laws on artificial intelligence. Mathilde now advises governments, organisations and businesses on the impact of AI on intellectual property and sensitive data. She writes for The Times and The Independent about law and new technologies, and has appeared on national programmes including Channel 4 News and BBC Radio 4 to discuss AI regulation. She has written 50+ academic and online publications. Dr Pavis is a legal adviser and the founder of Replique, an advisory firm specialising in AI and digital replicas for the creative industries. Follow Mathilde on LinkedIn for industry updates and free resources on the topic.


This project and publication have received the support of the European Union. The publication reflects the views of the authors only and the European Commission cannot be held responsible for any use of the information contained therein.