Consent & Digital Likeness

When your voice, face, name, performance, style, or persona is replicated without permission, the first needs are orientation, careful record-keeping, and clarity about where support ends and legal help begins.

Creator crisis

Your identity or creative persona was copied

A performer, voice actor, artist, writer, designer, educator, journalist, podcaster, or public creator sees their likeness, voice, name, identity, or recognizable creative persona replicated, scanned, simulated, cloned, or commercially reused without meaningful consent.

What This Issue Means

  • Consent and digital likeness cover permission, attribution, voice, face, name, identity, performance capture, style or persona concerns, digital twins, and synthetic media.
  • Copyright may matter, but it may not fully address voice, likeness, identity, right-of-publicity, platform, contract, union, safety, or privacy questions.
  • The practical first move is to preserve evidence and separate urgent safety needs from longer legal or policy questions.

Source-Safe Current Landscape

  • Voice, likeness, and identity are creator-rights concerns that copyright alone may not fully address.
  • Public authority, union/guild, and platform-policy source categories treat unauthorized digital replicas and synthetic-media disclosure as serious issues, but rules vary by context.
  • The STC evidence base connects digital identity misuse to AI, contracts, safety, harassment, attribution, and compensation.

Source footing: Grounded in the Creator Rights PRD, Source Matrix Row 2, Row 3, Row 10, congressional source categories, Copyright Office digital-replica source categories, guild source categories, and platform synthetic-media source categories.

What STC Advocates

STC advocates consent, compensation, disclosure, usage limits, secure records, and creator control over how personal attributes, performances, and recognizable creative identities are captured, stored, simulated, and reused.

Demand 7: Explicit Consent for Digital Likeness

Demand 8: Protection Against Digital Twins

Demand 9: Standards for Digital Replication

Demand 2: Transparent Usage Tracking

What Creators Can Do Right Now

  • Preserve links, files, screenshots, upload dates, platform notices, contracts, permissions, correspondence, and any record of consent or refusal.
  • If the issue involves intimate images, threats, stalking, doxxing, or immediate danger, prioritize emergency, crisis, legal, workplace, or platform safety resources before advocacy research.
  • Read the copyright and AI training pages for adjacent questions about legal protection, datasets, training, and derivative uses.
  • Avoid assuming every style imitation is illegal or that copyright automatically protects face, voice, persona, or style.
  • Sign the Declaration to support explicit consent, digital-twin protections, and replication standards.

Evidence Connection

  • Safety and Harassment has 70 evidence items across 40 of 43 discipline sheets.
  • Audio, performer, visual, writing, design, educator, journalist, and public-creator evidence all connect to synthetic voice, scanning, attribution, and identity misuse.
  • The consent lane also connects to provenance, platform terms, contracts, copyright, and AI training.

Rights education, not legal advice

These pages offer general creator-rights education and advocacy orientation. Individual disputes depend on facts, contracts, jurisdiction, platform rules, and current law. Use this as a starting point, preserve records, and seek qualified legal help for individual claims.

If you are in immediate danger or dealing with crisis, treat that as the priority and use emergency, crisis, legal, workplace, or platform safety resources before continuing.

Demand Consent for Digital Likeness

Add your voice to the call for consent, disclosure, compensation, and standards for digital replication.