
Who Owns Your Digital Self?

A New Identity Line

By Dr. Mozelle Martin · Published about 7 hours ago · 3 min read

Denmark is preparing legislation that assigns legal ownership of identity traits to the people who carry them. This includes the face, the voice, and the physiological patterns that algorithms can duplicate with high confidence. I have examined synthetic media cases where cloned voices triggered panic inside families and where victims struggled to prove that footage circulating online was artificial. When identity becomes copyable at industrial scale, the legal system faces problems it was never built to manage.

The proposal treats human likeness as property. That phrasing is more than symbolic. Property law has sharper enforcement tools than most privacy laws. A property violation can move through courts with clearer standards and fewer loopholes than complaints about unauthorized data use. It forces developers, corporations, and content creators to secure permission before using a person’s features. The requirement may slow certain models, but it aligns responsibility with those who build and deploy the tools.

The expansion of deepfake tools has been rapid. A commonly cited figure is a 900% increase over five years. I have watched the forensic environment shift from simple audio splicing to nearly perfect voice clones that reproduce breath patterns, pacing quirks, and the natural tremor in the throat. Video models can rebuild micro-expressions that usually require trained observers to detect. These systems alter the value of biometric evidence and challenge long-standing assumptions in law-enforcement training.

Neuroscience research shows that humans depend on facial and vocal patterns to evaluate threat and trust. When those cues can be counterfeited, the nervous system loses reliable reference points. In trauma therapy, I have seen clients destabilize when they cannot trust what their senses report. Synthetic media increases the risk for those who already manage distorted threat perception. It also affects individuals involved in domestic violence, stalking cases, or high-conflict situations, where fabricated evidence can escalate danger quickly.

Digital identity theft used to center on Social Security numbers, forged signatures, or misplaced documents. Now the theft targets sensory materials the body never evolved to protect. When someone uses your voice to deceive your family or your face to place you inside a fabricated event, the harm extends beyond reputation. It affects safety, legal standing, and psychological stability. I have observed cases where victims spent days trying to convince employers or relatives that the content was fake.

Denmark’s proposal creates a boundary that technology companies have resisted for years. Many have scraped millions of images and voice clips from public platforms to train their systems. Users rarely had the chance to refuse. The absence of a limit allowed identity traits to be treated as raw fuel for commercial and research purposes. Assigning ownership removes the idea that the human body is a communal resource for machine learning.

There is also a forensic advantage to this approach. Courts struggle when synthetic media contaminates evidence. Biometric markers lose their authority when they can be replicated convincingly. A clear ownership rule helps define what counts as misuse, which helps prosecutors and defense teams argue cases with more precision. It gives victims a legal foothold instead of leaving them to navigate abstract privacy claims that rarely result in accountability.

Some observers view Denmark’s approach as digital control. Others view it as digital autonomy. I see it as an attempt to stabilize identity in a world where replication tools move faster than psychological and legal adaptation. Humans rely on recognition patterns for safety, communication, and social functioning. When those patterns become editable, identity loses its anchor. Law must step in to restore some form of reliability.

The interesting question is what this move opens next. If a person owns their likeness, further questions follow:

  • What about their gait signature, collected by security cameras?
  • What about voiceprint data stored by smart devices?
  • What about behavioral traces that reveal personality traits more accurately than self-reporting?

Identity extends beyond appearance, and this proposal suggests that future ownership claims may reach into those areas.

Denmark is not trying to freeze technology. It is trying to prevent identity from becoming so porous that consent becomes meaningless. The line being drawn is not about limiting innovation. It is about protecting the stability humans need to function in their communities. The modern digital environment demands limits that match the scale of the tools reshaping it. Whether other countries draw similar lines will determine how much control individuals keep over the parts of themselves that machines can now copy.

Sources That Don’t Suck

The Guardian. (2024). Denmark to address AI misuse by granting citizens copyright over their own features. Guardian News and Media.

Associated Press. (2024). Denmark bill proposes legal control of personal likeness and restricts unauthorized deepfakes. AP Publishing.

Time Magazine. (2024). Denmark’s move toward copyright-style protection for image and voice to counter synthetic media misuse. Time USA, LLC.


About the Creator

Dr. Mozelle Martin

Behavioral analyst and investigative writer examining how people, institutions, and narratives behave under pressure—and what remains when systems fail.

