Your Deepfake Ex Is Now a Crime in 14 States — Here’s Where It Isn’t

Imagine waking up to find a realistic video of yourself online — one you never made, never agreed to, and never knew existed. Your face. Your name. But not your actions. This is the nightmare that thousands of people across the United States face because of deepfake technology, and the law is only just starting to catch up.

As of today, 14 states have passed specific laws making it a crime to create or share non-consensual deepfake intimate images — sometimes called “deepfake porn” or a new form of revenge porn. But if you live outside those states, you may have very little legal protection. Here’s what you need to know about where the law stands, what it covers, and what still needs to change.

What Is a Deepfake, Exactly?

A deepfake is a piece of synthetic media — usually a photo or video — created using artificial intelligence. The technology can take a real person’s face and place it onto someone else’s body, making it look incredibly realistic. In the wrong hands, this tool is used to create fake intimate or sexual content featuring real people without their knowledge or consent.

Unlike traditional edited photos, deepfakes are convincing enough to fool friends, family members, employers, and even some law enforcement officials. That’s what makes them so dangerous — and so damaging to victims.

The 14 States That Have Taken Action

The following states have enacted laws that specifically address non-consensual deepfake intimate images. The exact language and penalties vary — some states wrote standalone statutes, while others extended existing revenge porn laws to cover AI-generated content — but all of them explicitly bring synthetic imagery within the reach of the law:

  • California — One of the first states to act, California has laws targeting both the distribution of non-consensual intimate deepfakes and their use in political misinformation.
  • Texas — Texas criminalizes the creation and sharing of synthetic sexual imagery without consent.
  • Virginia — Virginia broadened its existing revenge porn laws to include computer-generated or digitally altered images.
  • Georgia — Georgia’s law covers intimate deepfakes shared with the intent to harm or harass.
  • Hawaii — Hawaii targets the disclosure of non-consensual deepfake intimate content.
  • Illinois — Illinois law addresses both the sharing and the creation of deepfake sexual content.
  • Minnesota — Minnesota criminalizes non-consensual distribution of realistic synthetic intimate images.
  • New York — New York gives victims a civil right to sue creators and distributors of deepfake intimate images.
  • North Carolina — North Carolina updated its revenge porn statute to include AI-generated content.
  • South Dakota — South Dakota includes digitally altered images in its non-consensual porn laws.
  • Arizona — Arizona added deepfake sexual content to its existing laws on sexual exploitation.
  • Indiana — Indiana criminalizes the creation and sharing of non-consensual intimate synthetic media.
  • Tennessee — Tennessee passed laws targeting the use of AI to create intimate images of real people without consent.
  • Washington — Washington state has some of the broadest laws in the country, covering both criminal penalties and civil remedies for victims.

It’s worth noting that the strength and scope of these laws differ significantly from state to state. Some only address sharing the content, not creating it. Some require proving intent to harm, which can be difficult. And civil versus criminal remedies offer very different levels of protection.

Where Victims Have Little to No Protection

If you live in one of the remaining 36 states that haven’t passed specific deepfake intimate image laws, your legal options are limited. You might be able to pursue action under:

  • Existing revenge porn laws (if your state has them), though many don’t cover AI-generated images
  • General harassment or stalking statutes
  • Copyright law, in rare cases where you own the original images used
  • Civil lawsuits for defamation or emotional distress

The problem is that none of these were designed with deepfakes in mind. They often fall short because prosecutors and judges must work within frameworks built for very different types of harm. Victims are frequently told there’s nothing law enforcement can do.

Is There a Federal Law?

As of now, there is no comprehensive federal law specifically targeting non-consensual deepfake intimate images. However, Congress has been working on it. The DEFIANCE Act (Disrupt Explicit Forged Images And Non-Consensual Edits) was introduced in the U.S. Senate and received significant bipartisan support. If passed, it would create a federal civil cause of action for victims of non-consensual intimate deepfakes, allowing them to sue creators and distributors in federal court.

There is also the NO FAKES Act, which targets the use of AI to replicate a person’s voice or likeness without permission — though it has a broader entertainment industry focus. Advocates argue that a dedicated federal criminal law is still needed to fill the gaps that state patchwork legislation leaves behind.

Who Is Being Targeted?

Research and reporting consistently show that the overwhelming majority of deepfake intimate image victims are women. Studies have found that more than 90% of non-consensual deepfake content targets women — often created by ex-partners, acquaintances, or strangers online. This is why the term “deepfake ex” has entered the public conversation.

But it’s not only adults who are at risk. There have been documented cases involving minors, as well as public figures such as celebrities and politicians. Some victims are targeted for financial extortion. Others are harassed simply out of spite or for entertainment within online communities dedicated to creating this kind of content.

Why Laws Alone Aren’t Enough

Even in states where deepfake intimate images are illegal, enforcement is a serious challenge. Many deepfakes are created and shared on overseas platforms that are difficult for U.S. law enforcement to reach. Identifying anonymous creators requires technical expertise and resources that most local police departments don’t have. And victims often face a painful burden — having to prove the image is fake and that they didn’t consent, sometimes while the content continues to spread.

Tech companies also play a role. While some platforms like Google and Meta have policies against non-consensual intimate images, takedown processes can be slow, and content often reappears on other sites. Advocacy groups are pushing for stronger platform accountability, including faster removal tools and proactive detection systems.

What You Can Do If You’re a Victim

If you discover that a deepfake intimate image of you has been created or shared, here are some steps you can take:

  • Document everything — Take screenshots with timestamps before attempting to have content removed.
  • Report to the platform — Most major social media and video platforms have reporting tools for non-consensual intimate content.
  • Contact law enforcement — Even if your state doesn’t have a specific deepfake law, report it to local police and the FBI’s Internet Crime Complaint Center (IC3).
  • Reach out to advocacy organizations — Groups like the Cyber Civil Rights Initiative offer free support, resources, and crisis helplines for victims.
  • Consult an attorney — An attorney familiar with digital privacy and cyber harassment law can help you understand your options, including civil suits.
  • Use image removal tools — Google, Microsoft Bing, and StopNCII.org offer tools to help prevent intimate images from appearing in search results or spreading further.

The Bigger Picture: Technology Is Moving Faster Than the Law

The deepfake problem is part of a larger challenge: artificial intelligence is advancing at a speed that lawmakers, courts, and regulators simply aren’t built to match. What was once a niche technology used by film studios and researchers is now accessible to almost anyone with a smartphone and an internet connection. Free and low-cost deepfake apps are widely available, making it easier than ever to create convincing fake content in minutes.

This isn’t just a legal problem — it’s a social one. It requires action from technology companies, platforms, educators, and policymakers working together. Laws matter, but they’re only one piece of a much larger puzzle.

Looking Ahead

The momentum is real. More states are working on legislation, and federal bills are gaining traction. Public awareness is growing, partly because high-profile cases involving celebrities have brought the issue to national attention. Survivors are also speaking out, pushing back against the shame that has historically kept victims silent.

Still, if you live in one of the 36 states without specific protections, you are largely on your own — and that needs to change. The gap between where the law is and where it needs to be is still wide, and for victims living in that gap, the consequences are very real.

Staying informed about your state’s laws, knowing your options, and supporting stronger legislation are the most important things you can do — whether or not you’ve been directly affected. Because in an age where anyone’s face can be used without their permission, this is everyone’s issue.
