Online Safety

Non-Consensual Intimate Images: How to Find Them and Take Action

If intimate images of you have been shared online without your consent, you're not powerless. Learn how to find them, get them removed, and protect yourself.

By Reverse Face Editorial · 9 min read

The non-consensual sharing of intimate images — sometimes called "revenge porn" — is one of the most devastating forms of online abuse. According to the Cyber Civil Rights Initiative, approximately 1 in 8 U.S. adults has been a victim of non-consensual intimate image (NCII) distribution.

If this has happened to you — or if you're afraid it might — this guide will help you find where images exist, get them removed, and protect yourself going forward.

If you are in crisis: Contact the Cyber Civil Rights Initiative Crisis Helpline at 844-878-2274 or visit cybercivilrights.org. If you are in immediate danger, call 911.

The Scale of the Problem

NCII is not a rare occurrence:

  • The Cyber Civil Rights Initiative found that 93% of NCII victims reported significant emotional distress, and 51% reported suicidal thoughts.
  • A Data & Society Research Institute study found that roughly 10 million Americans have had intimate images shared without their consent.
  • The National Network to End Domestic Violence (NNEDV) reports that NCII is frequently used as a tool of domestic abuse and coercive control.
  • The UK Revenge Porn Helpline reported a 90% increase in cases between 2020 and 2023, driven in part by deepfake technology.

The problem extends beyond "revenge" by ex-partners. NCII can result from hacked devices, stolen photos, voyeurism, sextortion schemes, and AI-generated deepfakes.

How to Find Non-Consensual Intimate Images Online

Step 1: Search for Your Images

Traditional reverse image search tools match exact or near-exact copies of an image. But intimate images are often cropped, filtered, or embedded in videos, which defeats exact matching.

Reverse Face uses facial recognition to match your face across the public internet, even when images have been altered, cropped, or embedded in different contexts.
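
To see concretely why exact matching fails on altered copies, here is a minimal sketch, assuming Python 3 with the Pillow and imagehash packages installed (pip install pillow imagehash). A cryptographic hash changes completely after even a small crop, while a perceptual hash stays close; face recognition extends the same similarity idea further. This illustrates the general principle, not Reverse Face's actual matching pipeline.

```python
# Sketch: exact hashing vs. perceptual hashing on a cropped image.
import hashlib

import imagehash
from PIL import Image, ImageDraw

# Build a sample image, then a slightly cropped copy of it.
original = Image.new("RGB", (400, 400), "white")
ImageDraw.Draw(original).ellipse((50, 50, 350, 350), fill="navy")
cropped = original.crop((20, 20, 390, 390))

# Exact (cryptographic) hashes: any alteration changes them completely.
exact_a = hashlib.sha256(original.tobytes()).hexdigest()
exact_b = hashlib.sha256(cropped.tobytes()).hexdigest()
print("exact hashes match:", exact_a == exact_b)  # False

# Perceptual hashes: visually similar images stay close together.
# Subtracting two hashes gives a Hamming distance; small means "same image".
distance = imagehash.average_hash(original) - imagehash.average_hash(cropped)
print("perceptual hash distance:", distance)  # small (often 0 here)
```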

Step 2: Document Everything

Before requesting removal, create a thorough evidence record:

  • Screenshot every page where the image appears (include the URL bar)
  • Save the full URL of each page
  • Record the date and time you discovered the content
  • Capture any identifying information about the uploader
  • Use the Wayback Machine (web.archive.org) to preserve independent copies (a scripted approach is sketched after this step)

The Federal Trade Commission emphasizes that documentation is critical for both legal proceedings and platform takedown requests.
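
If you are comfortable running a short script, the sketch below automates two of these steps: it asks the Wayback Machine's public "Save Page Now" endpoint to snapshot a URL, then appends a timestamped row to a local CSV log. It assumes Python 3 with the requests package installed; the filename and CSV layout are illustrative choices, not a required format, and it is a supplement to screenshots, not a replacement.

```python
# evidence_log.py — minimal sketch of an evidence-logging helper.
import csv
from datetime import datetime, timezone

import requests

EVIDENCE_FILE = "ncii_evidence_log.csv"  # hypothetical local filename

def archive_and_log(url: str) -> None:
    """Ask the Wayback Machine to snapshot `url`, then record the attempt."""
    discovered_at = datetime.now(timezone.utc).isoformat()

    # "Save Page Now": requesting web.archive.org/save/<url> asks the
    # Wayback Machine to capture a fresh snapshot of the page.
    resp = requests.get(f"https://web.archive.org/save/{url}", timeout=60)

    # The snapshot's path is usually echoed in a response header;
    # leave the field blank if it is absent.
    snapshot = resp.headers.get("Content-Location", "")
    snapshot_url = f"https://web.archive.org{snapshot}" if snapshot else ""

    with open(EVIDENCE_FILE, "a", newline="") as f:
        csv.writer(f).writerow([discovered_at, url, resp.status_code, snapshot_url])

if __name__ == "__main__":
    archive_and_log("https://example.com/page-where-image-appears")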

Step 3: Request Removal

Most major platforms and search engines have specific NCII removal processes:

  • Google: Submit a removal request through their content removal tool.
  • Meta (Facebook, Instagram): StopNCII.org partnership allows hash-based blocking (illustrated below).
  • Reddit: Dedicated form for reporting involuntary intimate imagery.
  • X, TikTok, Snapchat, Pornhub: All have NCII reporting processes.

The Cyber Civil Rights Initiative maintains a comprehensive removal guide with direct links for each platform.
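
To make "hash-based blocking" concrete: StopNCII.org computes a fingerprint of the image on your own device and shares only that fingerprint with participating platforms; the image itself is never uploaded. The sketch below uses a plain SHA-256 digest purely to illustrate that privacy model. StopNCII actually uses perceptual hashing (PDQ), so altered copies still match, as discussed under Step 1.

```python
# Sketch of the hash-based blocking privacy model: only the fingerprint
# leaves your device. StopNCII.org really uses perceptual hashing (PDQ),
# not SHA-256; this just shows the data flow.
import hashlib
from pathlib import Path

def local_fingerprint(image_path: str) -> str:
    """Hash the image file locally; only this digest would be submitted."""
    return hashlib.sha256(Path(image_path).read_bytes()).hexdigest()

if __name__ == "__main__":
    # Hypothetical filename; the image itself is never transmitted.
    print("Fingerprint to submit:", local_fingerprint("private_photo.jpg"))
```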

U.S. State Laws

As of 2025, 49 of the 50 U.S. states plus Washington, D.C. have laws criminalizing NCII distribution. The National Conference of State Legislatures (NCSL) tracks these laws; penalties range from misdemeanors to felonies.

Federal Legislation

The TAKE IT DOWN Act, signed into law in 2025, requires platforms to remove NCII — including AI-generated deepfakes — within 48 hours of receiving a complaint.

International Protections

  • GDPR (EU): Sharing intimate images without consent is unlawful processing of personal data under the GDPR, and Article 17 (the right to erasure) lets you demand their deletion.
  • UK Online Safety Act: Criminalizes sharing intimate images without consent, including deepfakes.
  • Australia's eSafety Commissioner: Can order platforms to remove NCII and issue civil penalties.

The Deepfake Dimension

AI-generated deepfakes have added a disturbing new layer. The World Economic Forum identified AI-generated misinformation as a top-10 global risk in 2024, and the FBI has warned that sextortion schemes increasingly use AI to create explicit content from innocent social media photos.

This makes proactive monitoring critical. Continuous face monitoring alerts you whenever your face is detected on a new website.
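
Conceptually, continuous monitoring boils down to a loop: run a face search on a schedule, compare the results against URLs you have already seen, and alert on anything new. The sketch below shows that loop; everything named in it (the run_face_search placeholder, the state file) is hypothetical and is not Reverse Face's actual API.

```python
# Conceptual sketch of continuous face monitoring: search on a schedule,
# diff against previously seen URLs, alert on anything new.
# `run_face_search` is a hypothetical placeholder for whatever
# face-search provider you use; it is NOT a real Reverse Face API.
import json
import time
from pathlib import Path

SEEN_FILE = Path("seen_urls.json")  # hypothetical local state file
CHECK_INTERVAL = 24 * 60 * 60       # seconds between checks (daily)

def run_face_search() -> set:
    """Placeholder: return the set of URLs where your face was detected."""
    raise NotImplementedError("plug in your face-search provider here")

def monitor() -> None:
    seen = set(json.loads(SEEN_FILE.read_text())) if SEEN_FILE.exists() else set()
    while True:
        new_urls = run_face_search() - seen
        for url in sorted(new_urls):
            print(f"ALERT: face detected on a new page: {url}")
        seen |= new_urls
        SEEN_FILE.write_text(json.dumps(sorted(seen)))
        time.sleep(CHECK_INTERVAL)

if __name__ == "__main__":
    monitor()
```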

Where to Get Help

  • Cyber Civil Rights Initiative: Free crisis helpline at 844-878-2274
  • StopNCII.org: Create hashes of intimate images to prevent distribution
  • National Domestic Violence Hotline: 1-800-799-7233
  • FBI IC3: Report sextortion and NCII-related cybercrime at ic3.gov
  • NCMEC (National Center for Missing & Exploited Children, for minors): Report CSAM at missingkids.org

Protecting Yourself Going Forward

  • Run regular face searches using Reverse Face
  • Enable continuous monitoring
  • Review social media privacy settings — limit who can see your photos
  • Use StopNCII.org proactively
  • Enable two-factor authentication on all accounts with personal images
  • Be cautious with intimate content — even "disappearing" messages can be captured

The Bottom Line

Non-consensual intimate image sharing is a crime in almost every U.S. state and many countries worldwide. If you're a victim, you have legal rights, platform removal options, and free support resources. With tools like Reverse Face, you can proactively discover where your image appears and take action before it spreads further.

Your image belongs to you. You deserve to control how it's used.