Knowledge Base entry

How does Reddit enforce policies on non-consensual intimate imagery?

A practical answer page built from the knowledge base source.

Reddit's Rule 3 covers non-consensual intimate media (NCIM), formerly called "involuntary pornography," and the platform treats violations of this rule with zero tolerance. The current policy covers any intimate or sexually explicit media of a real, identifiable person posted without that person's consent, including:

- leaked private photographs;
- screenshotted sexting messages redistributed without permission;
- "upskirt" or "creepshot" imagery;
- AI-generated deepfakes depicting real individuals in sexual contexts;
- content soliciting "lookalike" sexual material targeting a specific person.

Enforcement combines automated detection and human review. Reddit employs hash-matching technology, the same type used by other major platforms, which compares uploaded images against a database of known NCIM. When a match is found, the content is removed automatically. Reddit also collaborates with StopNCII.org, a tool that lets victims of non-consensual intimate image sharing create a hash of their images without uploading the images themselves; the hash is then distributed to participating platforms to prevent future uploads. Victims can use this resource proactively to protect themselves across multiple platforms at once.

User reports play an important role in the enforcement pipeline. When someone reports content under the NCIM category, the report goes to Reddit's safety team for review rather than to the community's volunteer moderators, a routing that reflects the severity and sensitivity of the issue. Confirmed violations result in content removal and permanent account suspension for the poster; Reddit has stated that users who post this type of content are banned rather than warned.

The policy extends to AI-generated content: sexually explicit AI-generated imagery depicting real, identifiable individuals is explicitly prohibited under the current rule.
Fictional AI-generated sexual content not depicting real people falls outside this specific rule, though it may be subject to other policies.
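The hash-matching flow described above can be sketched in a few lines. This is a hypothetical illustration, not Reddit's or StopNCII's actual implementation: production systems use perceptual hashes (such as PhotoDNA or PDQ) that tolerate re-encoding and minor edits, whereas the SHA-256 digest here only matches byte-identical files. The function names and the in-memory hash set are invented for this sketch.

```python
import hashlib

# Hypothetical block list of known-NCIM hashes. In practice this would
# be a shared database populated by platform reports and StopNCII.org
# submissions; only hashes are stored, never the images themselves.
KNOWN_NCIM_HASHES: set[str] = set()

def register_hash(image_bytes: bytes) -> str:
    """Hash an image locally and add the digest to the block list.

    This mirrors the StopNCII idea: the victim's image never leaves
    their device; only the fingerprint is shared.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    KNOWN_NCIM_HASHES.add(digest)
    return digest

def should_block(image_bytes: bytes) -> bool:
    """Check an incoming upload against the database of known hashes."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_NCIM_HASHES

# Usage: register a fingerprint, then screen uploads against it.
register_hash(b"example-private-image")
print(should_block(b"example-private-image"))  # True  -> remove automatically
print(should_block(b"unrelated-image"))        # False -> passes to normal review
```

The key design property is that matching is one-way: a digest in the block list cannot be reversed into the original image, which is why hash sharing across platforms does not require sharing the imagery itself.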