Definition
Non-Consensual Intimate Imagery – Non-consensual image sharing, or non-consensual intimate image sharing (also called “non-consensual explicit imagery” and colloquially known as “revenge porn”), refers to the act or threat of creating, publishing, or sharing an intimate image or video without the consent of the individuals depicted in it.
Related Terms
Revenge Porn, Image-Based Sexual Abuse (IBSA), Sextortion (if NCII is used to extort), Deepfakes.
Background
NCII in the Fediverse involves the sharing, distribution, or threat of distribution of private, intimate, or sexually explicit images or videos of an individual without their consent. This material can be real or, increasingly, synthetically generated (e.g., “deepfakes”). The decentralised and often pseudonymous nature of parts of the Fediverse can make it challenging to trace the origin of NCII and to ensure its complete removal once it has been shared. Volunteer moderators may become aware of NCII through reports from victims, concerned community members, or by encountering such material posted publicly. Dealing with NCII requires immediate, decisive action due to the severe emotional distress and harm it causes to victims, as well as its potential illegality in many jurisdictions.
Why We Care
NCII is a profound violation of an individual’s privacy, dignity, and sexual autonomy, often causing severe and lasting psychological, emotional, social, and sometimes financial harm to victims. The presence of NCII on a platform creates an unsafe and hostile environment, normalises sexual violence, and can deter individuals, particularly women and marginalised groups who are disproportionately targeted, from participating in online life. Failure to act swiftly and decisively against NCII not only fails victims but also damages the community’s reputation, erodes trust, and may carry legal implications for the platform or service provider. A zero-tolerance approach is essential.
Spotting NCII: What to Look For
Identification of NCII involves recognising the presence of intimate or sexually explicit imagery or videos shared without the consent of the person depicted.
Account Traits: The account posting NCII might be a throwaway account, an account with a history of abusive behaviour, an ex-partner of the victim, or an account specifically created to harass or shame an individual. In some cases, it may be an account that has been compromised.
Content Characteristics: Look for posts containing nude or sexually explicit images or videos of identifiable individuals. Context is key: is the material clearly private in nature, and is there any indication of consent for its public distribution within your community? Threats to share such material also fall under this category. Be aware of deepfakes or manipulated imagery designed to appear real.
Posting Patterns: NCII might be posted in a targeted way to harass an individual known within the community, or distributed more widely. It may be accompanied by derogatory comments, shaming language, or personal information about the victim (doxxing).
Behaviour: The account posting NCII often demonstrates clear malicious intent to harm, shame, or control the victim. They may show no regard for the victim’s distress or the rules of the community.
Key Questions for Assessment:
- “Does the content depict an identifiable individual in a private, sexually explicit, or intimate manner?”
- “Is there any evidence or clear indication that the individual depicted has consented to this specific image/video/audio being shared publicly in this context?” (If consent for public sharing is not explicitly evident, assume it is absent.)
- “Has a report been made by the person depicted, or someone on their behalf, stating the content is non-consensual?”
- “Is the sharing of the material accompanied by threats, harassment, or shaming?”
Before You Act: Common Pitfalls & Nuances
It is crucial to act with extreme urgency when NCII is identified, prioritising victim safety and content removal.
Consent is Key: The central issue is the lack of consent from the person depicted for the sharing of the image/video/audio in that specific context. Even if an image was once consensually created, sharing it publicly or with others without explicit consent for that sharing is NCII.
Deepfakes/Synthetic Media: Be aware that NCII can be synthetically generated but still cause immense harm and should be treated with the same seriousness.
Victim Blaming: Avoid any language or actions that could be construed as blaming the victim. The responsibility lies solely with the perpetrator sharing the NCII.
Common Gotchas:
- Delaying removal: NCII must be removed immediately upon confirmation.
- Inadequate removal: Ensure the material is removed from all caches, from direct messages where possible (platform dependent), and from anywhere else it may have spread within your community.
- Not preserving evidence (securely and confidentially) before removal: This can be vital for law enforcement if the victim chooses to report.
- Re-traumatising the victim: Handle communications with utmost sensitivity and care. Focus on support and action.
Key Point: NCII is defined by the lack of consent for the distribution of intimate imagery. The priority is immediate removal and victim support. Assume lack of consent for public intimate image sharing unless proven otherwise.
Managing Suspected NCII: Key Steps
When NCII is suspected or reported:
- Immediate Removal of Content: This is the absolute first priority. Remove the NCII from public view on your community instantly to prevent further dissemination and harm.
- Securely Preserve Evidence (If Possible & Lawful): Before or during removal, if your platform allows and policies/laws permit, confidentially save evidence (e.g., screenshots, the imagery itself, details of the posting account). This must be stored with extreme security and only accessed for legitimate purposes, such as reporting to law enforcement at the victim’s request. Check local laws and platform policies on data handling; a minimal evidence-record sketch follows this list.
- Immediately Ban Offending Accounts: Accounts confirmed to be posting NCII should be permanently banned from your community without warning (see the suspension sketch after this list).
- Support and Inform the Victim: If the victim is identifiable and reachable, inform them discreetly and empathetically of the action taken. Offer support and direct them to resources (e.g., organisations that help victims of NCII, options for reporting to law enforcement, guides on getting content removed from other platforms). Focus on their safety and well-being.
- Discuss with Team (if applicable): Alert designated moderators or your Service Administrator immediately. Coordinate actions for removal, evidence preservation (if applicable), and victim support. This is not a situation for wide discussion.
- Consider Reporting to Authorities/Specialist Organisations: Depending on jurisdiction and severity, and always with the victim’s informed consent where possible, the Service Administrator or victim may report the incident to law enforcement or specialist organisations that combat NCII.
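One way to keep an evidence trail without widening access to the material itself is to record a cryptographic digest and minimal metadata alongside the preserved file, so its integrity can later be verified if the victim reports to law enforcement. The following is a minimal sketch only: the EVIDENCE_DIR path, the record’s field names, and the preserve_evidence helper are illustrative assumptions, not part of any Fediverse platform.

```python
# Minimal sketch of an evidence record: a SHA-256 digest plus metadata,
# stored in an owner-only directory. EVIDENCE_DIR and all field names
# are illustrative assumptions.
import hashlib
import json
import os
from datetime import datetime, timezone

EVIDENCE_DIR = "/var/moderation/evidence"  # hypothetical restricted path

def preserve_evidence(media_path: str, account: str, note: str) -> str:
    """Record a digest of the preserved file plus minimal context, so the
    file's integrity can be verified later without re-opening the material."""
    with open(media_path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    record = {
        "sha256": digest,
        "source_account": account,
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "note": note,
    }
    os.makedirs(EVIDENCE_DIR, mode=0o700, exist_ok=True)  # owner-only dir
    out_path = os.path.join(EVIDENCE_DIR, f"{digest}.json")
    with open(out_path, "w") as f:
        json.dump(record, f, indent=2)
    os.chmod(out_path, 0o600)  # restrict the record itself
    return out_path
```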
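On Mastodon-based instances, the Admin API exposes an account action endpoint that a moderator with an admin-scoped access token can use to suspend an account programmatically. A minimal sketch, assuming the POST /api/v1/admin/accounts/:id/action endpoint; INSTANCE, TOKEN, and the suspend_account helper are placeholders, and other Fediverse software will differ:

```python
# Minimal sketch of suspending an account via the Mastodon Admin API.
# INSTANCE and TOKEN are placeholders; the token needs admin write scopes.
import requests

INSTANCE = "https://example.social"  # your instance (placeholder)
TOKEN = "ADMIN_ACCESS_TOKEN"         # admin-scoped token (placeholder)

def suspend_account(account_id: str) -> None:
    """Suspend an account; on Mastodon, suspension also withdraws the
    account's posts from public view."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/accounts/{account_id}/action",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={"type": "suspend", "text": "NCII - zero tolerance policy"},
    )
    resp.raise_for_status()
```

Suspension complements, but does not replace, confirming that the specific NCII media is gone from caches and anywhere else it spread.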
Victim Tools and Resources
- CCRI Safety Center: The Cyber Civil Rights Initiative’s Safety Center offers help to victims and survivors of image-based sexual abuse in deciding what to do next.
- StopNCII.org: A free tool that supports victims of non-consensual intimate image (NCII) abuse by generating hashes of intimate images on the victim’s own device, so participating platforms can block matching uploads without the images themselves ever being shared (a conceptual sketch of hash matching follows this list).
- Victim Resources by Country
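StopNCII shares only perceptual hashes with participating platforms; the intimate images never leave the victim’s device. As a purely conceptual illustration of how hash matching can flag near-duplicate images (StopNCII uses its own industry hashing, not this library), here is a sketch using the Python imagehash package; the matches_known_hash helper and its distance threshold are assumptions:

```python
# Conceptual sketch of perceptual-hash matching, the general technique
# behind tools like StopNCII (which uses different hashing internally).
import imagehash
from PIL import Image

def matches_known_hash(candidate_path: str, known_hash_hex: str,
                       max_distance: int = 8) -> bool:
    """Compare a candidate image against a known perceptual hash.
    A small Hamming distance indicates a near-duplicate image."""
    candidate = imagehash.phash(Image.open(candidate_path))
    known = imagehash.hex_to_hash(known_hash_hex)
    return (candidate - known) <= max_distance  # Hamming distance
```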
Example Community Guidance
Strike System: “Sharing non-consensual intimate imagery (NCII) is a severe violation and typically bypasses any strike system, leading to immediate and permanent bans.”
General Prohibition: “The distribution, posting, or threat of posting non-consensual intimate imagery (NCII), including sexually explicit images or videos of individuals shared without their explicit consent, is strictly forbidden.”
Strict Enforcement: “Any instance of sharing non-consensual intimate imagery (NCII) will result in an immediate permanent ban from this community and removal of the content, and may be reported to law enforcement and/or relevant support organisations. This includes AI-generated/deepfake explicit material depicting real individuals without their consent.”
