CSAM

Definition

Child Sexual Abuse Material – imagery or video that shows a child engaged in, or depicted as being engaged in, explicit sexual activity.

Related Terms

Child Pornography, Self-Generated CSAM (SG-CSAM), Computer Generated CSAM (CG-CSAM), Child Sexual Abuse Imagery (CSAI), Child Sexual Exploitation Imagery (CSEI), Indecent Images of Children (IIOC), Child Sexual Exploitation and Abuse (CSEA) (umbrella term that CSAM falls under).

Please Note: It is inappropriate to refer to this material as pornography. Although some laws refer to “child pornography”, pornography implies consent, which a child can never give.

Background

CSAM (Child Sexual Abuse Material) is illegal content that depicts the sexual abuse and exploitation of children. Its creation inherently involves severe child abuse, and its distribution perpetuates that harm and fuels demand. The Fediverse, like any online platform, can be misused by offenders to share, distribute, or solicit CSAM. This can occur through public posts (rarely, due to high visibility), private messages, or by directing users to off-platform locations where CSAM is stored or traded.

Encountering and identifying CSAM is deeply disturbing and requires an immediate, unequivocal response. There is zero tolerance for CSAM. The primary and only acceptable action upon encountering suspected CSAM is to report it immediately to the administrator, who is then typically legally and morally obligated to report it to law enforcement and specialist organisations.

  • CSAM Primer – legality, definitions, and detection services
  • CSAM Reporting – legal requirements and child safety hotlines

Why We Care

Combating CSAM is a fundamental responsibility of any online service provider and community. CSAM represents the documentation of horrific crimes against children. Its existence and distribution cause profound and ongoing harm to child victims. There is an absolute moral, ethical, and legal imperative to prevent its spread, report its presence, and protect children.

Failure to act decisively and correctly when CSAM is encountered has severe legal ramifications, endangers children, and signifies a catastrophic failure of platform responsibility. There are no mitigating circumstances for the presence of CSAM.

In a four-month study, IFTAS found numerous copies of CSAM across the network. ActivityPub services are, in many ways, free web hosting with private messaging for anonymous accounts, making the Fediverse an easy target for the criminal sharing or selling of illegal material. Read our report here.

Spotting CSAM: What to Look For

Identifying CSAM means recognising visual media that depicts children in a sexually abusive context. Ultimately this is a determination for law enforcement, but clear indicators require immediate action. Services exist to automatically compare uploaded or federated media against known material using what is called “hash and match” technology.

See CSAM Primer for links to detection and reporting services.
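
To make the “hash and match” idea above concrete, the minimal sketch below compares the digest of an incoming media file against a locally held list of known digests and flags any match for escalation. It is an illustration of the concept only: real detection services rely on perceptual hashes (such as PDQ or PhotoDNA) and vendor-managed hash lists, and the file names and workflow shown here are assumptions.

```python
# Minimal "hash and match" sketch (illustrative only).
# Real services use perceptual hashing (e.g. PDQ, PhotoDNA) and managed hash
# lists; known_hashes.txt and the incoming/ path are hypothetical.
import hashlib
from pathlib import Path


def load_known_hashes(path: str = "known_hashes.txt") -> set[str]:
    """Load one hex digest per line from a hypothetical local hash list."""
    return {line.strip().lower() for line in Path(path).read_text().splitlines() if line.strip()}


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of an uploaded or federated media file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def is_known_match(media_path: str, known_hashes: set[str]) -> bool:
    """True if the file's digest appears in the known-hash list."""
    return sha256_of_file(media_path) in known_hashes


if __name__ == "__main__":
    known = load_known_hashes()
    if is_known_match("incoming/upload.jpg", known):
        # Never open, copy, or redistribute the file; quarantine and escalate.
        print("MATCH: quarantine the media and escalate to the service administrator.")
```

In practice a check like this would run inside the media upload or federation pipeline, and any match would trigger the escalation and reporting steps described later in this page rather than any manual review of the file.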

IFTAS’ Do Not Interact list includes servers known to host and distribute CSAM.
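
As a rough illustration of how an administrator might act on such a list, the sketch below checks a remote domain against a locally exported copy of a denylist before federating with it. The file name, CSV layout, and suspension step are assumptions; obtain the actual Do Not Interact list, its format, and access requirements from IFTAS.

```python
# Hypothetical denylist check (illustrative only).
# dni_domains.csv and its single-column layout are assumptions, not the
# actual distribution format of the IFTAS Do Not Interact list.
import csv


def load_denylist(path: str = "dni_domains.csv") -> set[str]:
    """Load denylisted domains from a local CSV export (first column = domain)."""
    with open(path, newline="") as f:
        return {row[0].strip().lower() for row in csv.reader(f) if row and row[0].strip()}


def should_suspend(remote_domain: str, denylist: set[str]) -> bool:
    """True if the remote instance appears on the denylist."""
    return remote_domain.strip().lower() in denylist


if __name__ == "__main__":
    denylist = load_denylist()
    if should_suspend("bad.example", denylist):
        print("Domain is on the Do Not Interact list: suspend federation and review existing content.")
```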

Please note, in most countries, computer-generated CSAM is still CSAM and distributing it (e.g. via ActivityPub) is an illegal act.

Account Traits: Accounts sharing or soliciting CSAM often use anonymous or fake profiles. They may operate secretively or attempt to use coded language. However, CSAM can also be shared by compromised accounts or by individuals who may not fully grasp the illegality and harm. Regardless of the account’s apparent intent, the material itself dictates the response.

Content Characteristics: The content is visual media (images, videos, sometimes highly realistic drawings or AI-generated images explicitly presented as depicting child abuse) showing:

  • Children engaged in sexual acts (with adults or other children).
  • Children in sexualised poses that are clearly exploitative.
  • Depictions of non-consensual sexual acts against children.
  • Any other content meeting the legal definition of CSAM in the relevant jurisdiction.

Posting Patterns: CSAM may be shared directly on a platform (though offenders often try to avoid direct posting in moderated public spaces), or users might post links to CSAM hosted on other websites, cloud storage, or peer-to-peer networks. It might also be shared in private messages or in hidden or obscure groups. IFTAS studies have shown that CSAM portraying older teens is shared openly on the network, whereas pre-teen material tends to be shared via direct message.

Behaviour: Accounts involved in CSAM may attempt to trade it, solicit it, or groom others into producing it. The key is the presence of the illegal material itself.

Before You Act: Common Pitfalls & Nuances

  • DO NOT DOWNLOAD, SHARE, OR FURTHER DISTRIBUTE SUSPECTED CSAM: Possessing or distributing CSAM is illegal in most jurisdictions. Doing so, even with the intent to “investigate” or show it to other moderators (beyond the designated Service Admin/Safety Officer for formal reporting), can have severe legal consequences and re-traumatises victims.
  • DO NOT INVESTIGATE INDEPENDENTLY: You are not a law enforcement officer. Do not try to identify victims, engage with perpetrators, or trace the material’s origin. This can contaminate evidence and alert offenders.
  • DO NOT CONFRONT THE POSTER: This can cause them to delete evidence or go underground.
  • PRIORITISE IMMEDIATE REPORTING: Every moment that CSAM remains accessible, or its existence goes unreported, is a failure.
  • TRAUMA AWARENESS: Viewing CSAM is highly traumatic. Moderators who encounter it should have access to mental health support. Consider implementing guidance from our Moderation Workflow Harm Reduction page.

Managing Suspected CSAM: Key Steps

The only role of a community moderator is to identify suspected CSAM and escalate it to their Service Administrator or designated internal reporting channel.

  • IMMEDIATE, URGENT REPORT TO SERVICE ADMINISTRATOR / DESIGNATED SAFETY OFFICER: If you encounter any content you suspect to be CSAM, immediately report all details (account name, URL of content if possible, time/date seen) to your Service Administrator or the person/process designated by your instance for handling critical safety reports. This is your sole and overriding responsibility.
  • DO NOT COPY/SAVE LOCALLY: Do not download, screenshot, or store the material yourself. Describe it in your report to the administrator if necessary.
  • Service Administrator Actions: The Service Administrator (or designated safety personnel) MUST:
    1. Securely preserve the suspected CSAM and associated data (e.g., posting account details, IP addresses if available, timestamps) in a forensically sound manner, adhering to all legal requirements for chain of custody. Platform tools may assist with this, or it may require server-level actions.
    2. IMMEDIATELY report the suspected CSAM to the appropriate national law enforcement agency that handles online child exploitation and/or to a designated international body like NCMEC (which acts as a global hub for such reports) or the IWF (in the UK). This is a legal obligation in many jurisdictions.
    3. Take steps to make the content inaccessible on their platform, often done in consultation with or after reporting to law enforcement to ensure evidence is not lost.
    4. Permanently ban any account confirmed to have shared CSAM.
    5. Cooperate fully with any subsequent law enforcement investigation.
  • Seek Support: If you have been exposed to CSAM, seek support from your moderation team, instance administrator, or mental health resources. Exposure is known to cause vicarious trauma. See our notes on Moderation Workflow Harm Reduction.

Example Community Guidance

Strike System: “CSAM is outside any warning or strike system. It is illegal, and its presence leads to immediate permanent bans and reporting to international law enforcement and child protection agencies.”

General Prohibition: “The possession, creation, sharing, distribution, or solicitation of Child Sexual Abuse Material (CSAM) is illegal and strictly prohibited on this service.”

Strict Enforcement: “Any user found to be involved with CSAM will be permanently banned immediately. All instances of suspected CSAM will be reported to the [INSERT LOCAL JURISDICTION BODY, e.g. NCMEC, IWF] and/or relevant national and international law enforcement agencies, along with any identifying information of the involved accounts, in accordance with global legal standards and our commitment to child safety.”

