If you suspect a child is in immediate danger in any way, contact the police immediately.
The information provided by IFTAS is for general guidance and informational purposes only. It does not constitute legal advice and should not be relied upon as such. IFTAS is not a law firm and does not offer legal services. For advice on legal or regulatory matters, please consult a qualified professional.
User Generated Content
If you operate an ActivityPub service, you are an electronic communications provider and likely meet the legal definitions of an ESP, ISP, online service provider, or similar terms used in law to describe electronic communications services. Services that federate with third parties, have publicly visible content feeds, and/or allow user account creation are liable for the content they host and display to end users in all jurisdictions.
National and Extranational Law
Scroll down in the linked page to review, for most countries, the legal status of real/realistic material, fictional material, and possession.
Legal status of fictional pornography depicting minors
Regardless of its legality in a given jurisdiction, if your content is available to end users in a jurisdiction where such content is illegal, you are liable for its availability.
Detection
Services exist that compare stored media against hashes of known material, or use ML-assisted perceptual matching. For the most part, these are heavily restricted and will require you to sign legal agreements with the service providers.
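The two approaches differ in what counts as a match. The sketch below, a minimal illustration only, shows exact cryptographic hash matching; the hash list here is a hypothetical stand-in, since real hash lists come from providers such as NCMEC or the IWF under legal agreement and are never maintained locally by hand:

```python
import hashlib

# Hypothetical hash list for illustration only. Real deployments
# receive vetted hash lists from providers under legal agreement.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Exact cryptographic hash: matches only byte-identical files."""
    return hashlib.sha256(data).hexdigest()

def is_known(data: bytes) -> bool:
    """Check uploaded media bytes against the known-hash set."""
    return sha256_of(data) in KNOWN_HASHES
```

Exact matching fails if an image is re-encoded or resized even slightly, which is why the services listed below also rely on perceptual hashes (PDQ, PhotoDNA) that tolerate small visual changes.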
CDN
- Cloudflare’s CSAM Scanning Tool uses hash-matching technology to detect known child sexual abuse material (CSAM) on websites using its services. When CSAM is detected, Cloudflare notifies the website owner or hosting provider, who is responsible for removing the content and reporting it to the appropriate authorities. This tool previously required NCMEC credentials, meaning only US entities could use it; however, this is no longer the case.
Hash and Match APIs
Generally free of charge, these services allow you to call the API and receive a classification response (e.g. “CSAM, likely CSAM, unknown”). You will likely be required to sign binding agreements.
- (Canada) Project Arachnid Shield: https://projectarachnid.ca/en/#shield
- (UK) Image Intercept: https://www.iwf.org.uk/our-technology/image-intercept/
- (USA) Microsoft PhotoDNA: https://www.microsoft.com/en-us/photodna
- (Netherlands) Instant Image Identifier: https://web-iq.com/solutions/instant-image-identifier-to-fight-csam
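However your service of choice returns its classification, your moderation pipeline needs a policy for acting on it. The sketch below is hypothetical: the label names and response shape are illustrative, not any provider's actual schema, so consult each service's API documentation for the real format. The key design choice shown is failing closed, treating any unrecognized label as needing human review:

```python
# Hypothetical label-to-action mapping for illustration; real services
# define their own response schemas in their API documentation.
ACTIONS = {
    "csam": "block_and_report",
    "likely-csam": "quarantine_for_review",
    "unknown": "allow",
}

def handle_classification(response: dict) -> str:
    """Map a classification label to a moderation action,
    failing closed (quarantine) on missing or unrecognized labels."""
    label = response.get("classification", "").lower()
    return ACTIONS.get(label, "quarantine_for_review")
```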
Standalone Platforms
- Thorn Safer (paid): https://get.safer.io/csam-detection-tool-for-child-safety
- Meta PDQ (open source): https://github.com/facebook/ThreatExchange/tree/main/pdq
- AI Horde csam_checker (open source): https://github.com/Haidra-Org/horde-safety/blob/main/horde_safety/csam_checker.py
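Perceptual hashes like Meta's PDQ produce a 256-bit digest, and two images are considered a match when the Hamming distance between their hashes falls under a threshold. The sketch below shows only the comparison step, assuming hashes are supplied as hex strings; computing the hashes themselves requires the PDQ library, and the default threshold of 31 bits is taken from PDQ's documentation but should be tuned to your own false-positive tolerance:

```python
def hamming_distance(hash_a: str, hash_b: str) -> int:
    """Bit-level Hamming distance between two equal-length hex hashes."""
    return bin(int(hash_a, 16) ^ int(hash_b, 16)).count("1")

def pdq_match(hash_a: str, hash_b: str, threshold: int = 31) -> bool:
    """Two PDQ-style 256-bit hashes match when they differ in at most
    `threshold` bits. 31 is an assumption based on PDQ's docs; tune it."""
    return hamming_distance(hash_a, hash_b) <= threshold
```

Identical hashes have distance 0, and a single flipped hex digit changes at most 4 bits, so near-duplicate images (re-encoded, lightly resized) still match where exact cryptographic hashing would fail.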
ActivityPub Platform-specific
- (Lemmy) A script that scans a Lemmy pict-rs object store for illegal or unethical content: https://github.com/db0/lemmy-safety
- (Firefish) CloudFlare configuration: https://socialweb.coop/blog/firefish-cloudflare-quickfix-r2-tutorial/
