The information provided by IFTAS is for general guidance and informational purposes only. It does not constitute legal advice and should not be relied upon as such. IFTAS is not a law firm and does not offer legal services. For advice on legal or regulatory matters, please consult a qualified professional.
Background
Australia’s Online Safety Act 2021 is the country’s primary law for keeping people safe online. It gives the eSafety Commissioner strong powers to protect users, especially children, from harmful online content and behaviours. The Act applies to a wide range of online services, including social media platforms, messaging services, and websites that host user content.
The law sets out rules and responsibilities for online service providers, including:
- Protecting children and young people from serious online harms, including cyberbullying and harmful content.
- Removing illegal and unsafe material such as child sexual abuse material, violent extremist content, and image-based abuse (non-consensual intimate images).
- Responding to takedown notices from the eSafety Commissioner within strict timeframes (often 24 hours).
- Providing safety measures and tools such as user reporting, blocking, and complaint-handling systems.
- Following industry codes and standards that set minimum protections against harmful content.
From December 2025, the Act also introduces a minimum age of 16 for social media accounts. Platforms must take reasonable steps to:
- Prevent people under 16 from creating accounts.
- Close accounts if they identify underage users.
- Use age-assurance measures that are appropriate for the size and scale of the service.
The Act is designed to balance safety with fairness: requirements are enforced proportionately, so smaller community-run services are expected to demonstrate policies, transparency, and responsiveness, while large global platforms face stricter obligations and reporting.
Under the Act a social media service is one where:
- the sole purpose, or a significant purpose, of the service is to enable online social interaction between two or more end-users
- the service allows end-users to link to, or interact with, some or all of the other end-users
- the service allows end-users to post material on the service.
Fediverse services generally meet all of these conditions.
Excluded categories
The law excludes:
- services that have the sole or primary purpose of messaging, email, voice calling or video calling
- services that have the sole or primary purpose of enabling users to play online games with other users
- services that have the sole or primary purpose of enabling users to share information about products or services
- services that have the sole or primary purpose of enabling users to engage in professional networking or professional development
- services that have the sole or primary purpose of supporting the education of users
- services that have the sole or primary purpose of supporting the health of users
- services that have the sole or significant purpose of facilitating communication between educational institutions and students or student families
- services that have the significant purpose of facilitating communication between health care providers and people using those services.
Scale and “small services”
Three tiers of risk are identified in the SMS Code.
- Tier 1: has over 3 million Australian end-users and over 30 million users globally
- Tier 2: has between 500,000 and 3 million Australian end-users (and 5-30 million users globally)
- Tier 3: has fewer than 500,000 Australian end-users (and fewer than 5 million worldwide)
(For the full expectations, see Summary of Reasons – Social Media Services Code)
It’s important to note that these are “broad expectations”. Tier 3 carries additional criteria that most federated services do not meet, so, due to the functionalities involved, most will in fact fall into Tier 2, albeit as very small Tier 2 services.
It appears relatively unlikely that eSafety will actively enforce compliance on small Fediverse servers that follow the guidelines below. eSafety has named Facebook, Instagram, TikTok, Snapchat, X (Twitter), YouTube, Reddit, Twitch, Kick, and Threads as its initial focus. Roblox, Pinterest, YouTube Kids, Discord, WhatsApp, Lemon8, GitHub, LEGO Play, Steam and Steam Chat, Google Classroom, Messenger, and LinkedIn are out of scope for now. As a comparator for ActivityPub services, Bluesky has been assessed as very low risk because it has very few Australian users and even fewer young people. (This is not an exemption, but they are currently not a target for enforcement.)
Minimum Age Enforcement (from Dec 2025)
- The Act sets a minimum age of 16 for social media accounts.
- Applies to all social media services, including federated platforms like Mastodon.
- A small instance is still expected to:
- Have reasonable age-assurance processes (these do not have to be intrusive or expensive, but some mechanism is needed).
- Deny accounts to users under 16.
- Take “reasonable steps” to close accounts if underage use is identified.
Proportionality: eSafety guidance suggests smaller services will not be expected to build complex age verification systems like big tech, but they will need clear policies and a reasonable method of enforcement.
Removal of Harmful Content
The eSafety Commissioner has powers to issue takedown notices across all services. For Fediverse services this means:
- Cyberbullying content targeting children must be removed within 24 hours of notice.
- Image-based abuse (non-consensual intimate images) must be removed within 24 hours.
- Class 1 material (e.g. child sexual abuse, pro-terror content) must be removed as soon as reasonably possible.
- Federated services must have a way for users to report harmful material.
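Because the 24-hour window is strict, it can help to record a deadline the moment a notice arrives. A minimal sketch, assuming notices are timestamped in UTC; the function name is illustrative, not part of any eSafety tooling:

```python
from datetime import datetime, timedelta, timezone


def takedown_deadline(notice_received: datetime, hours: int = 24) -> datetime:
    """Latest time by which removal should be completed.

    `hours` defaults to 24, the timeframe that applies to cyberbullying
    content targeting children and to image-based abuse notices.
    """
    return notice_received + timedelta(hours=hours)


# Example: a notice received at 09:30 UTC on 10 December must be
# actioned by 09:30 UTC on 11 December.
notice = datetime(2025, 12, 10, 9, 30, tzinfo=timezone.utc)
deadline = takedown_deadline(notice)
```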
Industry Standards & Codes
The Act allows eSafety to set industry codes of practice. For social media, this currently covers:
- Minimising the spread of harmful content.
- Reporting and complaints systems.
- Tools for users to block, mute, or report.
Most Fediverse platforms already have many of these features built in, but small instances should ensure they are:
- enabling user reporting of posts/accounts
- documenting moderation processes clearly
- following eSafety takedown notices promptly.
Transparency & Record-keeping
Large platforms have strict reporting obligations. For smaller services, formal annual transparency reporting is most likely not required, but the eSafety Commissioner expects record-keeping on complaints, moderation actions, and responses to takedown notices. This helps demonstrate compliance if challenged.
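Record-keeping does not need to be elaborate: an append-only log of complaints, actions, and timelines is enough to demonstrate responsiveness. A minimal sketch using JSON Lines; the filename and field names are illustrative, not prescribed by the Act:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

LOG_PATH = Path("moderation-log.jsonl")  # illustrative filename


def log_action(kind: str, target: str, action: str, notes: str = "") -> None:
    """Append one moderation record as a single JSON line.

    `kind` might be "complaint", "report", or "takedown-notice".
    """
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "kind": kind,
        "target": target,
        "action": action,
        "notes": notes,
    }
    with LOG_PATH.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
```

One record per line keeps the log easy to grep and easy to export if eSafety ever asks for evidence of how a complaint was handled.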
Liability and Responsibility
- The person or entity running the server is treated as the provider of the service. This means that even volunteer-run or community-based services are considered responsible under the Act.
- Federation does not change liability: each instance is responsible for its own hosted content.
Practical Steps for Fediverse providers offering service to Australians
- Publish a clear Terms of Service with:
- A stated minimum age policy (16+)
- Rules that prohibit content described by the Act (child sexual abuse material; terrorism; extreme pornography and sexual violence; promoting or facilitating suicide, self-harm, or eating disorders; inciting violence or criminal acts; drug and weapon trafficking; fraud)
- Guidance on the reporting functionality for your end users
- Set up moderation workflows:
- Be able to demonstrate the ability to action takedown notices quickly
- Document who handles reports and how decisions are made
- Keep simple records:
- Complaints received, actions taken and timelines, any takedown requests from eSafety
- Prepare for age-assurance:
- Create a short policy statement on how under-16 users will be prevented or removed (if you are running a service that is inappropriate for children)
- Lightweight measures such as self-declaration and community reporting may be acceptable for very small services
If you receive any takedown notice or other communication from eSafety, contact us; we may be able to help you find local resources.
