Definition
Organised online activity in which an account or group of accounts, including “fake” secondary accounts (which exist solely or mainly to engage in such campaigns), acts to mislead people or fraudulently elevate the popularity or visibility of content or accounts, for example by mass-following an account to raise its clout.
Related Terms
CIB, Astroturfing, Sock Puppetry, Influence Operations, Platform Manipulation, Inauthentic Amplification, Fake Engagement.
Background
Coordinated inauthentic behaviour refers to the orchestration of actions across multiple accounts, often across multiple platforms, to manipulate public opinion or disrupt conversations. It is commonly marked by the near-simultaneous posting of similar messages to artificially amplify certain viewpoints. Moderators should look for signs of synchronisation among accounts that do not appear to be acting independently. Care must be taken not to confuse genuine community movements or campaigns with inauthentic coordinated efforts.
Coordinated Inauthentic Behaviour in the Fediverse refers to concerted efforts where multiple accounts (often including fake or “sock puppet” accounts) work together in a deceptive manner to manipulate online discussions, artificially boost the visibility or popularity of content or accounts, or mislead other users. These campaigns are “inauthentic” because they misrepresent who is behind the activity, or attempt to create a false impression of organic, grassroots support or opposition. Examples include mass-following an account to inflate its perceived influence, using a network of accounts to flood a discussion, or systematically amplifying specific narratives or content.
The decentralised nature of the Fediverse means campaigns can originate from or be distributed across various instances, making comprehensive detection and mitigation a challenge for any single community’s moderation team. Identifying CIB often requires looking for patterns of unnatural coordination that go beyond genuine shared interest.
Why We Care
Coordinated Inauthentic Behaviour undermines the integrity of online discourse and erodes trust within the community. CIB can create false impressions of public opinion, drown out authentic voices, unfairly harass individuals or groups, or illegitimately promote specific agendas, products, or individuals. This manipulation distorts the information environment and can make it difficult for users to discern genuine sentiment from manufactured campaigns.
Addressing CIB helps to maintain a fair and authentic space for discussion and interaction, ensuring that visibility and influence are earned through genuine engagement rather than deceptive tactics.
Spotting Coordinated Inauthentic Behaviour: What to Look For
Identification of CIB involves observing patterns of activity across multiple accounts that suggest deliberate, deceptive coordination rather than organic interaction.
Account Traits: Look for networks of accounts, often newly created or with sparse, generic profiles, that exhibit similar characteristics or appear to be controlled by a limited number of actors. Some accounts in a CIB network might be established but behave out of character when participating in the coordinated activity. “Fake” secondary accounts often exist solely to support the campaign.
Content Characteristics: The accounts involved in CIB often share or promote the same or very similar content, narratives, links, or hashtags in a synchronised manner. The content might be low-quality, repetitive, or designed primarily to amplify a specific message or target.
Posting Patterns: Observe synchronised activity, such as multiple accounts posting, reboosting, liking, or commenting on specific content or accounts within a short timeframe. This includes mass-following of specific accounts to artificially inflate their follower counts. Accounts might activate in concert around specific events or discussions. There might be an unnatural ratio of amplification (e.g., reboosts, likes) to original content or replies from the participating accounts. (A timing-based heuristic for spotting such bursts is sketched below.)
Behaviour: The core of CIB is the coordinated and inauthentic nature of the behaviour. Accounts involved may not engage in genuine discussion beyond the scope of their campaign objective. They might use similar talking points, deflect criticism in unison, or use tactics to evade detection, such as slight variations in posting times or content. The overall impression is one of an orchestrated effort rather than spontaneous individual actions.
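Several of the signals above, particularly the posting-pattern ones, can be pre-screened mechanically before any judgment is made. The following is a minimal sketch, in Python, of one such heuristic: it flags groups of distinct accounts that push the same link, hashtag, or normalised phrase within a short window. The Post structure, the five-minute window, and the five-account threshold are illustrative assumptions rather than any platform’s API, and a hit is a lead for human review, not proof of coordination.

```python
# Minimal sketch: flag bursts where many distinct accounts push the same
# token (link, hashtag, or normalised phrase) inside a short time window.
# Post, WINDOW_SECONDS, and MIN_ACCOUNTS are illustrative assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Post:
    account: str       # account handle, e.g. "user@instance.example"
    timestamp: float   # Unix epoch seconds
    token: str         # shared link, hashtag, or normalised phrase

WINDOW_SECONDS = 300   # how close in time posts must be to count as a burst
MIN_ACCOUNTS = 5       # how many distinct accounts make a burst suspicious

def find_synchronised_bursts(posts: list[Post]) -> list[tuple[str, list[str]]]:
    """Return (token, accounts) pairs where at least MIN_ACCOUNTS distinct
    accounts pushed the same token within WINDOW_SECONDS of each other."""
    by_token: dict[str, list[Post]] = defaultdict(list)
    for post in posts:
        by_token[post.token].append(post)

    suspicious = []
    for token, group in by_token.items():
        group.sort(key=lambda p: p.timestamp)
        start = 0
        # Slide a window over the time-sorted posts for this token.
        for end in range(len(group)):
            while group[end].timestamp - group[start].timestamp > WINDOW_SECONDS:
                start += 1
            accounts = {p.account for p in group[start:end + 1]}
            if len(accounts) >= MIN_ACCOUNTS:
                suspicious.append((token, sorted(accounts)))
                break  # one burst is enough to flag this token
    return suspicious
```

The window and threshold trade recall against false positives: genuinely viral content also produces bursts (see the pitfalls section below), so flagged tokens still need contextual review.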
Key Questions for Assessment:
- “Are multiple accounts acting in a highly synchronised manner to promote or attack specific content, accounts, or narratives?”
- “Do these accounts appear to be genuinely independent, or are there indicators (e.g., shared characteristics, similar timing, content parallelism) suggesting they are part of a coordinated network?”
- “Is the primary purpose of this activity to mislead others about the popularity or origin of the content/sentiment, or to artificially amplify its reach?”
- “Does the behaviour involve the use of fake accounts or accounts primarily dedicated to this type of coordinated activity?”
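The “content parallelism” indicator in the second question can also be approximated in code. This sketch assumes a moderator can export recent post text for two accounts; it scores the closest pair of posts by Jaccard similarity over word sets. The tokeniser and any cut-off you apply are illustrative choices, not a standard.

```python
# Minimal sketch: score how closely two accounts' wording overlaps.
# The tokeniser is crude on purpose, to catch lightly reworded copies.
import re

def tokens(text: str) -> set[str]:
    """Lowercase word set drawn from a post's text."""
    return set(re.findall(r"[a-z0-9#@']+", text.lower()))

def jaccard(a: set[str], b: set[str]) -> float:
    """Jaccard similarity of two sets: |intersection| / |union|."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def parallelism_score(posts_a: list[str], posts_b: list[str]) -> float:
    """Highest pairwise similarity between any post from account A and
    any post from account B; near 1.0 means near-identical wording."""
    return max(
        (jaccard(tokens(pa), tokens(pb)) for pa in posts_a for pb in posts_b),
        default=0.0,
    )
```

Because campaigns often vary wording slightly to evade detection (as noted under Behaviour above), moderately high scores across many account pairs can matter more than a single perfect match.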
Before You Act: Common Pitfalls & Nuances
It is important to distinguish CIB from genuine, organic coordination or enthusiastic community responses.
Genuine Community Mobilisation: Legitimate groups or communities may organically coordinate to share information, support a cause, or respond to an event. The key difference is usually transparency about their affiliation and the authenticity of the participating accounts.
Viral Content: Content that naturally goes viral will see widespread, rapid sharing from many independent accounts. This is distinct from the manufactured amplification seen in CIB.
Shared Interest Groups: Users with shared interests will naturally discuss and amplify similar topics. This lacks the deceptive or manipulative intent of CIB.
Common Gotchas:
- Mistaking genuine grassroots activity or trending topics for CIB.
- Focusing too narrowly on individual accounts rather than looking for the broader network patterns of coordination.
- Limited access to cross-instance data, which can make it harder to see the full scope of a CIB campaign operating across the Fediverse.
Key Point: CIB is characterised by deceptive coordination with the intent to mislead or manipulate. Evidence should point to an organised, inauthentic effort rather than genuinely independent actors with similar views.
Managing Suspected Coordinated Inauthentic Behaviour: Key Steps
When Coordinated Inauthentic Behaviour is suspected:
- Observe and Document: Monitor the suspected accounts and their interactions. Collect evidence of coordinated timing, shared content, similar messaging, and any indicators of inauthenticity (e.g., profile similarities across accounts).
- Identify Network Patterns: Look for connections between accounts. Are they all following the same set of accounts, interacting primarily with each other, or being directed by a central account or external source? (Two simple measures for this are sketched after this list.)
- Assess the Scale and Impact: Determine the extent of the behaviour and its effect on discussions, content visibility, or community members.
- Discuss with Team (if applicable): Share findings and evidence with fellow moderators or your Service Administrator. Analysing CIB often benefits from collaborative review.
- Service Administrator Escalation: Service Administrators may have access to tools or data (e.g., IP logs for local accounts, federation patterns) that can help identify connections between inauthentic accounts within their instance or suspicious patterns from other instances.
- Inter-Community Communication (Cautiously): If CIB appears to be operating from or affecting multiple instances, Service Administrators may consider sharing anonymised patterns or specific concerns with administrators of other affected/involved instances.
- Apply Community Guidance: Once CIB is reasonably confirmed, take action against the involved accounts according to your policies. This often involves suspending or banning the accounts participating in the inauthentic campaign, especially fake accounts.
- Focus on Transparency (where appropriate): In some cases, after taking action, it might be appropriate (if policy allows) to inform the community generally about the detection of and action against manipulative behaviour, without revealing specific private data.
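For the Identify Network Patterns step above, two simple measures can support a collaborative review: how much the suspect accounts follow the same accounts, and how much they interact mainly with each other. The sketch below assumes a moderator has exported, for each suspect, its followee set and a per-target interaction count; the data shapes and any thresholds are illustrative assumptions rather than a real moderation API.

```python
# Minimal sketch of two network-pattern measures over exported data.
# Input shapes (followee sets, per-target interaction counts) are
# illustrative assumptions, not a real moderation API.
from itertools import combinations

def followee_overlap(followees: dict[str, set[str]]) -> float:
    """Mean pairwise Jaccard overlap of the suspects' followee sets.
    Unrelated strangers rarely share most of their follows, so a high
    mean overlap across many accounts is a coordination signal."""
    pairs = list(combinations(followees, 2))
    if not pairs:
        return 0.0
    total = 0.0
    for a, b in pairs:
        union = followees[a] | followees[b]
        total += (len(followees[a] & followees[b]) / len(union)) if union else 0.0
    return total / len(pairs)

def insularity(interactions: dict[str, dict[str, int]], suspects: set[str]) -> float:
    """Fraction of the suspects' interactions that target other suspects.
    Values near 1.0 suggest a closed cluster talking mostly to itself."""
    inside = outside = 0
    for account in suspects:
        for target, count in interactions.get(account, {}).items():
            if target in suspects:
                inside += count
            else:
                outside += count
    total = inside + outside
    return inside / total if total else 0.0
```

Neither measure is conclusive on its own: members of a genuine shared-interest group also follow and talk to each other, so these scores should feed the scale-and-impact assessment rather than replace it.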
Example Community Guidance
Strike System: “Minor or isolated instances of suspected inauthentic coordination might receive a warning. However, clear participation in broader campaigns will result in more severe sanctions.”
General Prohibition: “Engaging in Coordinated Inauthentic Behaviour, including the use of fake accounts or networks of accounts to deceptively manipulate conversations, artificially amplify content or accounts, or mislead users, is strictly prohibited.”
Strict Enforcement: “Confirmed participation in campaigns of Coordinated Inauthentic Behaviour will lead to the banning of all identified participating accounts. We are committed to fostering genuine interaction and will take action to protect our community from manipulative practices.”
