Astroturfing

Definition

Organized activity intended to create the deceptive appearance of broad, authentic grassroots support for, or opposition to, a given cause or organization, when in reality the activity is motivated, funded, or coordinated by a single obscured source or a small number of them.

Related Terms

Sock Puppets, Coordinated Inauthentic Behaviour, Influence Operations, Stealth Marketing

Background

Astroturfing in the Fediverse refers to the practice of creating a deceptive appearance of widespread, organic support within a community for a specific idea, product, or narrative. The decentralised architecture of the Fediverse presents unique challenges in addressing this behaviour. Campaigns can originate from, or be disseminated across, numerous distinct communities, making it difficult for any single volunteer moderation team to ascertain the full scope of such activities.

Due to the autonomy of each community, policies and enforcement approaches may vary significantly, often being determined by the respective service provider. Consequently, a service provider might host accounts engaging in astroturfing – either knowingly or unknowingly – whose content then federates into your community. Volunteer moderators typically operate with limited tools and time, relying substantially on manual detection and inter-community communication, which may lack consistency or immediacy.

Why We Care

Dealing with astroturfing matters because it undermines trust and the open character of your community. These fake efforts can unfairly steer conversations, burying genuine opinions and dissenting views. When interactions seem fake, people naturally start to distrust what they see, and your community can end up feeling less welcoming and vibrant.

If astroturfing goes on, outside groups may try to use your community to push their own agendas. This makes for worse discussions and can damage the genuine spirit of your community.

Spotting Astroturfing: What to Look For

Identification of astroturfing involves observing patterns indicative of coordinated, inauthentic activity rather than genuine engagement.

Account Traits: Be alert to a sudden proliferation of new accounts, frequently with underdeveloped profiles (e.g., lacking an avatar, utilising default biographical information), particularly if these accounts uniformly promote an identical link, hashtag, or viewpoint. Similarities in account creation dates or naming conventions may also be significant.
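A moderator with exported account metadata could check for clustered signup dates with a short script along these lines. This is only a sketch: the `creation_date_clusters` helper, its threshold of 3, and the sample handles are illustrative assumptions, not part of any Fediverse moderation tooling.

```python
from collections import Counter
from datetime import date

def creation_date_clusters(accounts, min_cluster=3):
    """Flag creation dates shared by an unusually large number of accounts.

    `accounts` is a list of (handle, creation_date) tuples; the cluster
    threshold of 3 is an illustrative assumption, not a recommended value.
    """
    by_date = Counter(created for _, created in accounts)
    return {d: n for d, n in by_date.items() if n >= min_cluster}

# Example: three of four suspect accounts were registered on the same day
new_accounts = [
    ("@a1@example.social", date(2024, 5, 1)),
    ("@a2@example.social", date(2024, 5, 1)),
    ("@a3@example.social", date(2024, 5, 1)),
    ("@a4@example.social", date(2024, 4, 2)),
]
print(creation_date_clusters(new_accounts))
```

A shared creation date alone proves nothing, as noted below under common gotchas; it is one signal to weigh alongside the others.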

Content Characteristics: Examine content for identical or nearly identical posts and comments originating from multiple, distinct accounts. Repetitive use of specific phraseology, talking points, or URLs is a common indicator.
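As a rough sketch of that duplicate-content check, assuming you have a list of (account, text) pairs gathered from reports or a timeline export: the helper names, the sample handles, and the 0.9 similarity threshold are all illustrative assumptions.

```python
from difflib import SequenceMatcher
from itertools import combinations

def normalise(text: str) -> str:
    """Lower-case and collapse whitespace so trivial edits don't hide duplication."""
    return " ".join(text.lower().split())

def find_near_duplicates(posts, threshold=0.9):
    """Return (account_a, account_b, similarity) for suspiciously similar posts.

    `posts` is a list of (account, text) tuples; the 0.9 threshold is an
    illustrative starting point, not a calibrated value.
    """
    flagged = []
    for (acct_a, text_a), (acct_b, text_b) in combinations(posts, 2):
        if acct_a == acct_b:
            continue  # repetition within one account is a different signal
        ratio = SequenceMatcher(None, normalise(text_a), normalise(text_b)).ratio()
        if ratio >= threshold:
            flagged.append((acct_a, acct_b, round(ratio, 2)))
    return flagged

# Example: two accounts pushing the same link with only cosmetic edits
recent_posts = [
    ("@alice@example.social", "You HAVE to try FooApp, it changed my life! foo.example"),
    ("@bob@other.instance", "you have to try FooApp, it changed my life!  foo.example"),
    ("@carol@example.social", "Anyone else enjoying the gardening thread this week?"),
]
print(find_near_duplicates(recent_posts))
```

The pairwise comparison is quadratic in the number of posts, which is fine for a hand-gathered evidence set but not for scanning a whole timeline.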

Posting Patterns: Observe posting behaviours for anomalies, such as concentrated bursts of activity around a specific topic emanating from these accounts, often occurring in rapid succession or at coordinated intervals. Atypically high boosting or favouriting patterns for specific content from seemingly disconnected accounts can also suggest manipulation.
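The burst pattern described above can be approximated with a sliding-window count of distinct accounts posting on one topic. The ten-minute window and four-account threshold below are illustrative assumptions, not calibrated values.

```python
from datetime import datetime, timedelta

def coordinated_bursts(events, window=timedelta(minutes=10), min_accounts=4):
    """Flag windows in which many distinct accounts posted on the same topic.

    `events` is a list of (timestamp, account) tuples for posts matching a
    topic of interest; window size and threshold are illustrative only.
    """
    events = sorted(events)
    bursts = []
    start = 0
    for end in range(len(events)):
        # Slide the window start forward until the span fits inside `window`
        while events[end][0] - events[start][0] > window:
            start += 1
        accounts = {acct for _, acct in events[start:end + 1]}
        if len(accounts) >= min_accounts:
            bursts.append((events[start][0], events[end][0], len(accounts)))
    return bursts

# Example: five seemingly unrelated accounts all posting within four minutes
base = datetime(2024, 5, 1, 12, 0)
topic_posts = [
    (base + timedelta(minutes=i), f"@acct{i}@example.social") for i in range(5)
]
print(coordinated_bursts(topic_posts))
```

Genuine breaking news also produces bursts, so this signal should be read together with the account-trait and content checks rather than on its own.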

Behaviour: Assess the general conduct of the accounts in question: do they post exclusively about a single topic or product, abstaining from broader community discussions? Do they aggressively defend a particular viewpoint with near-identical arguments across numerous threads?

Key Questions for Assessment:

  • “Does the observed activity align with typical engagement patterns within our community?”
  • “Are these ostensibly distinct accounts engaging in authentic, reciprocal interactions, or primarily echoing a centralised message?”
  • “Is there an abrupt, inexplicable consensus on a previously niche or contentious topic?”

Before You Act: Common Pitfalls & Nuances

It’s important to tell the difference between astroturfing and real excitement from regular people or honest support for a cause.

Genuine Enthusiasm: A new popular game, software project, or social movement can naturally lead to many new accounts joining and excitedly talking about it. These accounts usually have more diverse profiles and talk about more things over time.

Coordinated Advocacy (Transparent): Legitimate groups may organise to spread a message. The key difference is transparency: they are usually open about who they are and what they want, rather than concealing their involvement to trick people.

Common Gotchas:

  • Thinking that a small group of people who are excited about something is an astroturfing campaign.
  • Depending too much on accounts just being new – new accounts join all the time.
  • Thinking that several accounts agreeing strongly is astroturfing, without proof they are working together or aren’t real.

Key Point: Above all, look for actual deception. See if accounts appear to be working together to create a false impression of being separate, natural members of the community. Your goal is to spot when something feels orchestrated and fake, not just when many accounts happen to share the same genuine opinion.

Managing Suspected Astroturfing: Key Steps

When you think astroturfing might be happening:

  • Observe and Gather Evidence: If it’s not causing big problems right now, watch the suspected accounts. Collect links to posts or take screenshots that show the worrying patterns before you do anything.
  • Discuss with Team (if applicable): If you are part of a moderation team, share what you saw and the proof you have with other moderators or your Service Administrator to get a second opinion and ensure a consistent approach.
  • Assess Scope and Intent: Figure out if it’s just a few accounts or part of a bigger campaign. Think about whether the goal is clearly to trick people.
  • Apply Community Guidance: Use your community’s rules clearly and fairly. Decide which of your rules (like a warning, strike, or ban) is right for how bad it is and what proof you have.
  • Consider External Communication Carefully: If a campaign clearly starts in another specific community, your Service Administrator might consider informing the leadership of that community. This should be done with caution, focusing on sharing evidence of the behaviour, not making accusations.

Example Community Guidance

Strike System: “Engaging in coordinated inauthentic behaviour to artificially amplify content will result in an official warning and is considered a strike. Further violations leading to three strikes will result in account suspension.”

General Prohibition: “Deceptive amplification of content or discourse is prohibited.”

Strict Enforcement: “Systematic astroturfing campaigns will lead to immediate and permanent account bans.”

Further Reading


IFTAS (about.iftas.org)

Nonprofit trust and safety support for volunteer social web content moderators

Community Responses

IFTAS is a non-profit organisation committed to advocating for independent, sovereign technology, empowering and supporting the people who keep decentralised social platforms safe, fair, and inclusive.