Federation Abuse Policy

Short Description: Conditions that govern decisions about defederation and other instance-level moderation actions.
Contributor(s): Manisha, Jonny
Approval Status: Draft
Completion Status: Draft
Loomio Thread(s):
Date Proposed:
Date Approved:
Topic(s): Federation, Defederated Instances


Preface

To support a healthy and safe online space, Neuromatchstodon members follow our Code of Conduct. While we cannot, and do not seek to, control the behavior of people outside the instance, this policy is intended to keep our members safe and uphold our collective values by describing when we will take instance-level action to block or otherwise moderate interactions with other instances.

We seek to balance individual autonomy with mutual responsibility in determining who we interact with on neuromatch.social. Defederation can be a drastic action that splits communities, and it is a relatively blunt tool for stopping abuse. At the same time, the safety of our fellow members is non-negotiable, and defederation is one way of avoiding the double marginalization that occurs when already structurally marginalized people are left to fend for themselves.

These guidelines reserve some clear-cut actions, like defederating from hate-based instances, to the discretion of the Social WG, while providing guidance for Proposals in other cases that require discussion. Per our Governance policy, any member may make a proposal if they disagree with a decision of the Social WG or want to prompt its action.

Individual Moderation

These guidelines govern actions taken either by the instance or an individual member against an account or person on another instance.

Self-Moderation

We encourage conflict resolution when possible, including peer mediation: don't hesitate to ask another member to help mediate a dispute if you think it would be helpful.

Beyond that, members should make liberal use of the tools at their disposal, including muting or blocking individual accounts, filtering words and phrases, and blocking whole domains for their own account.

You shouldn't feel like you're on your own: safety is a collective responsibility, and these user-level actions are a first step, not a way of individualizing the problem of moderation. If you are comfortable doing so, always report posts or accounts that violate our Code of Conduct so we can keep an eye on patterns of behavior across accounts and instances.

Global Moderation

If the Social WG receives a report and determines that the post or account violates our Code of Conduct, the moderators on duty may take one of the following actions, depending on severity and whether the behavior is part of a pattern (a sketch of how these map onto Mastodon's admin API follows the list):

  • Warn the user via DM if the violation is minor and they are otherwise in good standing in the community
  • Mark the account as sensitive in the case that it has repeatedly posted content that is not against our CoC but should be marked with a content warning per prevailing norms. When an account is marked as sensitive, all media it posts is hidden by default.
  • Limit the account if the content is not violent or immediately harmful and it is the account's first violation of the CoC. A limited account's posts can only be seen by its followers, but it can still interact with members normally.
  • Suspend the account if the violation is severe or harms our members. Suspended accounts cannot interact with anyone on our instance, and their posts, media, and profile are removed from our instance.

In any case, the moderation team should take into account any preferred outcome the reporting member might have, and the member is welcome to work with the moderation team on any further actions. The moderation team should not take any action that would put the member at further risk, including remaining inactive after a member has expressed a need, or escalating a situation when the member has said doing so might cause them additional harm.

Any of these individual moderation actions may also include deleting the reported posts.

All moderation actions notify the affected account and provide an opportunity for appeal. We support the principles of restorative justice, which emphasize giving people a chance to correct their mistakes and be reintegrated into a community, but only at the request or with the consent of the affected member(s).


Instance Moderation

These guidelines govern actions taken at the instance level against other instances.

When defederating or silencing other instances, the moderation team should update the list of Defederated Instances with the instance URL, the reason for the moderation action, and any relevant links to public discussions or posts from the @socialwg account.
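
In Mastodon's terms, both silencing (limiting) and defederating (suspending) are domain blocks, and the published reason can be recorded on the block itself as its public comment. A minimal sketch, assuming Mastodon 4.0+ and a moderator token with the admin:write:domain_blocks scope; the domain and reason below are hypothetical:

```python
# Hedged sketch: recording a limit or defederation as a Mastodon domain block.
import requests

INSTANCE = "https://neuromatch.social"  # placeholder
TOKEN = "MODERATOR_ACCESS_TOKEN"        # placeholder; needs admin:write:domain_blocks

def block_domain(domain: str, severity: str, reason: str) -> None:
    """severity: "silence" to limit, "suspend" to defederate."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/domain_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "domain": domain,
            "severity": severity,
            "public_comment": reason,  # the on-record reason for the action
        },
    )
    resp.raise_for_status()

# e.g. block_domain("example.tld", "suspend",
#                   "Hate-based instance; see @socialwg announcement <link>")
```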

Any user may start a discussion or make a proposal to change the moderation status of any instance.

Limiting

An instance may be *limited* (no posts from any of its members appear in our public feeds, and our members are prompted to approve follow requests from its accounts even if their own profile is public) in the case that

  • The instance is operated by its moderators in a way that is substantially at odds with the values articulated in our Bylaws, Code of Conduct, and prevailing norms, AND
  • The instance does not have a pattern of users or content that violates our CoC, endangers our members, or causes undue moderation burden, AND
  • The instance has significant interaction with our members.

Although limiting is less severe than defederation, members and moderators should raise it as a discussion in most cases, since the conditions above make it likely that the case is a moderation edge case or that opinion among the membership is substantially mixed.

Defederation

The moderation team can, at their discretion, defederate from instances that promote, or take no action against, accounts involved in any of the following:

  • Harassment, hate, doxing, or violence towards our members or others. [1]
  • Block evasion or other attempts to avoid the results of a moderation action or consensus-based decision.
  • Advertising or other systematic unwanted contact with members of our instance or others.

The moderation team SHOULD, and any member MAY, initiate a discussion leading to a proposal to defederate from instances involved with any of the following:

  • Corporate capture: instances run for profit or by for-profit entities will be treated with a presumption of defederation.
  • Surveillance capitalism, or bulk harvesting of data from our instance or others without explicit consent. Where bulk harvesting occurs through our public API, the Tech WG should block the offending IP addresses (see the sketch after this list).
  • Hosting multiple accounts that are the subject of multiple reports without taking any corrective action, or without appropriate communication with the moderation team.
  • Hosting bots that repeatedly fail to respect the NoBot and NoIndex hashtags.
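
For the bulk-harvesting case above, the Tech WG can block offending addresses at the firewall or reverse proxy, or through Mastodon's own IP block API. A minimal sketch of the latter, assuming Mastodon 4.0+ and a token with the admin:write:ip_blocks scope; the address range and comment are hypothetical:

```python
# Hedged sketch: blocking a scraper's IP range via Mastodon's admin API.
import requests

INSTANCE = "https://neuromatch.social"  # placeholder
TOKEN = "ADMIN_ACCESS_TOKEN"            # placeholder; needs admin:write:ip_blocks

def block_scraper_ip(ip: str, comment: str, days: int = 30) -> None:
    resp = requests.post(
        f"{INSTANCE}/api/v1/admin/ip_blocks",
        headers={"Authorization": f"Bearer {TOKEN}"},
        data={
            "ip": ip,                           # single address or CIDR range
            "severity": "no_access",            # deny all access from this IP
            "comment": comment,
            "expires_in": days * 24 * 60 * 60,  # seconds until the block lapses
        },
    )
    resp.raise_for_status()

# e.g. block_scraper_ip("203.0.113.0/24", "Bulk harvesting via public API")
```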


Footnotes

  1. Violence is difficult to define, and the moderation team should take context into account, including differences in power, position, and status. Within (US) law, talking about burning down a police station is allowed; talking about burning down the house of a similarly situated person on the fedi is not.