Moderator Covenant

How Hachyderm moderators moderate Hachyderm.

This is the set of principles that Hachyderm moderators agree will inform their decisions and judgment calls when creating and maintaining Hachyderm as a safe space and enforcing server rules. This is because, first and foremost:

Hachyderm moderators acknowledge the importance of server rules / Codes of Conduct that are complete and clear. Hachyderm moderators also acknowledge that the entirety of human behavior cannot be captured by an itemized list, no matter how many subsections it has, and therefore use the following principles to ensure that we are always able to take action even in situations where a reported infraction “falls through the cracks”.

  1. We will prioritize the vulnerable.
    All actions we take will prioritize the most vulnerable, full stop.
  2. We acknowledge that we will make mistakes.
    We acknowledge that we are not infallible. We will constantly be learning and growing, will respond to our mistakes with acknowledgement and care, and will do what is necessary to undo or mitigate the harm done.
  3. We will moderate with respect.
    We will communicate respectfully with users and accounts that have been reported for moderation.
  4. We acknowledge and understand that we are strangers on this pale blue dot.
    The reality is that the vast majority of user interactions are between strangers, even if familiarity increases with time. This means that in almost all reported situations, the reported user is a stranger to us as moderators and a stranger to the other users in the interaction (on a personal level, even if the account is recognizable). This means that while we will look at the reported user on the surface to try to understand possible intent, we acknowledge that it is not possible to use intent, or presumed / guesstimated intent, alone to inform what moderation action(s), if any, to take.
  5. We will prioritize impact over intent.
    Whenever we look into a reported interaction, we look at as much of the situation as we can see. This means we do due diligence to determine what factors, if any, are contributing to the situation and whether that situation is escalating or at risk of escalating. Since we acknowledge that We Are Strangers, we are doing this based on an understanding of people in general and the intersectionalities at play. Regardless of intent, or whether actions and words were purposeful, the targeted or affected person is still harmed. That’s why it is critical to prioritize impact and acknowledge the harm that was caused.
  6. We will trust, but verify.
    There is a saying that you need to believe people when they tell you who they are. Individuals and communities use the reporting feature to tell us about other individuals and/or communities who have announced who they are in some way, so that we can take appropriate action. There are also rare occasions where individuals will use the reporting feature(s) as a vector of harassment or oppression against a targeted user and/or demographic. We balance these two realities by trusting that reports are filed with good intentions, but verifying every time.
  7. We will hold Hachyderm users accountable for their actions.
    This principle applies specifically when the reported user is a Hachydermian. When we communicate rule violation(s), we will also communicate what actions, if any, are needed on your part. To put it another way: if you acted in a way that requires moderator attention, you must take action to un-require that attention. The most common pattern here will be asking you to delete problematic posts or similar. Note that we will not do this in situations where it conflicts with Prioritizing the Vulnerable or Making Safety the Sustainable State. Also note that sometimes the action isn’t deleting posts, but changing a behavior. Two common patterns here are:
    • Asking a reported user to do some light research into the topic area that caused them to be reported. Small steps, iterated over time, increase our collective knowledge and our community’s ability to be safe and open.
    • Reminding a reported user that they can always walk away from an interaction that is not going the way they intend.
  8. We will steward safe spaces to allow for the range of human expression and experience.
    Since people are more likely to report negative emotions and perspectives than positive ones, this principle is best explained with relevant examples:
    • We do not moderate people for being angry at systems of oppression functioning as designed, because that design is traumatic.
    • We do not moderate people for existing in public. This includes, but is not limited to, “acting Black”, “acting gay”, being visibly a member of a particular religion, and so on.
  9. We will not create the Paradox of Tolerance.
    Whenever there is a choice that needs to be made between the impact of individual actions and community safety, we will choose community safety.
  10. We will only take moderation action where doing so increases community safety and/or decreases community risk.
    For every report, we do an analysis to determine whether or not taking moderator action will improve community safety and/or decrease community risk. If the best action to take is to not react, then we will not react.
    For off-server users in particular, we also recognize the limits of what we are able to moderate. Users on the fediverse who did not agree to our server rules are not subject to them. In these cases we are solely evaluating what moderation action, if any, will protect our community and its members, rather than evaluating whether a user who never agreed to our specific rules is abiding by them.
  11. We understand that people need space and safety to grow.
    We understand that it is impossible for everyone to know everything, and that includes us. We do not expect our community to be experts on every facet of life, or experts in every form of social interaction.
  12. We will prioritize making safety the sustainable state.
    We will take actions that prevent users from being repeatedly reported for the same, or similar, infractions.
  13. We will take actions to prevent learning at the community’s expense.
    Specifically:
    • We will proactively learn and grow so that our growth as individuals and moderators does not come at the community’s expense.
    • We acknowledge that when a user has been reported specifically for being harmful in the community, they have already caused that harm. While we Understand That People Need to Grow, we will not allow that growth to happen at the expense of the community. That means that when a user is reported for harmful action(s), and we determine there is a risk of future harmful behavior and/or that the user is not displaying a growth mindset when prompted, we will choose action(s) that Prioritize Making Safety the Sustainable State.