1 - Moderation Actions and Appeals Process
Information about the different actions that moderators take and what to do if you’ve been moderated.
Here you will find everything you need to know about what to do if you’ve
been moderated, including:
- The specific actions we take and what they mean or imply
- How to appeal and how we work around limitations with the appeal process
First things first
We acknowledge that being moderated can be stressful. We do our best to
intervene only when necessary and in the interest of preserving Hachyderm as
a safe space. We acknowledge that we are human and that we can make errors.
We ask for your patience and understanding that when we approach a
situation, we are doing so as strangers moderating strangers.
For more information about what goes into how we interpret and enforce our rules,
please take a look at our Moderator Covenant and
Rule Explainer. Some of the language used below
will come from the Moderator Covenant in particular.
Overview
Although there are a few actions we can take as moderators, the most common
are:
- Warn
- Freeze and Suspend
Two additional, less commonly used, actions are:
- Limit
- Delete Post
Warn
The warning feature is part of the Mastodon admin tools and allows us
to send a message directly to you.
When you receive a warning, that means that the moderation team has decided:
- The impact of what you were reported for did not require a more significant
intervention, and …
- …based on the interaction(s), we believe that you will respond to the
warning with a growth mindset.
Essentially, the warning feature allows us to respond to a report
with a gentle nudge in the right direction. Warnings do not change your
ability to log in, use your account, or federate as a Hachyderm user.
It is important to understand that, unlike some other systems, a “warning”
is not something that is tracked for later punishment. This means that you
are not accruing “strikes”, or similar, with every warning you receive.
That said, please do keep in mind that we choose our actions to prevent
recurrence of the same actions that have caused the community harm. So while
there is not a strike system, this is not leeway to continue to do what you
were reported for without change.
When the moderation team sends a warning, we always send a message with
what we need from you. This may be a reminder to leave interactions that
are not going how you hope or intend, or it may be a reminder to disengage
if you’ve been asked to leave a conversation for whatever reason.
Freeze and Suspend
These are both actions that prevent you from using your Hachyderm account
normally.
- Freeze
You will be able to log in, but you will only be able to access your
account settings; a message will note that your account is still otherwise
online. Basically, this means that to an outside observer your account
is still “up”.
- Suspend
When your account is suspended, it will be taken offline. Your data,
including media, posts, followers, and following, will remain intact
for 30 days. After 30 days, it is automatically purged from the
server.
In all except the most egregious situations, whether a moderator suspends
or freezes an account is based on what we are seeing in the reported
interaction. Specifically:
- How severe is the impact of the reported interaction on our
community and other communities?
- Is the situation continuing to escalate, or at risk of doing so?
- Which action, freeze or suspend, reduces the risk to our community and others?
If your account was frozen, you should always file an appeal. If your account
was suspended, the situation of the suspension will determine if an
appeal is possible. (Skip to The Appeals Process)
We want to call out that there are times that we temporarily suspend an account
even if the rule broken is not severe. This would only happen when there is an
impact to the server or community that would warrant, and benefit
from, an immediate, visible action and a forced cool-down period.
Limit and Delete Post
Being limited means that you are hidden on our instance. Your posts
are all still visible to your followers and can still be discovered
when directly searched for. Your posts will not otherwise show up
in the Local feed. You will be able to federate with other instances
normally.
Delete Post is what it sounds like - we can delete the reported posts
and associated media (if that media is uploaded to Hachyderm).
We do not typically Limit accounts or Delete Posts. We have a couple of
reasons for this:
- We do not want the moderation process to be passive; essentially,
if you acted in a way that required intervention, we want to see
that you are willing and able to rectify the situation without
further intervention.
- In the case of Limit in particular, we do not want the process
to go on without resolution.
This means that, in general, we will use one of the other actions and
communicate what we’d need from you in an appeal process to reinstate
normal functioning of your account.
Reminder: not all moderation action is visible
Of the above actions, the only moderation actions that are visible
are if a moderator deletes a post or suspends an account. When an
issue is closed without action or when a user is warned, frozen, or
limited, the action is not visible to the reporting user or other
users.
The Appeals Process
You may respond to any moderator action with an appeal. In all cases except for
1) severe rule violations or 2) creating a banned account type,
it is in your interest to file an appeal to start or continue a conversation
about what you were reported for.
When not to appeal
There are only two situations in which filing an appeal will not be helpful. The
first is if the harm done to the community is repetitive (before it was caught) and
the impact and risk to the community are high; in that case there is no benefit to you
filing an appeal. This includes, but is not limited to, being in favor
of systems of oppression, posting illegal content, harassment, etc.
The other situation is if the account is a type that we ban on our
server and it was correctly flagged (please file an appeal if not).
Common types of banned accounts are those that don’t abide by our NSFW Policy and Monetary Policy.
For clarity: if your account was suspended (or frozen) due to being either
1) an unrecognized special account type or 2) not following the rules
for your account type (bots, companies, etc.), then you should file an
appeal.
When to appeal
If the reason that your account was flagged for a rules violation was
incorrect, you should file an appeal. We again ask for patience and
understanding as we work with you to correct our mistake in that case.
Some example cases:
- An account that is flagged as a company or business (a corporate account),
but is not.
- An account that is a specialized account type, but one that is not explicitly
listed in our Account Types. Currently, accounts that are not
specifically called out are not allowed. We recommend that anyone
interested in creating an account that is in a grey area reach out to
us at admin@hachyderm.io.
There are other cases where we use freeze and suspend as tools to help
de-escalate or otherwise resolve a situation in progress, but the freeze
or suspension does not need to be permanent. Example situations for this:
- A specialized account type not following the rules for that account type.
(e.g. bot posting more than the limit, not using hashtags, etc.)
- If a situation is escalating in a way that cannot be resolved without
stopping what is happening while in progress. (e.g. A user
accidentally spamming the server due to something misconfigured.)
Where possible and applicable, we try to include whatever needs to happen
to unfreeze or unsuspend an account. Commonly, with frozen accounts, we may
need you to agree to delete posts or similar to reinstate your account.
Whenever you file an appeal you should always email us due to limitations
in the appeals process. (See next section.)
Limitations of the appeals process
Whenever you want to file an appeal you should both file the appeal and
email us at admin@hachyderm.io.
The reason for this is that the Appeal feature in the admin interface
does not allow us to continue a conversation with you. When we receive an
appeal, we can only Approve or Reject. We cannot send an outbound
communication when we take either action, nor can we request additional
information to help us decide what action to take.
In order for us to know if your account was flagged incorrectly and why,
or for us to be able to reinstate your account if it was frozen or suspended
pending action we requested of you, we will need you to email us at
admin@hachyderm.io. Please also include a
summary of the situation in the appeal itself, as that will remain
tied to the appeal in the admin UI. This sets the initial context
and the expectation that there are corresponding emails for the appeal.
2 - Blocklists
Community information about Blocklists and how Blocklists are built.
The Basics of Blocklists
General Purpose
Blocklists are one way that a Mastodon instance can handle unwanted
content on the instance level. When a Mastodon domain is on the Blocklist,
this means that the server administrators have limited or completely
suspended activity with the server at that domain. The specific actions
are:
- Suspend
This is the most commonly known action, as it prevents all
instance-to-instance activity.
- Limit
Previously known as “silencing”. This means that accounts on
the home server can follow accounts on the limited server, but that general
content from that server does not show up on the Federated timeline.
- Reject Media
Media from the moderated server will not display on the
home server. This includes not only media in the posts themselves but
avatars and headers as well.
Mastodon describes these features in its documentation.
Moderator vs User Actions
Users can also take individual action to prevent themselves from seeing
unwanted content by blocking or limiting other accounts on the user level.
This is recommended for cases where the content is perhaps unwanted by
an individual user, but that content does not violate the home server’s
ethos.
Mastodon has additional documentation about actions that individual
users can take on their Dealing With Unwanted Content documentation page.
How Hachyderm’s Blocklist is Built
At Hachyderm, we do our best to balance what actions should be taken at the
instance level and what should be handled at the user level - both when it
comes to our own users and when we receive reports of users on other
instances or the instances themselves. The vast majority of the moderator
action we have taken on servers on our Blocklist is to either
silence/limit or suspend/block. The domains that are included on this list are:
- Curated from a variety of published Blocklists on other Mastodon instances
- Added based on user reports. This can mean either:
- A domain has been reported, researched, and moderated individually, or
- A high volume of reports regarding several users on an instance have led
to an instance being researched and moderated
Although suspension is the most well-known and discussed moderation action,
domains may be limited as well. For example, we might limit a server that
has several bots that are taking over the Federated timeline, but not
suspend it, so that users can continue to follow individual accounts.
Concerns that go into building the Blocklist
There are a few top-level concerns that go into determining whether to add a
server to the Blocklist:
- Does the ethos of a specific server violate the ethos of this server?
A few easily understood examples would be if a server is anti-Black, anti-queer,
or endorses either direct or dog-whistled hate speech and content.
- Is there a security concern around this server?
As a broad example, if we see malicious traffic coming from a specific
server or servers.
- What risk does the server pose to our users?
We prioritize the safety and experience of our users that are in historically
underrepresented groups.
- What do our user reports look like? Are they user level on a given
server or are they indicative of malicious patterns server-wide?
- What level of moderation is needed? Limit/silence or block/suspend?
It is our opinion that it is in our users’ best interest to federate with
as much of the Fediverse as possible so that we can all share our joys,
sorrows, growth, learning, etc. with each other.
Our goal with the maintenance of the Blocklist is to ensure that all of
our users are safe on Hachyderm. That means that when we move forward with
taking moderation action on a server, we will take the best course of
action to ensure that safety. We will prioritize the safety of our
marginalized users over the broader experience of a completely open
and unmoderated (at the server level) Fediverse.
When we take moderation action against a server, we consider:
- The items called out in the list above
- Balancing open participation with curating a safe space
Concerns that go into transparency around the Blocklist
Some servers have reasons attached to their moderation action and
others do not. In addition, we may or may not announce when we
limit/silence or block/suspend individual instances. Why is this?
When we choose what level of notification to send, and how
transparent to be, with moderation actions we consider:
- The impact of the change
- The interest level in the change
- The risk of publicizing/being transparent about the change
For example, if we were to take moderation action against
any of the large, popular Mastodon instances we would err on
the side of transparency as this would be a significant and
user impacting event.
Whenever we take action on a server that has malicious activity,
and this can be in the form of attacks on our server or in the
form of social attacks like stalking and harassment, we err on
the side of safety. This means what level of information
we provide and how loudly (notifications, etc.) we provide
it will be based on what is safest for all of our users.
The vast, vast majority of instances fall between these two
extremes and thus the resulting decisions do not fall
perfectly in the “fully transparent” or “completely silent” buckets.
This means:
- Servers may not have reasons attached to their moderation
decisions, but that doesn’t mean they are a security or safety concern.
If we only omitted reasons for security concerns, the absence of
information would immediately out those servers as such.
- Not all changes to individual server status will be loudly announced, but some
will. We will use the decision-making process outlined above when
announcing.
- Not announcing that a server has changed status does not
mean that server is or was a cause of concern. It can
also, and frequently will, mean that the server’s size doesn’t impact
enough of our users to warrant a large announcement.
Requesting Moderation Changes for a Server
What to do if there is a domain on the Blocklist in error
We are all human and are prone to mistakes. If there is a domain that is
moderated on our Blocklist that seems to be in error, please open a
GitHub Issue in our Community repo
to request that we take another look at the domain. Please include
as much relevant context as you can to help us make our decision.
Note that depending on the circumstances, and as outlined above, we may
not be able to be fully transparent with our decision - but we commit
to erring on the side of transparency with these reports as often as possible.
For more information about how to file a report in our community
repo, please take a look at our Reporting
Documentation.
What to do if you would like us to moderate a server
If there is a server that is not currently moderated, i.e. neither
limited/silenced nor banned/suspended, then please file a report
via the Hachyderm (Mastodon) UI or a GitHub Issue in our Community repo
for us to take a look at that domain or domains. As before, please
include as much context as possible. If there is a concern around the
domain(s) you would like to report that would be risky to report in our
GitHub Issue tracker, please email us at admin@hachyderm.io.
For more information about filing reports and how to choose
between the Mastodon UI and the GitHub Issue tracker, please look
at our Reporting Documentation.
What not to do in either of these cases
There are far, far more Hachydermians than moderators. We do not follow
tags, posts, etc. to make changes to our Blocklist - we only use the sources
outlined above.
3 - Moderator Covenant
How Hachyderm moderators moderate Hachyderm.
This is the set of principles that Hachyderm moderators agree will
inform their decisions and judgment calls when
creating and maintaining Hachyderm as a safe space and enforcing
server rules. This is because first and foremost:
Hachyderm moderators acknowledge the importance of server rules /
Codes of Conduct that are complete and clear. Hachyderm moderators
also acknowledge that the entirety of human behavior cannot be
captured by an itemized list, no matter how many subsections it has, and
therefore use the following principles to ensure that we are
always able to take action even in situations where a reported
infraction “falls between the cracks”.
- We will prioritize the vulnerable.
All actions we take will prioritize the most vulnerable, full stop.
- We acknowledge that we will make mistakes.
We acknowledge that we are not infallible. We will constantly be
learning and growing, and will respond to our mistakes with
acknowledgement and care, doing what is necessary to undo or
mitigate the harm done.
- We will moderate with respect.
We will handle our communications with users and accounts that
have been reported for moderation with respect.
- We acknowledge and understand that we are strangers on this pale blue dot.
The reality is that the vast majority of user interactions are
between strangers, even if familiarity increases with time. This
means that in almost all reported situations: the reported user is
a stranger to us as moderators and that the reported user is a
stranger to other users in the interaction (on a personal level,
even if the account is recognizable). This means that while we
will look at the reported user on the surface to try and
understand possible intent, we acknowledge that it is not possible
to use intent or presumed/guesstimated intent alone to inform
what moderation action(s), if any, to take.
- We will prioritize impact over intent.
Whenever we look into a reported interaction, we look at as much
of the situation we can see. This means we do due diligence on
seeing what, if any, factors are contributing to the situation and
if that situation is escalating or at risk of escalating. Since we
acknowledge that We Are Strangers, that means we are doing this
based on an understanding of people, in general, and the
intersectionalities at play. Regardless of intent or whether
actions and words were purposeful, the targeted or affected person
is still harmed. That’s why it is critical to prioritize impact
and acknowledge the harm that was caused.
- We will trust, but verify.
There is a saying that you need to believe someone when they tell
you who they are. Individuals and communities make use of the reporting
feature to tell us about other individuals and/or communities who have
announced who they are in some way so we can take
appropriate action. There are also rare occasions where
individuals will use the reporting feature(s) as a vector of
harassment or oppression against a targeted user and/or demographic.
We balance these two realities by trusting that reports are
filed with good intention, but verifying every time.
- We will hold Hachyderm users accountable for their actions.
This is specific to the moderation context of when a reported user
is a Hachydermian. When we communicate rule violation(s), we will
also communicate what (if any) actions are needed on your part. To
put it another way: if you acted in a way that requires moderator
attention, you must take action to un-require that attention. The
most common pattern here will be asking you to delete problematic
posts or similar. Note that this will not be done in situations
where it comes into conflict with Prioritizing the Vulnerable or
Making Safety the Sustainable State. Also note that sometimes the
action isn’t deleting posts, but changing a behavior. Two common
patterns here are:
- Asking a reported user to do some light research into the
topic area that caused them to be reported. Small steps
iterating over time increase our collective knowledge and
our community’s ability to be safe and open.
- Reminding a reported user that they can always walk away
from an interaction that is not going the way they intend.
- We will steward safe spaces to allow for the range of human expression and experience.
Since people are more likely to report negative emotions and
perspectives than positive, this one will be explained by relevant
examples:
- We do not moderate people for being angry at systems of oppression
functioning as designed, because that design is traumatic.
- We do not moderate people for existing in public. This includes,
but is not limited to, “acting Black”, “acting gay”, being visibly
a member of a particular religion, and so on.
- We will not create the Paradox of Tolerance.
Whenever there is a choice that needs to be made between the impact of individual actions
and community safety, we will choose community safety.
- We will only take moderation action where doing so increases
community safety and/or decreases community risk.
For every report, we do an analysis to determine whether or not taking
moderator action will improve community safety and/or decrease
community risk. If the best action to take is to not react,
then we will not react.
For off server users in particular we also recognize the limits
of what we are able to moderate. Users on the fediverse who did not
agree to our server rules are not subject to them. In these cases
we are solely evaluating what, if any, moderation action will protect
our community and its members rather than evaluating if a user
who never agreed to our specific rules is abiding by them.
- We understand that people need space and safety to grow.
We understand that it is impossible for everyone to know
everything, and that includes us. We do not expect our community
to be experts on every facet of life, or experts in every form of
social interaction.
- We will prioritize making safety the sustainable state.
We will take actions to prevent users from being reported for the
same, or similar, infractions.
- We will take actions to prevent learning at the community’s expense.
Specifically:
- We will proactively learn and grow to prevent our growth as
individuals and moderators from coming at the community’s
expense.
- We acknowledge that when a user has been reported specifically
for being harmful in the community, they have already caused that harm. While we
Understand That People Need to Grow, we will not allow that growth
to happen at the expense of the community. That means that when a
user is reported for harmful action(s) and we determine there is a
risk of the behavior recurring, and/or that the user is not displaying a
growth mindset when prompted, we will choose
action(s) that Prioritize Making Safety the Sustainable State.
4 - Reporting Issues and Communicating with Moderators
How to report issues and interact with the moderation team.
There are three ways to correspond with the Hachyderm Moderation
team:
- The report feature in the Mastodon UI
- The GitHub Community Issue tracker
- Email at admin@hachyderm.io
In general, the Mastodon UI (i.e. the “report” feature on
Hachyderm.io) is used for reporting specific posts, users, and
domains. The GitHub Community Issue tracker is for other types of
reports as well as raising other questions and conversations with
the Hachyderm Moderation Team. Optionally, users may also send us
information via email if neither Mastodon reports nor the
GitHub Community Issues are appropriate for the conversation.
Expectations
Our server rules still apply when filing a report or otherwise
communicating with the moderation and infrastructure teams.
How and When to use Email
The moderation team should, in general, only be contacted via
email to:
- Supplement a report in the Mastodon UI
- Provide a report or other communication that cannot be in a
public forum, like the GitHub Issue tracker, and cannot be
submitted via the Mastodon UI.
- Request information about creating a specialized account.
In short: please prioritize using the Mastodon UI and/or GitHub
issues as often as possible. That said, if you need to reach
out to the admin team for any of the above situations or another
grey area, please use admin@hachyderm.io.
How and When to use the Mastodon UI
The Mastodon UI, i.e. what you see when you’re using Hachyderm.io
or your home Mastodon instance of choice, should generally be used
for issues that can be reported via individual posts. This is
typically used for:
- Reporting individual posts but not the user overall
- Reporting a user via their posts
- Reporting a domain via the posts of their users
For information about the report feature, and what we see when you
send us a report, please look at our Report Feature doc page.
What you should include in your report
- Include specific posts where relevant
- This includes if you’re reporting a specific user as an individual or as
a general representation of a server dedicated to that type of behavior.
- Note that it is important to include posts, where relevant, as the report
feature does keep the posts even if the user / their server admin deletes
the posts or suspends the user’s account. So if a user has been posting
and deleting those posts, we won’t see them by looking at the timeline, but
the UI will keep them if you include them in the report.
- Always include the context when you are prompted for Additional Comments,
even if it is obvious. This can be as succinct as “spam”.
- If you are sending us a report for a server violation and the posted
content is not in English, please supply the translation and relevant
context. In many cases, online translation tools can only directly
translate the words but not the commonly understood (or dogwhistled)
meaning. If you run out of characters, please submit the report and tell
us you have emailed us, and email us at admin@hachyderm.io.
- If you are sending us a report with a short / one-word description, please make sure it
correctly captures the situation. If the report’s description does not align
with what is included in the report, we will close the report.
- If you are sending us a report of problematic content where the visuals
may be traumatizing in and of themselves, you can choose not to include
the posts but please always include what we will see when we look
at the reported user’s account or the reported server. We have moderators
opt-in to tasks like these when they appear.
What to know about the Additional Comments:
The most important limitation you should know is
that the Additional Comments field has a character limit of 1000
characters (as of this writing). If you need to supply more
context, or the translation takes more than 1000 characters,
please:
- File the report with what you can
- Make sure to leave enough space to tell us there is a
supplementary email
- Email us at admin@hachyderm.io
Please note: if we receive an empty report and cannot see a
clear cause, we will close the report without moderator action.
Limitations of the Mastodon Admin Interface
When we receive a report, we cannot follow up with the reporting user to ask for
additional information using the admin tools.
How this impacts you:
If you are reporting an issue and do not include enough
information and/or a way for us to get in touch with you to
clarify, we might not be able to take the appropriate action.
So please do make sure to include posts as needed, comments and
context, and email us at admin@hachyderm.io
as needed.
How and When to use the GitHub Issue Tracker
The Community’s GitHub Issues,
a.k.a. Issue Tracker, is for communicating with the moderation
and infrastructure teams, as needed. To create an issue:
- Go to github.com/hachyderm/community/issues
- Click on “New Issue” in the upper right side.
- Select one of the issue templates that applies to you / your situation
- Enter the information needed on the Issue. Depending on the template, there
may be some prompts for what information should be included.
The Community Issues can still be used to report domains, as you would do in the UI.
It can also be used to request emoji, report a service outage (you can also use
omg.hachyderm.io for this),
request updates / changes to the docs, and so on. There are issue
templates for the most common issues that prompt users for the information
we need to respond to requests efficiently. Depending on the
nature of the request / discussion, a member of the infrastructure
team and/or the moderation team will respond.
5 - Exceptions and Rule Changes
Steps to take to request an exception to an existing rule or a rule change.
The Rules
In accordance with the Moderator Covenant:
Hachyderm moderators acknowledge the importance of server rules / Codes of Conduct that are complete and clear. Hachyderm moderators also acknowledge that the entirety of human behavior cannot be captured by an itemized list, no matter how many subsections it has, and therefore use [a set of] principles to ensure that we are always able to take action even in situations where a reported infraction “falls between the cracks”.
This means that we will always do our best to take action on reports, but we also acknowledge that there are situations where the rules may not be clear or may not apply. In these cases, we will do our best to take action that is in the spirit of the rules and the spirit of the Hachyderm community. In addition, we will work to expand and clarify the rules as the community grows and its needs change.
Exceptions
If you would like to request an exception to the rules, please email us at admin@hachyderm.io with the following information:
- Your username
- A link to or description of the rule that you are impacted by
- A description of why you believe you are an exception
Rule Changes
There are two routes for changing rules:
- Create a new issue in the Hachyderm Community’s GitHub Issues
- Select “Documentation Request” if you believe that the documentation associated with a specific rule needs to be updated
- Select “General Suggestions” if you would like to propose a new rule or a modification to an existing rule
- Create a pull request with the proposed update
- When viewing the documentation, click the “Edit this page” link in the right-hand menu of the page (only visible on desktop)
- Make the changes you would like to see
- Commit the changes to your fork
- Create a pull request with the changes