pics.lo476ni.workers.dev (188.114.96.3)

Submitted URL: http://pics.lo476ni.workers.dev/moderation/4405266071063-100-an-intro-to-the-dma
Effective URL: https://pics.lo476ni.workers.dev/moderation/4405266071063-100-an-intro-to-the-dma
Submission: On August 02 via api from US — Scanned from NL




SAFETY AND MODERATION

Discord provides tons of tools, resources, and tips so you can keep yourself and
your members safe. Flex your moderation muscles here!






DEVELOPING SERVER RULES

Safety and moderation

One of the most important parts of setting up your Discord server is determining
“the law of the land.” In other words, by what rules do you want your server
members to abide? Some of these rules should be based on Discord’s community
guidelines and terms of service, while others are more related to general
internet etiquette or even your server’s specific needs. Once you determine the
general principles under which your server will operate, you will need to
determine how much detail you want to provide in your rules and how to enforce
them.


DISCORD COMMUNITY GUIDELINES AND TERMS OF SERVICE

The Discord Community Guidelines govern user behavior on Discord and outline
what is and isn't acceptable in all communities. While many of these rules are
common sense, they should still be incorporated into your own rules so that
members have a clear expectation of how they should behave. Some key prohibited
behaviors include:

 * No doxxing or harassment (especially threats on someone’s life/property) or
   encouraging self harm
 * No spamming, phishing, or attempting to steal another user’s account (broadly
   speaking, one could consider this “no spamming or scamming”)
 * No child porn, revenge porn or gore/animal cruelty anywhere in the server,
   while other NSFW content should be limited to properly marked channels
 * No sharing pirated content

The Discord Terms of Service outlines a few additional caveats to using
Discord, including the following:

 * You must be 13 years or older to use Discord
 * Distributing “auto,” “macro,” or “cheat utility” programs as well as
   providing hacked/modded game software through Discord is prohibited.

Furthermore, if you are one of the lucky individuals moderating a Partnered or
Verified server (or hope to have your server partnered/verified in the future),
you will need to consider the additional restrictions imposed on these servers
in the Discord Partnership Code of Conduct.

 * Discriminatory jokes and language related to one’s race, age, gender,
   disability, etc. are prohibited.
 * Content that is explicitly pornographic, depicting sexual acts, or depicting
   nudity is prohibited anywhere on a partnered server.
 * Content that includes discussions of nudity, sexuality and violence is
   considered NSFW content and should occur only in the properly marked
   channels.

Any infractions of the Community Guidelines or Discord Terms of Service should
be reported to Discord Trust and Safety with the relevant message link for
proof.


GENERAL ETIQUETTE

While Discord’s guidelines and terms of use cover more extreme cases of bad
behavior, there is still plenty of other behavior that you may want to consider
regulating on your server. Your server rules play an important part, not just in
determining how your mod team will react to users, but also how your users will
interact with each other, and ultimately your server culture. Therefore, your
rules should strive to create a fun and welcoming environment for people of all
races, genders, and sexualities while balancing community needs with the need
for order. In addition to the rules and guidelines set forth by Discord, you
may want to consider prohibiting the following behaviors:

 * Trolling - Trolling refers to the act of disrupting the chat, making a
   nuisance out of yourself, deliberately making others uncomfortable, or
   otherwise attempting to start trouble.
 * This is a good catch-all rule to have because it lets you take action
   against users who are "acting out" even when their behavior doesn't break a
   more specific rule.
 * Discussing Offensive/Controversial Material - This includes topics such as
   politics, religion, acts of violence, rape, suicide/self harm, school
   shootings, and other serious topics; as well as hate speech including racial
   slurs or derivatives thereof, sexist or homophobic statements, and other
   similar types of behavior.
 * The details of how you define this rule can vary depending on how strictly
   you feel it needs to be enforced (for example, if you are following the
   partner code of conduct, you may also want to list "ableist slurs" as
   prohibited).
 * Elitism - Members should refrain from insulting or belittling others based on
   the games or versions of games that they choose to play.
 * This is especially applicable to Discord servers where one game may be
   distributed among multiple regions from a gameplay or version perspective.
 * Disrespecting Server Staff - Insulting the server moderators or becoming
   belligerent after being warned.

While measured discussion regarding the reasons for being warned should be
encouraged for the education of the user and general public, at some point it
may be necessary to shut down the discussion or take it to DMs.

 * Incitement - Encouraging the breaking of rules, inciting others to be
   blatantly rude and offensive, or otherwise promoting and/or encouraging
   conflicts between other members.
 * Punishment Evasion - Users should not attempt to evade the consequences of
   their actions such as using an alternate account to bypass restrictions.
 * It is recommended that this be punishable with an instant ban, as this is the
   type of punishment that is most difficult to evade compared to the other
   options available.
 * Ban evasion is generally considered harassment by Discord’s Trust and
   Safety team, and can be reported to them for further action.
 * Inappropriate User Profiles - For ease of communication and the comfort of
   those in chat, the profile picture, custom status, and display names (i.e.,
   the name that shows up while reading the chat) of users should be in line
   with the rules of the server. Furthermore, the display name should also be
   easily readable, mentionable, and should not attempt to imitate other users
   or game development staff or hoist the user to the top of the server's
   online list.
 * In effect, this also means that the user profile should be safe for work, not
   contain any offensive or illegal content, and not be used to harass others or
   spam/scam.
 * “Hoisting” refers to using characters like exclamation points to make it so
   that you appear at the top of the online members list. Some people will put
   multiple characters like this in an attempt to be at the top of the online
   list. While it is not always feasible to enforce against users that don’t
   often chat, it is good to have a policy to enforce this if you see someone
   chatting with a hoisted display name. But, if this doesn’t bother you, you
   can remove this provision.
 * Advertisement - Similar but not quite the same as spam, this refers to users
   attempting to promote their own social media/discord servers/other content
   creation channels.
 * Usually, it is good to invite users to talk to a moderator privately if they
   want to advertise something.
 * Speaking languages not understood by the server - Essentially, users should
   be prohibited from communicating in a language outside of the server’s
   official language.
 * This makes it easier for moderators to moderate the server by ensuring that
   they understand the conversations that are happening and prevents users from
   trying to fly under the radar by speaking in languages moderators don’t
   understand.
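The "hoisting" provision above can be enforced with a small helper that flags
and cleans offending display names. This is a minimal sketch: the set of
characters treated as hoisting is an assumption here, since the exact sort
order of the member list isn't specified in this article.

```python
import string

# Assumed set of characters that sort above letters in the online member
# list -- extend or trim this to match what you actually see on your server.
HOIST_CHARS = set(string.punctuation) | {" "}

def is_hoisted(display_name: str) -> bool:
    """Flag names that lead with punctuation to jump the online list."""
    return bool(display_name) and display_name[0] in HOIST_CHARS

def dehoist(display_name: str, fallback: str = "dehoisted") -> str:
    """Strip leading hoist characters; use a fallback if nothing is left."""
    cleaned = display_name.lstrip("".join(HOIST_CHARS))
    return cleaned or fallback
```

A moderation bot could run `is_hoisted` only when a user chats, matching the
advice above about not enforcing against users who rarely talk.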


SERVER SPECIFIC CONSIDERATIONS

Your server will also have needs of its own not covered in the previous two
sections, and that’s ok! It is perfectly normal to set up channel-specific rules
or to even override certain server rules in certain channels. Some examples
include:

 * An artist channel where artists are allowed to advertise their art creation
   profiles (e.g., Pixiv, DeviantArt).
 * An image spam channel where people can flood the channel with images.
 * A “current events” channel where people can discuss some controversial topics
   in a civil fashion.
 * An “on topic” chat where people should specifically only talk about a certain
   thing, compared to an “off topic” chat where general conversation about
   anything is allowed.
 * A game-related friend request channel or “carry” where users that post there
   should be expected to be pinged frequently for assistance in a way that is
   not considered harassment/spam.

You will need to carefully consider your server’s specific needs when coming up
with channel rules or other server-wide rules.


RULE ENFORCEMENT

Establishing your rules is all well and good, but ultimately a moot point if you
don’t enforce them. Moderators should carefully consider how they want to
enforce their server rules. Some possible systems include:

 * Case by Case - Punishment is more subjective and dependent on the nature and
   frequency of transgressions. While this system gives moderators a large
   amount of flexibility, accountability can be difficult due to the lack of
   standards and certain moderators may punish users differently. This may be
   better suited to smaller servers where the moderation tasks tend to be
   leaner.
 * Infraction Based - Similar to a “three strikes and you’re out” system, users
   are punished according to the number of times they break the rules. Users may
   receive a mute after a certain number of warnings followed by a ban. While
   this system is great for accountability, it does not account for the severity
   of the transgressions involved.
 * Points based - If you’d like the accountability of the strikes/infraction
   based system coupled with the flexibility of the case by case system,
   consider a points based rule enforcement structure. In this system, each rule
   is worth a certain number of points based on the importance and severity of
   breaking the rule where moderators can adjust the point value with an
   additional explanation for accountability.
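The points-based system described above can be sketched in a few lines. The
rule names, point values, and action thresholds below are illustrative
assumptions, not Discord defaults; the key ideas from the text are that each
rule carries a weight and that any moderator adjustment requires a logged
explanation for accountability.

```python
from collections import defaultdict

# Hypothetical per-rule point values -- tune these to your own rules.
RULE_POINTS = {"trolling": 2, "slurs": 5, "advertising": 1}

# Hypothetical thresholds, checked from most to least severe.
THRESHOLDS = [(10, "ban"), (6, "mute"), (3, "warn")]

class PointTracker:
    def __init__(self):
        self.points = defaultdict(int)
        self.log = []  # (user, rule, points, note) kept for accountability

    def record(self, user, rule, adjustment=0, note=""):
        """Record an infraction; adjustments must carry an explanation."""
        if adjustment and not note:
            raise ValueError("point adjustments require an explanation")
        pts = RULE_POINTS[rule] + adjustment
        self.points[user] += pts
        self.log.append((user, rule, pts, note))
        return self.action_for(user)

    def action_for(self, user):
        for threshold, action in THRESHOLDS:
            if self.points[user] >= threshold:
                return action
        return "none"
```

For example, a first advertising infraction yields no action, a later trolling
infraction crosses the warning threshold, and an adjusted slur infraction can
push the total to a ban.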

APPEALS

If a user thinks their warning is unfair, they may lash out in chat or in the
feedback channels of the server. It is important to have a way users can easily
discuss their warning with the server moderators. A more in-depth explanation
of appeals can be found in the Ban Appeals article.


BRINGING IT ALL TOGETHER

From creating your rules to developing a discipline system, you should now have
a set of rules that incorporate the important Discord-wide rules, rules that
make your server a welcoming place, and rules specific to your server to help
things run smoothly. By enforcing these rules clearly and consistently with an
accountable tracking system and transparent appeal process, you should be well
on your way to a server you can be proud of!


MODERATOR RECRUITMENT

Safety and moderation

One of the most difficult aspects of being a Discord server moderator can be
finding new people to help you moderate your community. You need to find
individuals that match in personality, mindset, knowledge, availability, and
most importantly, trust. While much of this differs between servers and teams, I
hope to cover the most important parts as well as give you an idea of what the
common questions are and what to watch out for.


DO YOU NEED MORE MODERATORS?

The first thing to ask yourself or your moderator team before bringing in
another team member is whether you actually need additional help. For example, a
server that provides technical support or assists members in a more personal
fashion may require more moderators than a community server for a mobile game
even if the number of members is the same. An additional and important factor to
consider is moderator burnout.

Placing too many responsibilities on too few moderators can quickly cause
members of the team to lose interest and feel like they do not have enough time
to commit to moderation. In some cases, it can actually be beneficial to have
too many mods rather than too few, but it is important to remember there is no
perfect number of moderators to have on your server.

The most important thing to keep in mind when recruiting moderators is the
purpose moderators have in your server and the extent to which those duties are
(or aren’t) currently being fulfilled. For instance:

 * Are there time periods in your server where moderators are unavailable to
   answer questions or resolve incidents?
 * Are user reports of bad behavior going unactioned for too long?
 * Is it unfeasible to implement additional automoderator measures to reduce the
   number of incidents?

You may also have specific duties that you require of your moderators beyond
typical server moderation. Be sure to analyze these as well and determine if
there is any shortfall between what your mod team needs to do and what it is
currently able to do.

For example, your moderators might also be the ones handling technical support
requests or contributing to another site. If you need people for a specific
task, it might make sense to create a separate role for them.


SELECTION PROCESS

If you decide that you do need new mods, an important part of recruiting them
is having a well-defined selection process. Here are a few things to consider
when establishing this process:

 * How are candidates selected: through recruitment or mod applications?
 * How do you vet people that might not be a good fit?
 * How is the current moderation team involved in this decision?
 * What permissions do you want to give to newer staff?


CANDIDATE SELECTION

Selection is a highly subjective topic; some servers accept new team members
via a vote, some have rules where the decision has to be unanimous, and others
are more open. In all cases, it's important to consider whether the person is
someone you and your team would be comfortable working with.

There are three main ways a candidate might be brought to the mod team for
acceptance. These can be mixed and matched, but it's usually recommended to
have only one official method:

 * The most common method in smaller servers is the owner selecting new mods.
   This works in the earlier stages, but as the server grows in number of staff
   it may discourage healthy discussion about that person that may be relevant
   (like behavior in other communities that might show who that person really is
   or things that might hurt the reputation of the mod team as a whole).
 * The second most common is having an application form. While this is effective
   in obtaining the information that you are looking for, it might attract
   people that just want to be a moderator for the power or status. It will also
   change their behavior in the main chat as they know they have eyes on them,
   which might make it harder for the mod team to evaluate how truthful the
   responses to the form were.
 * The main advantage of this is that it would let you modify the form to find a
   person for a specific purpose and might include an optional follow-up
   interview. Be careful about “application floods” and have methods to filter
   low-quality applications out (like captchas, questions about your server
   where if you get it wrong it gets automatically rejected, random questions to
   prevent bots, or small tests like type your timezone in a specific format).
   You can also build in a basic test of competence: if applicants can't manage
   the simple elements of the form, they won't manage as moderators.
 * Finally, you can have the members of your current moderation team recommend
   users as candidates. As the user is not aware that they are being assessed
   for staff, it means that their behavior in chat is unchanged from their
   typical behavior. With time and observation, this allows you to find the
   answers to the questions that would have been on your form along with extra
   red or green flags. It also allows you to find people that would never apply
   for staff in a form due to reluctance or lack of confidence, but that would
   be a great fit for your moderation team.
 * The drawback to this process is that it tends to introduce more work in the
   vetting stage, as it's not as linear as an application form. Additionally,
   it might result in a more limited pool of candidates.
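The "application flood" filters suggested above (format tests, server-knowledge
questions, minimum-effort checks) can be automated with a small screening pass.
This is a sketch under assumptions: the field names and the UTC-offset format
for the timezone test are invented for illustration; adapt both to your own
form.

```python
import re

# Assumed form fields -- adapt to your own application form.
REQUIRED_FIELDS = ("username", "timezone", "why_moderate")

# Example "small test": ask applicants to write their timezone as a UTC
# offset such as "UTC+2" or "UTC-05:30"; wrong formats are auto-rejected.
TIMEZONE_RE = re.compile(r"^UTC[+-]\d{1,2}(:\d{2})?$")

def screen_application(form: dict) -> list:
    """Return a list of rejection reasons; an empty list means it passes."""
    problems = []
    for field in REQUIRED_FIELDS:
        if not form.get(field, "").strip():
            problems.append("missing field: " + field)
    tz = form.get("timezone", "").strip()
    if tz and not TIMEZONE_RE.match(tz):
        problems.append("timezone not in the requested format")
    if len(form.get("why_moderate", "").split()) < 10:
        problems.append("answer too short to evaluate")
    return problems
```

Applications that fail the screen never reach the mod team, which keeps the
human review focused on plausible candidates.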


VETTING

An important consideration when selecting moderators is determining if the
candidate is able to do their job effectively. A lot of server owners pick their
own friends to mod their servers, which might result in personality clashes,
unproductive discussions about rules, or inactive mods that can’t meaningfully
contribute to discussions.

A good question to ask yourself is: “Do I trust this person?” For example, you
might not want to suggest anyone that has not been actively helping for at least
a year. During that year you can come to know the person, and it can help you
determine if they are more interested in the good of the community as a whole
and not just a role. You are looking for experience and willingness to improve
the community.

External research can also be beneficial, such as a quick Google search that
could surface other communities this person is involved with, or asking about
their experience on the application form. If you find them in moderation roles
elsewhere, it may be worth speaking to the leadership of those communities to
get their opinion on the potential moderator's work ethic and general attitude.

Ascertaining someone’s motive for becoming a moderator is another thing to
consider. Think about why they want to be involved in your community in this
way and what is motivating them. Some points to think about include:

 * Are they just trying to get more status?
 * Do they have new ideas or contacts, but don’t necessarily need to be part of
   the staff team?
 * Do they have a history of problems with specific users that might bias their
   decision making? Is that a deal breaker, or do they seem capable of
   deferring those cases to uninvolved moderators?

Be mindful that asking the user for their motives directly or giving any hints
about potential promotions might change their behavior. Instead focus on looking
for clues on how they interact with the rest of the community in order to
determine their reasons.

That being said, here are some things to keep in mind when deciding how to vet
a potential mod. Vetting users in this manner may be a better fit for larger
communities and less applicable to smaller, more intimate servers. The vetting
process differs depending on the size of the server and its needs, but
regardless of server size, candidates should show a dedicated, long-term
interest in and desire to help the community.


After you’ve made your selections and you (and your mod team!) are comfortable
with having them join the team, it’s time to consider how to onboard them.


ONBOARDING

One of the most important parts of bringing on new people is the onboarding
process. A well-thought-out process can make people effective faster, with
minimal misunderstandings and mistakes from the outset. Utilizing the techniques
and implementing the tips in 302: Developing Moderator Guidelines can help you
create a guideline for your moderation team that makes onboarding easier for all
parties involved.


CONCLUSION

The efforts put into the early stages of moderator management pay off in
multitudes down the line. Having a deliberate and carefully thought out process
for selecting your mods ensures that your moderation team grows in a stable and
effective way with your server. The foundation of any good community is a well
moderated community, and the foundation for a well moderated community lies in
having great moderators.


BAN APPEALS

Safety and moderation

Creating your server and establishing rules are necessary parts of starting a
community. Likely sooner rather than later though, you’ll have to deal with
users who refuse to obey whatever rules you’ve set. These individuals can be
dealt with in a number of ways, including warning, muting, or even banning them.
Regardless of the consequences, users may want to appeal a moderator’s action to
prove that they did not do anything wrong, do not deserve the punishment given
to them, or argue that the punishment should be less severe than initially
prescribed.


WHY SHOULD YOU CONSIDER AN APPEAL SYSTEM?

For most types of warnings, an appeal can be as simple as messaging a moderator
or sending a message in an appeal channel, if one exists. However, because a ban
removes the user from the server and prevents them from returning, dealing with
those ban appeals requires extra consideration from the mod team.

For example: A user is banned for six months due to spamming racial slurs but
feels that they have learned their lesson and should be allowed to return after
three months. How should they go about reaching out to the moderator team to
communicate their intentions and make their case? How should the mod team intake
this information and send a response?

We all make mistakes in life and your members or moderators are no exception to
that rule. Whether it is a user realizing what they did wrong or a moderator
making a mistake, an appeal system will provide a clearly documented method for
the user to get a review of their case by the server’s moderation team. Without
an established system, users may try to direct message a moderator to appeal,
which may not lead to a fair evaluation, or may try to evade the ban altogether
with an alternative account.


TYPES OF APPEAL SYSTEMS

The first step to adding an appeal system to your server is examining the
different possibilities that are available to you. To pick the appeal system
that is right for your server you should consider the size and type of server in
order to properly match it to your use case. Whatever method you choose, it
should always be clearly communicated to your users.

EMAIL

Ban appeals can be handled via email. For example, you can set up a simple
email address for people to message, such as appeals@yourdomain.com.

Pros

 * You can decide whether users must follow a set format or simply write
   whatever they think is necessary
 * Allows you to take advantage of your mailbox’s sorting, labeling, and
   prioritization features
 * Your mailbox will always be up and running (as long as your mail provider
   is), so Discord outages or downtime are not a concern

Cons

 * Appealing users will need to share an email address with the staff team
 * The mailbox is susceptible to spam
 * There is no sure way to verify that the sender is who they claim to be

DISCORD BOT

Using a Discord bot for ban appeals is another option. Users will have to send a
direct message to the bot to create a ban appeal.

Pros

 * Everything is kept on Discord, no need to go to another platform for
   appealing
 * Most moderators are actively checking Discord more than they are their
   mailbox. This can result in a more expedited appeals process.

Cons

 * The bot could have an outage or downtime, which would result in losing ban
   appeals or not receiving them
 * There is a chance the user won’t remember the command to appeal or forgets
   how to appeal via the bot
 * The user’s privacy settings may prevent them from being able to DM the bot
 * Users will have to share at least one mutual server with the bot in order to
   initiate the DM, which means they will either need to invite the bot to their
   own server in advance or join a separate server with the bot.

DISCORD SERVER

You can have a separate Discord server used specifically for appeals (either
through a modmail bot or plain chat). You can combine this with the bot option
above, which helps avoid situations where privacy settings prevent users from
sending a direct message to the bot.

Pros

 * Everything is kept on Discord, no need to go to another platform for
   appealing
 * The appeal process is simple, and identities can’t be faked since users
   appeal from their own Discord accounts
 * In case you don’t want some users to appeal/re-appeal you can block their
   ability to appeal by banning them from the server

Cons

 * Banned users can spam the server

WEBSITE

If your community already uses a website, it is a good idea to integrate your
appeal process into it. The appeal form can be enhanced via the Discord API by
requiring the user to log in to their Discord account. Another possible
enhancement is using a webhook to submit ban appeals, so that moderators can
see all incoming appeals directly within Discord.
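The webhook enhancement might look like the sketch below, which relays a
submitted form into a moderator channel. The form field names are assumptions
invented for illustration; the payload shape (a JSON body with a `content`
string, capped at 2000 characters) follows Discord's standard webhook API.

```python
import json
import urllib.request

def build_appeal_message(form: dict) -> dict:
    """Format a submitted ban appeal as a Discord webhook payload."""
    content = (
        "**New ban appeal** from {username} ({user_id})\n"
        "Ban reason: {ban_reason}\n"
        "Appeal: {appeal_text}"
    ).format(**form)
    # Discord webhooks accept a JSON body with a "content" string,
    # limited to 2000 characters.
    return {"content": content[:2000]}

def post_appeal(webhook_url: str, form: dict) -> None:
    """POST the payload to the webhook URL so moderators see it in-channel."""
    data = json.dumps(build_appeal_message(form)).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=data, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)
```

The webhook URL should be kept secret on the server side of your website, since
anyone holding it can post into the channel.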

Pros

 * Users can easily fill out an appeal by answering questions in a form
 * The appeal submissions are integrated with Discord
 * Since users must log in to Discord in order to submit an appeal, there is no
   chance that an appeal could be faked or illegitimate.

Cons

 * This system requires technical expertise to implement and might cost money
   to keep up and running.
 * Normalizing the use of the form may be difficult, as this is an atypical
   method of handling ban appeals
 * The website may be affected by outages
 * The user won’t get feedback on their ban appeal without a moderator reaching
   out

ONLINE FORM

For smaller servers, an online survey form (such as Google Forms) is easy and
effective. However, it may not scale well, depending on the form platform
you're using.

Pros

 * Filling out a survey is easy and self-explanatory
 * Making changes to the form to suit your needs is also easy

Cons

 * If the form requires an attached email address to answer, you are requiring
   users to share their private email with the staff team
 * There is no way to let the user know about the status of their appeal
 * You can’t verify whether the person appealing via the form is the person who
   is banned from your server


HANDLING APPEALS

When your server receives a ban appeal, there are a lot of things to consider.
This non-exhaustive list can be used as an initial checklist for your mod team
when evaluating appeals.

 * Check if the content of the appeal is correct. First, you want to check the
   content of the appeal. Are all the questions completely answered with valid
   responses? Are there any obvious troll messages or responses? Does the user
   understand what action they are appealing, and why it was wrong?

 * Read their appeal thoroughly. Carefully read their appeal and identify all
   the details. Review the logs or history of the user in the server to find the
   context in which the moderation action against the user happened. You can
   also use the context and history to ascertain if they were truthful in their
   appeal or not. Have a look at their history: has this user been punished
   before, especially in regards to the action that got them banned? The type
   and frequency of their misbehavior will help you decide whether or not their
   ban reason is a chronic problem, or a one time mistake. This will make the
   process easier. If the user lies in their appeal, jump to step 6.

 * Look at the punishment reason. The severity of the infraction(s) should be
   considered when evaluating an appeal. For example, being a minor disruptive
   nuisance is a very different offense to doxxing someone and threatening to
   murder them.

 * Was the punishment executed correctly? The next step is going back in time,
   finding the moment of the ban and auditing it. Did your staff member punish
   this user correctly, or were they being too strict? If you're not sure,
   contact the staff member who applied the ban and discuss it with them. This
   is where keeping good logs of moderation actions really comes in handy!

 * What are the risks or benefits associated with unbanning the user? On one
   hand, the user could be demonstrating a concerted effort to change and they
   may be willing to become a positive force in the community. On the other
   hand, the user could be appealing just to continue their bad behavior once
   unbanned; and because that behavior appears to have been forgiven, other
   users may conclude that the offense was less severe than it actually was.

 * Discuss it with the moderators: Consider the results of answering the
   previous questions and decide on a final course of action as a mod team. If
   you reach an impasse, use a poll or defer to more senior staff to formally
   settle the decision.


CONCLUSION

In this article, we discuss what appeals are, methods of adding an appeal system
to your server, and some information to consider when handling an appeal. Ban
appeals are an important part of your server’s growth and should be treated with
thought and care. As each server is different from the next, it’s recommended
that you find what works best for your server. The key points to
remember and consider about this article are:

 * Appeal systems differ from server to server; find out which system would be
   the most suitable for yours. Check the pros and cons, and how each impacts
   your server.
 * How should you handle incoming appeals? What are the things you should look
   at and consider before deciding?
 * The decision on an appeal requires careful deliberation. Your decision can
   have an impact on not only the user in question, but your server and other
   members as well.


BEST PRACTICES FOR REPORTING TOOLS

Safety and moderation

As a community grows, there will be a need to allow members of your community to
report bad behavior and violations of the rules to your moderation team. There
are a wide range of options to choose from, which will be covered in this
article. The options are not mutually exclusive; you might want to combine
multiple reporting methods to best suit your needs.


IMPORTANCE OF REPORTING TOOLS

Every community can benefit from having at least one type of reporting tool in
place. A reporting tool helps keep your community a safer place for your
members, encourages member involvement in your moderation process, and allows
for a better understanding of the moderation challenges your community faces.

Having a predefined method for users to contact the moderators will also
improve your workflow, ensure more consistent punishments, and provide a fair
process for handling those who violate your community rules.


BENEFITS AND CHALLENGES WITH DIFFERENT REPORTING TOOLS

Different reporting options have varying benefits and challenges. You will need
to find the right balance between the privacy of users, usability for members
and moderators, as well as deciding whether or not you want conversation to be
possible between your moderators and the reporters. The best option for your
server will depend entirely upon your needs. Not all communities will benefit
from having reporting channels and you might not want your moderators to handle
issues in direct messages. Below you will find a list of the most commonly
used reporting options, with their respective benefits and drawbacks.

Modmail Bots

Modmail bots can be used for members of your community to report violations to
the moderation team as a whole. There are several options to choose from, each
with its own approach to support tickets. A Modmail bot lets you hold private
conversations with reporters, which most other reporting tools do not allow.
Modmail bots also offer a lot of customization, at the cost of being more
time-consuming and difficult to set up. Additional benefits include the
ability to attach screenshots to a report and to maintain privacy for both the
mod handling the case and potentially even the user making the report, which
some of the other options do not offer.

Reporting Channels

Some servers utilize a specific channel dedicated to reporting violations of the
rules. These can take different shapes depending on the use case. Approaches
range from only allowing members to copy and paste message URLs of
rule-breaking messages, to a more open, discussion-based channel that serves
as a public-facing source of reports. One thing is certain across all
implementations of reporting channels: there should be a way to communicate with
people that a report has been handled by either removing a message in the
channel or reacting to it publicly. Reporting channels are very easy to set up
as they use native Discord features. However, these channels can be limiting in
functionality. Conversations are barely possible; depending on the
permissions, members might not be able to upload screenshots; and reports are
not private, which can deter some people from sharing concerns out of
intimidation or fear of retaliation.

Reporting via Direct Messages

Another option is to handle reports via direct messages. In this case, users
simply send a message to any available moderator. This method allows for private
reports, back-and-forth conversations with a member of the moderation team, and
the ability to share screenshots. With this method, however, users have to
rely on the availability of a single moderator, as they choose whom to report
to. Furthermore, it is very hard to track moderator activity, so if that is
important to you, this method should be avoided. Not all moderators will
follow up with users, and there is limited accountability.

Report Commands

Some bots allow members to report others for breaking the rules. Reports often
follow the format of tagging the person being reported and providing a reason.
This information is then directed to the moderation team via a private channel.
While the report itself is private, members still have to use the command (and
ping the person they are reporting) in public channels. This method also does
not allow conversations to take place, and uploading screenshots is not
possible.

Pinging Moderator Roles

Sometimes you may simply want to allow members to ping a moderator role in
response to public rule-breaking behavior. This will almost certainly get the
immediate attention of your moderators and is often the quickest reporting
method to use. However, the user being reported will be notified, as they can
see the ping, and they might be able to remove their messages before a
moderator can intervene. To help counteract this, consider utilizing a
moderation bot that logs message deletions and edits.

Logging and Flagging Messages

While many popular bots offer auto-moderation features that include word,
spam, and advertisement filters, these can often also be used to silently
inform moderators. Instead of removing a message automatically and issuing a
punishment based on predefined rules, you can create a flagging mechanism that
takes no automatic action. If a word or phrase on the flag-list is used, the
bot notifies moderators to look into it and decide the best course of action,
rather than acting automatically the way a blacklist would. This lets your
moderators review a channel of flagged messages, issue punishments based on
the context of each flagged message, and avoid the automatic moderation
techniques that can lead to over-moderation.
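As a rough sketch, the flag-list idea boils down to a simple whole-word check. The function name and flag terms below are illustrative, not taken from any particular bot:

```python
import re

# Hypothetical flag-list; a real list would be tuned to your community.
FLAG_LIST = {"scam", "giveaway", "crypto"}

def find_flagged_terms(message: str, flag_list: set = FLAG_LIST) -> list:
    """Return every flag-list term that appears as a whole word in the message."""
    words = set(re.findall(r"[a-z0-9']+", message.lower()))
    return sorted(flag_list & words)
```

A bot built this way would post any matches to a private staff channel for review instead of deleting the message, leaving the final decision to a human.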


CHOOSING WHAT WORKS BEST FOR YOU

Which method, or combination of methods, works best for you depends a lot on
your server’s size and needs. For smaller servers of up to five thousand
members, letting users ping moderators or having a report channel should be
easily workable, depending on how big your team is. For large servers of fifty
thousand members or more, you should look into using a Modmail bot. You might
also want a combination of multiple methods, depending on what works for you
and the capabilities of your team.

Consider whether moderators should be able to communicate with the reporter
about their report; not all methods offer this functionality. Also ask
yourself whether the privacy of someone violating the rules is important to
you, as not all methods keep reports private.

If you are using a leveling or experience system in your server, you might
want to offer different reporting tools based on a member’s level. For
example, you might implement a command that pings moderators, but make it
available only to members of a certain level to prevent pings over minor
moderation issues or trolls abusing the pinging power. A ping-based reporting
method is not advised in larger servers due to how easily it can be abused.

Do you want to track the activity of your moderators? This is not possible
with some methods. And if your rules forbid advertising via direct messages
and moderators are expected to act when made aware of such a violation, you
should choose a reporting method that allows screenshots to be shared.


SUMMARY

Every community can benefit from having different reporting tools in place as
this will help you keep your community a safe space for your members. Different
reporting options have different benefits and challenges and you should take the
time to analyze what option may best fit the needs of your community and
moderation staff.

There are six distinct reporting methods available via Discord’s native
capabilities and bots:

 * Reporting via direct messages
 * Utilization of a Modmail bot
 * Creating a report channel with rules specific to your needs
 * Using a report command that will ping reported members, but be shared
   privately with a moderation team
 * Pinging a moderation team with an emergency team-wide ping
 * Message flagging via an automated moderation bot

Your server’s size will often dictate the best combination of the above
options from a moderation standpoint, but you should also keep in mind your
community’s preferences, as your users have to be comfortable using your
chosen report methods.


KEEPING YOUR SERVER & COMMUNITY SAFE

Manage


TOP TIPS FOR WRITING YOUR OWN RULES

Rules don’t just help encourage positive behavior, they also prevent negativity
from spreading as your community grows.

 * Simplify Your Rules: Keep your language simple, clear, and direct.
 * Don't Get Too Specific: The further into the weeds you get, the more people
   can try to find ways to act around your rules while still technically
   “following” them.
 * Emphasize the Spirit: Sometimes a user won’t break a written rule, but will
   break the spirit of your rules. It’s important to use your guidelines to
   promote the right kind of behavior too.
 * Know When to Adapt: If changing your rules will make your community a happier
   and more supportive place, do it!

BE FLEXIBLE AND UNAFRAID TO ENFORCE THE RULES

Once you’ve written your rules, it’s important to enforce them consistently.

While it may feel uncomfortable at first, enforcing your rules tells your
community, “we’re here, we care about you, and we’re going to keep this
community safe.”

Before you grab your ban hammer, try to remember the human side of community
moderation. Being able to talk to users, de-escalate arguments, and give second
chances before escalating things is key to a healthy community.

Remember, people make mistakes, and a good community can help change people for
the better too.

WHEN TO ESCALATE

If you’ve tried working with a disruptive user directly and had no luck, it’s
time to escalate things. You have a few key tools available in this situation:
Time Out, Kick, or Ban.

 * Time Out: This will temporarily prevent a user from interacting in a
   server—this includes sending or reacting to messages and joining voice
   channels or video calls.‍
 * Kick: This will remove a member from your server but leave them with the
   ability to rejoin at a later date. They won’t be notified of the kick.‍
 * Ban: This removes a member from your server and stops them from ever joining
   again.

To action a member on your server...


 * Go to Server Settings and find Members listed under User Management. 
 * Click on the three dots next to a user.
 * Choose any of the three options: Time Out, kick, or ban.

Or you can right-click on the member you want to ban from your server!

WHAT KIND OF CONTENT SHOULD I BAN FOR?

Unfortunately, there’s no hard-and-fast rule here; unacceptable conduct or
content will vary for every server.

However, if there are people constantly posting content that directly breaks
your server rules or Discord’s Terms of Service, it may be time to remove them
from your community. For more information, check out our Community Guidelines.

Whether through an escalation process or an immediate ban for extreme
rule-breaking, content that crosses a line should be moderated and, if
necessary, its poster removed. You may also want to report this behavior to
our Trust and Safety team so we can take further action.

“What can I do to deal with users who are not technically breaking rules, but
are problematic?”

Again, this is where it’s important to be human and engage with the user to try
and stop the behavior before you need to ban or kick them.

 * Lead by Example: Encourage your team to act in a way that is positive for the
   community’s health; others will follow.‍
 * Enable Member Reporting: Make sure your community feels they can come to you
   for help.‍
 * Practice De-Escalation: Foster a positive environment that allows for second
   chances and reform in your community. The Time Out feature is a great tool to
   help.


KEEPING YOUR COMMUNITY SAFE

There is no one-size-fits-all solution to safety, but an admin should know a few
basics to help their moderators.

 * Use the tools available to you from Discord. Enabling Community allows you to
   take advantage of automated moderation tools like AutoMod.‍
 * Implement the basics and evaluate if you need support from bots. Head to
   Securing Your Server to cover the basics and explore bot recommendations from
   the Discord Moderator Academy.
 * Take the time to educate everybody on the rules and etiquette of the server.
   Having these shared publicly will help align members as they join. 
 * Make sure your inner circle can submit reports to Trust & Safety. All
   moderators and admins should be equipped with Developer Mode (Settings >
   Appearance > Developer Mode) so they can right-click users and messages and
   collect their IDs.

THIS IS YOUR PARTY

Ultimately, nobody knows your community better than you, so trust your gut. If
something feels off to you, chances are your community is feeling it too.

As long as you’re consistently putting the safety and happiness of your
community first, you’re doing it right.

IMPORTANT RESOURCES

If you or your moderators do need to report a user or any unsafe behavior, here
are some helpful links you’ll need to submit a ticket to our Trust & Safety
teams.

Top Tip: Collect user IDs before issuing bans that delete messages, as IDs
cannot be retrieved once the messages are deleted.

 * Report Harassment and Harm
 * Report General Issues
 * How to Report to Discord
 * Discord’s Advice on Moderation

With thanks to Deku ♡#1313 for all their help in writing this article.


IDENTIFYING AND ASSIGNING SERVER MODERATORS

Manage


GETTING YOUR TEAM TOGETHER

Sooner or later, we all need help managing things. As your community grows,
you’ll find yourself looking for help moderating and keeping your community
healthy.

HOW DO I KNOW WHEN I NEED MODERATORS?

Typically, you’ll know you need to bring in moderators when enforcing server
rules becomes difficult to manage on your own.

If chat is crazy active or going at all hours of the night, great moderators can
organize chaos and enforce guidelines to give you more time to manage your
community strategy.

FINDING THE RIGHT MODS FOR THE JOB

Finding mods who are a good fit for your community can feel daunting, so start
by asking yourself what you want a candidate to bring to your community. Then,
you can work out what their responsibilities might involve from there.

Ideally, you should find people who are friendly, approachable, collaborative,
receptive to feedback, and good at on-the-fly thinking.

They should also be able to embody and enforce the values of your community
while remaining an active part of it. Consider a current and active member of
your server—this ensures they’re already familiar with your community!

Think of the users on your server who are responsible enough to take on the job;
the people who have demonstrated that they have an active mind for the safety
and well-being of your community and the people in it.

Think about more personal considerations. Do they get along well with other
members of the server? Are they considerate? Would you want to collaborate and
work with this person long-term?

The three boxes you’re aiming to check off are that the potential candidates
are: active, responsible, and friendly. Once you’ve found some candidates, it’s
time to work out how they can help your community…

THE ROLE OF A MODERATOR

Choosing what your new moderators do in your community really depends on your
community's needs and any problems your server may be running into. 

However, here are some common moderator responsibilities you may want them to
handle...

 * Delete any unwanted or inappropriate messages and report these to Discord
   when they are against our Community Guidelines or Terms of Service.
 * Kick members or allocate server timeouts.
 * Hand out warnings to any members acting against the rules or guidelines.
 * Support with responding to questions or queries about your community.
 * Ban spam bots that have appeared in the server.

CREATING A MODERATOR ROLE

Once you’ve selected and spoken to the people you think are the right fit for
the job, it’s time to create an actual server role and make sure they have
everything they’ll need to keep your community safe.

You can even call this role something other than Moderator (perhaps there’s an
inside joke or word that matches your server’s theme), but just make sure the
role is clearly identifiable as someone to go to for help.

Critically, your Moderator role does not need to include every permission. As an
admin, you should be aware of the dangers of giving new mods every server
administration power under the sun, even if you trust them.

Here is a list of permissions that moderators should have. If you’re doing an
additional trial period with mod candidates (which is a great idea), don’t be
afraid to roll some of these out only after the candidates have become full
mods.

Day-To-Day Permissions

 * Manage Messages
 * Kick Members
 * Time Out Members
 * Ban Members: Some people hold off on this, so if you’re doing a trial period,
   feel free to wait until they’re full mods before granting it.
 * Manage Roles: Even if you have a bot to manage muting people, make sure your
   mods can also do it manually if the bot doesn’t work.
 * Mute/Deafen Members: Grant this even if you don’t have voice channels. You
   might need it later on!

With that being said, here is a list of permissions your moderators shouldn’t
typically need. For all of these, ask yourself when a moderator needs these
permissions, why, and for how long.

As Needed Permissions

 * Mention @everyone or @here: Unless there is a specific reason mods need to do
   this, use your own judgment!
 * Manage Channels
 * Manage Webhooks
 * Manage Server
 * Administrator Permissions: This is every permission in one! Avoid giving
   this out, period!

Once your moderators are assigned and set up with the relevant roles and
permissions, it’s time to introduce them to your community. Set up a casual
voice chat or announce the new moderators in the server; visibility is a great
way to let the community welcome new moderators and adjust to their role.

Finally, make sure all admins and moderators secure their accounts by setting up
2FA!

For more in-depth information on the deeper aspects of selection, mod
responsibilities, and how to build strong moderation teams, check out great
resources from the Discord Moderator Academy like...

 * ‍Recruiting Moderators
 * Managing a Team of Moderators
 * How to Avoid Moderator Burnout

With thanks to The Tax Collector Man for all their help in writing this article.


MODERATION & COMMUNITY SUPPORT TO MANAGE YOUR SERVER

Manage


TOOLS FOR AUTOMATING SERVER MANAGEMENT

While Discord recommends being as hands-on and human as possible, there may come
a time when you want to bring in some additional support to help out. 

Here are a few ways you can scale your community and automate manual processes.

DESIGN YOUR SERVER FOR SAFER USE 

 * Turn on Explicit Media Content Filters: With a filter enabled, Discord will
   automatically scan and delete any media posted in the server that contains
   explicit content. You can choose to scan nothing, everything, or only
   content from members without a role. Don’t delay, set a filter today,
   especially if your server is public!
 * Set up Verification Levels: In your Server Settings, you’ll find the option
   to choose from a series of Verification levels. From None to Highest, you
   can set up gates to prevent spam and require members to verify their
   accounts before they can interact in your server.

IMPLEMENT YOUR AUTOMOD

Designed to assist your hardworking moderators, Discord’s AutoMod feature helps
keep your server clean and safe by automatically filtering out harmful or
undesirable messages around the clock.

Choose from filters made by Discord or customize your own with lists of words or
phrases and the actions AutoMod will take when they’re found in your server.
This allows you to…

 * Keep Community Conversations Welcoming: AutoMod detects and acts on messages
   containing harmful words or phrases before they’re ever posted, reducing
   immediate harm and exposure for your members and moderators.‍
 * Enjoy Peace of Mind: AutoMod alerts your team when members try to use words
   from your list. Assign actions like Timeout directly from the alert to
   defuse difficult situations and reduce the need for your moderators to be
   everywhere at once.
 * Allow for Nuance: No one can fully replicate a moderator’s eye for context,
   but Wildcards get close. Wildcards allow AutoMod to detect variations that
   partially match listed words or phrases and flag when members try to
   circumvent filters.
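As an illustration of the general idea behind wildcard matching (a sketch of the technique, not Discord’s actual implementation), a leading and trailing `*` flags any word that merely contains a listed term:

```python
from fnmatch import fnmatch

def matches_wildcard(word: str, patterns: list) -> bool:
    """True if the word matches any shell-style wildcard pattern, case-insensitively."""
    return any(fnmatch(word.lower(), pattern.lower()) for pattern in patterns)

# A pattern like "*nitro*" catches embedded variations such as "freenitro"
# that an exact-match word list would miss.
```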

Explore our Help Center article to set up your AutoMod today.

Set Up Moderation Bots

Some servers also use moderation bots to handle things like banning keywords or
handing out warnings to offending members.

A well-configured Discord moderation bot can be set up with different rules for
different channels and automatically take action should any users violate those
rules. You can also set them up to provide reports and keep your admins or mods
up to date on any actions it takes.

By setting up some levels of auto-moderation, you and your human mods can
concentrate on more significant issues while the bot is busy taking on smaller
things. But setting up auto-moderation is a balancing act.

What Are the Pros of Automating Moderation?

 * React Quickly: When your community is still in its early stages, handling
   moderation is easier. However, as your community grows, you might find that
   you need help staying on top of everything.‍
 * Protect Your Time: You don’t want to be online 24/7 to watch every message
   for signs of trouble. Setting up automation can handle or streamline harmful
   situations to prevent burnout. ‍
 * Fair Judgment: To err is human, after all. As you create cool content for
   your community, consider assigning more tasks to your auto-moderators to
   reduce the chance of biased decision-making.

Finally, remember to regulate the consequences. As your community grows, things
may get a little more complex. For example, if someone posts irrelevant content
in a channel just once, they may just need a quick reminder of the channel’s
topics. But if they’re repeatedly ignoring the channel topics and spamming with
inappropriate content, more severe consequences may be needed. Auto-moderation
can keep an eye on repeat offenders and act accordingly.
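The escalating-consequences idea above can be sketched as a simple offense counter. The thresholds and action names here are illustrative, not part of any specific moderation bot:

```python
from collections import Counter

# Hypothetical escalation ladder: 1st offense gets a reminder, 3rd a timeout,
# 5th a kick. Tune these to your own community's rules.
ESCALATION = [(1, "reminder"), (3, "timeout"), (5, "kick")]

offense_counts = Counter()  # user id -> number of recorded offenses

def next_action(user_id: int) -> str:
    """Record one offense for the user and return the consequence to apply."""
    offense_counts[user_id] += 1
    count = offense_counts[user_id]
    action = "reminder"
    for threshold, name in ESCALATION:
        if count >= threshold:  # keep the highest threshold reached
            action = name
    return action
```

A real bot would persist these counts and let old offenses expire, so a single mistake doesn’t follow a member forever.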

It’s important that all types of moderation and support remain as fair and
consistent as possible. Remember to hold everyone in your community to the same
standard, regardless of who they are or their history.

Important Reminders

 * Not every violation will have a simple solution. Situations may require some
   more in-depth people management or further escalation, so it’s recommended to
   leave context-based decisions to a human rather than a bot. 
 * Don’t give bots or humans more permissions than are absolutely necessary.

For more on bot configuration and best practices on moderation, head to the
Discord Moderation Academy, and check out this article.


SCALING COMMUNITY SUPPORT

IS IT TIME TO SCALE YOUR COMMUNITY SUPPORT?

It’s essential to build a community where members feel heard and supported.
Although being on hand to help is a cornerstone of a healthy community, that can
get harder as your server starts to grow.

That’s when it’s time to scale your support.

Ways to Scale Your Server Support 

 * Introduce FAQs: As your community becomes more established, you won’t be able
   to answer every question individually. Instead, make key information
   accessible by turning a running list of frequently asked questions into an
   #FAQs-channel. Keep the channel read-only and direct members there before
   they ask a question.‍
 * Create Dedicated Spaces for Asking Questions: A support channel will keep
   questions out of the general chat and make it easier for you or anyone else
   to answer any questions.‍
 * Be Transparent: Let your community know when you’ll be online by pinning
   office hours to your support channel.‍
 * Peer-to-Peer: You’re only one person and can only do so much, so where
   possible, encourage an atmosphere in which all members can support each
   other. If some community members are already doing this, reward them with a
   unique role. Others will know who to look to for support and be inspired to
   do the same.

It’s important to find the right balance of humanity and auto-moderation. Build
for safety and use automation as a tool to let you focus your energy on other
important aspects of managing your community. 

With thanks to LogoCat for all their help in writing this article.


YOUR RESPONSIBILITIES AS A DISCORD MODERATOR



You may already know by now that you can report content to Discord's Trust &
Safety team if it breaks Discord's Community Guidelines or Terms of Service.
However, while moderating a server you may come across situations where you are
unsure of whether you should make a report or not. This article will give you a
good idea of what sorts of things are reportable, and what things are not worth
reporting.


HERE ARE SOME RULES FOR INTERACTING WITH OTHERS ON DISCORD:

Do not organize, participate in, or encourage harassment of others.
Disagreements happen and are normal, but continuous, repetitive, or severe
negative comments may cross the line into harassment and are not okay.

 * Disagreements, insults, and other rude or disrespectful behavior are common,
   and should generally be handled in accordance with your own server rules.
 * Do not organize, promote, or coordinate servers around hate speech. It’s
   unacceptable to attack a person or a community based on attributes such as
   their race, ethnicity, national origin, sex, gender, sexual orientation,
   religious affiliation, or disabilities.
 * If a user is participating in any of these actions aimed at another person or
   community, it is usually reportable.
 * Do not make threats of violence or threaten to harm others. This includes
   indirect threats, as well as sharing or threatening to share someone’s
   private personal information (also known as doxxing).
 * Do not evade user blocks or server bans. Do not send unwanted, repeated
   friend requests or messages, especially after they’ve made it clear they
   don’t want to talk to you anymore.
 * If a user is evading blocks or bans for the purpose of repeating their
   previous bad behavior, they should be reported.
 * Do not send others viruses or malware, attempt to phish others, or hack or
   DDoS them.
 * Viruses, malware, and phishing attacks are reportable as this content is
   directly against Discord’s Community Guidelines.


HERE ARE SOME RULES FOR CONTENT ON DISCORD:

You must apply the NSFW label to channels if there is adult content in that
channel. Any content that cannot be placed in an age-gated channel, such as
avatars, server banners, and invite splashes, may not contain adult content.

 * Isolated incidents where NSFW content is posted outside of an NSFW channel
   should be taken care of on a server level. If a server regularly fails to
   remove NSFW content posted outside of NSFW channels, the server may be
   reported for this behavior. 
 * You may not sexualize minors in any way. This includes sharing content or
   links which depict minors in a pornographic, sexually suggestive, or violent
   manner, and includes illustrated or digitally altered pornography that
   depicts minors (such as lolicon, shotacon, or cub). We report illegal content
   to the National Center for Missing and Exploited Children.
 * Content with minors being sexualized is taken very seriously and should be
   reported and removed as soon as possible. 
 * You may not share sexually explicit content of other people without their
   consent, or share or promote sharing of non-consensual intimate imagery (also
   known as revenge porn) in an attempt to shame or degrade someone.
 * You may not share content that glorifies or promotes suicide or self-harm.
 * Small, one-time offenses, or users who talk about self-harm in reference to
   themselves should be handled on a server level. Users who continue to promote
   self-harm either for themselves or others should be reported.  Users who are
   in physical danger due to a potential suicide attempt or self-harm should be
   reported as soon as possible. You should consider calling law enforcement if
   you believe the user may be in imminent danger to themselves or others.
 * You may not share images of sadistic gore or animal cruelty.
 * You may not use Discord for the organization, promotion, or support of
   violent extremism.
 * You may not operate a server that sells or facilitates the sales of
   prohibited or potentially dangerous goods.
 * You may not promote, distribute, or provide access to content involving the
   hacking, cracking, or distribution of pirated software or stolen accounts. 
 * In general, you should not promote, encourage or engage in any illegal
   behavior. This is very likely to get you kicked off Discord, and may get you
   reported to law enforcement.


FINALLY, WE ASK THAT YOU RESPECT DISCORD ITSELF:

 * You may not sell your account or your server. Users who want to buy or sell
   an account or server should be reported. Try to be sure they are serious
   about it before reporting.
 * You may not use self-bots or user-bots to access Discord. Users who use a
   self-bot or user-bot in a malicious way should be reported. Self-botting in
   this case includes users setting up a bot or automations that allow them to
   perform actions on Discord faster than any human is physically capable of.
 * You may not share content that violates anyone's intellectual property or
   other rights. In general, this guideline should be handled by the users
   involved, usually by means of a DMCA request. Instructions on properly filing
   a DMCA takedown request can be found in Discord’s Terms of Service, and must
   be initiated by the rights-holder of the intellectual property or an
   authorized representative. You may handle these situations on a server level
   if you wish.
 * You may not spam Discord, especially our Customer Support and Trust & Safety
   teams. Making false and malicious reports, sending multiple reports about the
   same issue, or asking a group of users to all report the same content may
   lead to action being taken on your account.
 * If you have evidence of users participating in this behavior, you should
   report it. Users impersonating Discord employees may also fall into this
   category. If you need to follow up on a report you have submitted, reply to
   the existing support ticket to bump it rather than creating a new one, as
   duplicate tickets may violate this community guideline.


CONCLUSION

This is not an exhaustive list of the situations you may encounter as a
moderator, nor is it a guide for what rules you should enforce on your server,
as the platform is ever-evolving. You may enforce extra rules as you see fit in
order to cultivate the type of community you want for your server.


MODERATOR ETIQUETTE FOR YOUR DISCORD SERVER



Discord Moderators exist on every Discord server in different forms. With the
constant growth of Discord as a whole (and as a direct result, servers on
Discord), moderators’ roles have grown in importance and necessity in the
pursuit of keeping chat flowing and providing users with an enjoyable but safe
environment on their server. But what does it really mean to “moderate” a
server?


WHAT IT MEANS TO BE A DISCORD MODERATOR

A community thrives in a healthy, comfortable and safe environment - and that's
where moderation comes into play. Being a moderator means more than just being
hoisted at the top of the server and having a fancy color. It is your
responsibility to prevent and resolve conflicts between users, ensure the server
is safe and free from potential harm, and set an example for the rest of the
community to the best of your abilities.

While bigger servers may be less lenient when it comes to second chances,
banning people over minor infractions is not always the best approach. Your
responsibility is not just to execute punishments, but also to weigh the
severity of the infraction. Moderating larger or more active communities can be
overwhelming at first, so don’t be afraid to ask fellow moderators for help, and
take their advice on improving your methods of moderation. While moderating,
always be friendly and ready to help users in public. Find a balance between
enforcing the rules and fostering a healthy relationship with users.
“Aggressive” moderators tend to intimidate or scare off new users, which will
harm your community.


BEHAVIOR IN THE SERVERS YOU MODERATE

Having multiple permissions, a special role color meant to differentiate you,
and power over users doesn’t mean you are freed from the server’s rules or can
act above them. Instead, you should encourage users to abide by the rules at all
times while still being able to enjoy their stay on the server. Showing that
you’re dedicated to helping the server grow and being a fair and trustworthy
figure for the community goes a long way toward overall server morale. A key
part of achieving this is to ensure you don’t do anything that would get normal
users in trouble, e.g. flooding emojis or repeatedly writing in capital letters
only.

Moderators are seen as role models for the server. That means that you should
act maturely and set a good example for the community. This includes, but is
not limited to, avoiding obscene behavior in both your messages and your
Discord profile: your picture, username, status, and linked socials are
globally visible. Running a Discord server means members of different
nationalities and backgrounds will engage in the community. Furthermore, if a
dispute arises on the server, focus on the logic of the argument rather than
the individuals involved. Fair and equal treatment for everyone should be the
standard for moderation actions.

Nobody is perfect and nobody expects you to be perfect. However, when you find
yourself in such an influential position, you need to have an open mind and
learn how to accept constructive criticism. Making mistakes is understandable as
long as you take responsibility for your actions and learn from them.


ENGAGING WITH USERS 

User engagement and activity is one of the essential aspects of running a
Discord server successfully. In terms of smoothly conversing with a user, it is
recommended to take the following points into account:

USERS NEW TO DISCORD

You should never assume that everyone knows how Discord works, or how your
server operates. If you come across a user who has multiple, seemingly
“ridiculous” questions, don’t immediately assume they are a troll. There are
many ways to get confused by things that may seem natural to superusers of the
platform. Take your time to explain certain parts or functions of both Discord
and the server you’re moderating while keeping a friendly and inviting tone.

COMMUNICATION

Online communication cannot accurately convey our usual means of expression:
facial cues, emotions, physical gestures, and vocal tone. This makes it
easy for others to misinterpret your messages, and for you to misunderstand
theirs. When reading and analyzing a message without those real-life factors, it
often happens that our own emotions fill in those blanks and that
misunderstanding encourages us to act in the heat of the moment. Before acting
on negative emotions, give the other user the benefit of the doubt and calmly
ask them for clarification on parts of their messages that upset you without
being accusatory.

When sending messages, there are many ways to convey what you really want to
express including emojis or other special symbols such as tonal indicators.
Always make sure to read through your messages as a moderator and think “how
could that be misunderstood in a way that it upsets someone?”, and adjust based
upon that thought process.

Furthermore, you may encounter users whose first language isn’t your server’s
primary language. Even though you may have a rule in your server that asks users
to speak in the designated language only, it’s usually not productive to
immediately discipline someone for violating that rule, especially if it’s their
first post in the server. Instead, consider starting with a verbal warning and a
reminder to stick to the server’s language rules.

USER PRIVACY

It may happen that a friend of yours joins the server or you become close
friends with fellow members or moderators on the server. Never post personal
information about another server user without their explicit permission, even if
it is meant in a joking way. Let everyone explore or open up to the server and
its community at their own pace. Only after that, with their consent, should you
mention them by their name in public.


TEAMWORK IS KEY

Teamwork makes the dream work and it is important to maintain a healthy,
communicative, and respectful relationship with your moderation team to ensure
easy moderation of your community.

When dealing with moderation issues, seeking help from fellow staff members is
often the best course of action. Getting another person’s opinion on a topic
may help you see things from a different angle, or reinforce your judgement.
Taking everyone’s perspective into account can help you master even the most
difficult problems, and letting other people know about your concerns takes
weight off your shoulders. You are part of a team, and do not have to act
alone.

Another consideration when it comes to public appearance is respect for your
fellow moderators. A successful staff team flourishes when all of its members
work together. You are not expected to become the best of friends with every
single staff member, but one thing you should never do is talk badly about
fellow staff in public or do things you know will provoke a bad reaction on
their part. If you have an issue with them, address it in private, or get your
team’s upper management involved if there are more severe issues going on.


Taking up the moderation mantle can be a very fulfilling duty. Being in such a
position means you can help people in ways others cannot, and even the smallest
“thank you” or a nice gesture of appreciation can brighten up your day.


PERMISSIONS ON DISCORD



Once you become a moderator, it’s important to know what tools are at your
disposal to help manage your Discord server. While there are many bots that
cover both manual and automatic features of moderation, understanding how these
bots work will be difficult without knowing how Discord’s native moderation
features function. Due to the discrepancies in functionality between the mobile
and desktop clients, throughout this article, the instructions for navigating
Discord will be in terms of the desktop client. Before reading this article, it
may be useful to learn more about role and channel permissions here.


TURNING ON DEVELOPER MODE

The first thing you should do is turn on developer mode for Discord. This will
allow you to copy user, channel, and server IDs which are extremely helpful for
moderation, reporting issues to Discord, and dealing with Discord bots. Read
here for instructions on turning on developer mode and getting IDs.
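As an aside, these IDs are “snowflakes”: per Discord’s API documentation, the upper bits of an ID encode its creation time in milliseconds since the Discord epoch (the first second of 2015). A minimal sketch of decoding that timestamp (the function name is my own, not part of any Discord tooling):

```python
# Sketch: decoding the creation time from a Discord ID (a "snowflake").
# Discord documents that (id >> 22) gives milliseconds since the Discord
# epoch, 2015-01-01T00:00:00 UTC.
from datetime import datetime, timezone

DISCORD_EPOCH_MS = 1_420_070_400_000  # 2015-01-01T00:00:00Z in Unix milliseconds

def snowflake_created_at(snowflake: int) -> datetime:
    """Return the UTC creation time encoded in a Discord snowflake ID."""
    ms = (snowflake >> 22) + DISCORD_EPOCH_MS
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc)
```

This can be handy when triaging reports, for example to spot accounts that were created only minutes before joining your server.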


A NOTE ABOUT THE ADMINISTRATOR PERMISSION

The Administrator permission is a special Discord permission in that it grants
every other permission and allows users who have it to bypass all
channel-specific permissions. Because of this, granting it to any user or bot
should be done with the utmost caution and on an as-needed basis.

Because bots can automate actions on Discord, bots with this permission can
instantly delete all of your text channels, remove your emotes and roles, create
hundreds of roles with the Administrator permission and start assigning them to
your other users, and otherwise cause unmitigated havoc on your server faster
than you can even understand what is happening. While the chance of this
happening with larger or more renowned public bots is low, you should be mindful
that this is the sort of power you are giving to a Discord Bot if you grant it
the Administrator permission and only do so if you are confident the bot and its
development team can be trusted.

Before giving this permission to a user, consider if giving them a role that has
every other permission enabled will serve your purpose. This way you can at
least protect your channels via channel permissions. You may also find on
further consideration that the user in question does not even need every
permission, and will be fine with only a couple of elevated permissions. If you
do give Administrator to anyone, it is highly recommended to enable 2FA for your
server as described in the next section.


ADMINISTRATIVE ROLE PERMISSIONS

Discord has several role-specific permissions that grant what would be
considered “administrative functionality” to users (not to be confused with the
actual Administrator permission). These permissions are considered sensitive
enough that on a server where two-factor authentication (2FA) is required for
moderators, they are disabled for any moderator who has not enabled 2FA on
their account. You can read more about what 2FA is and how to enable it on your
account here. The permissions to which this applies are as follows:

 * Administrator
 * Manage Server
 * Manage Channels
 * Manage Roles
 * Manage Messages
 * Kick Members
 * Ban Members


CONTEXT MENUS

If you aren’t using a bot for server moderation, your moderation is going to be
done by using Discord’s context menus. How to access each menu and how its
options work will be discussed in detail below.

SERVER SETTINGS

The Server settings items allow you to configure the server as a whole, as
opposed to managing individual members. Note that depending on the exact
permissions you have as a moderator and whether or not your server has boosts or
is verified/partnered, not all options shown may be available to you.

On Desktop: Right-click on the server name and go to Server Settings

On Mobile: While viewing the server list, tap the server name and then Settings
in the bottom right.

The important menu items for you to know are the following:

 * Overview: Requires the Manage Server permission to view. From here you can
   change the server name and region, set an AFK voice channel, set up system
   messages, change default message notification settings for members, and
   change the server invite background (unlocked at boost level 1).
 * Roles: Requires the Manage Roles permission. From here you can create,
   delete, edit, and reorder roles that are lower than your highest assigned
   role. Note that you cannot toggle permissions that you do not have on any of
   your roles.
 * Emoji: Requires the Manage Emojis permission. From here, you can upload new
   emojis, delete current emojis, or edit the name of current emojis.
 * Moderation: Requires the Manage Server permission. From here, you can set the
   verification level of the server and configure the explicit media content
   filter. If you are the server owner, you can also enable the 2FA requirement
   for server moderators.
 * Audit Log: Requires the View Audit Log permission. This provides a log of
   moderation actions on the server. Once something is in the Audit Log, it
   cannot be deleted or edited, and as such, it is an excellent way to verify
   who has done what on the server and to troubleshoot any technical issues that
   may arise.
 * Integrations: Requires the Manage Webhooks permission. This allows you to
   manage all of the webhooks in your server, the channels that you are
   following, the bots in your server, and any platform-specific integrations
   you can manage such as Twitch or Youtube. Additional information about this
   page and the types of integration can be found here.
 * Members: Allows you to view the members of the Discord server and their
   roles. You can filter the member list by an assigned role as well.
   * If you have any permission that allows you to see the server options
     menu, you will be able to see this screen but will not be able to do
     anything but browse the members list.
   * If you have the Manage Roles permission, you can also add or remove roles
     from a user.
   * If you have the Kick Members permission, you can perform a server prune
     or remove individual members from the server. They can rejoin later with
     a valid invite link.
   * If you have the Ban Members permission, you can ban a member from the
     server. They will not be able to rejoin the server and their IP address
     will be blacklisted.
 * Invites: Requires the Manage Server permission. Allows you to view and
   delete active Discord invite links along with their creator, number of
   uses, and expiration time.

MEMBER OPTIONS

Member options allow you to manage individual server members. You can manage
them from the Members server option as noted previously or through the
following:

Desktop: Right-click on a user’s name anywhere in the server (online list,
mention, message, or their voice channel presence)

Mobile: Tap a user’s name anywhere in the server and then tap the “manage”
option. If you only want to kick or ban a user you can do so without tapping the
manage option. You can also copy their user ID by tapping the three dots in the
upper right instead.

The most important menu options for you to know are as follows:

 * Change Nickname: Requires the Change Nickname permission. This allows you to
   set a server-specific name for the user in question and is useful for making
   sure people’s names are appropriate.
 * Kick [user]: Requires the Kick Members permission. This removes the user
   from the server. They can rejoin with a valid invite link.
 * Ban [user]: Requires the Ban Members permission. This will allow you to
   select how much of the user’s message history to delete and to enter a
   reason for your ban. For record-keeping purposes, it is generally
   recommended not to delete any message history unless the user is spamming
   inappropriate messages or content, and it is also recommended to enter a
   ban reason.
 * Roles: Requires the Manage Roles permission. You can quickly add or remove
   roles from a user.

VOICE OPTIONS

These are accessed in a similar fashion to member options, but are only visible
while the user is in a voice channel.

The most important menu options are as follows:

 * Server Mute - Requires the Mute Members permission. This will prevent anyone
   in the server from being able to hear this user speak and lasts until someone
   else with the server mute permission unmutes them.
 * Server Deafen - Requires the Deafen Members permission. This will prevent the
   user from being able to hear anyone else speak. This state lasts until
   someone else with the server deafen permission undeafens them.
 * Disconnect - Requires the Move Members permission. This will forcibly remove
   the user from the voice channel. Keep in mind that they will be able to
   reconnect unless there is a channel permission on the voice channel that
   prevents them from doing so.
 * Move To - Requires the Move Members permission. You can move the member to a
   different voice channel from their current one even if they do not have
   permissions to connect to the target channel. They will be able to move to
   another voice channel unless there are permissions preventing them from doing
   so.

MESSAGE OPTIONS

This menu allows you to manage a specific message on the server.

Desktop: Right-click anywhere in a message, or mouse over the message and click
the three dots to the right

Mobile: Press and hold on a message

The most important options on this menu are as follows:

 * Pin Message: Requires the Manage Messages permission. This will add a message
   to the list of pinned messages in the channel for easy access later. You can
   view the pinned messages by clicking the pin symbol near the top right of the
   screen (on desktop) or by pressing the same symbol after tapping the channel
   name (on mobile). From here, you can also un-pin a message.
 * Delete Message: Requires the Manage Messages permission. This will
   permanently remove the message from Discord. If the message contains content
   that breaks Discord’s community guidelines or terms of service, you should
   also choose to report the message. Messages that are deleted without turning
   on the report option are unrecoverable, even by Discord, and cannot be used
   as evidence that someone violated Discord’s terms of service.


ADDITIONAL PERMISSIONS

Some permissions are integrated into other areas of Discord or are more
implicit. The following permissions should only be granted to moderators or
trusted users.


 * Manage Channels - Allows users to edit channels by mousing over the channel
   name and clicking the gear, or by tapping the channel name at the top of your
   mobile device and then tapping settings. You can change the channel name,
   implement slow mode, manage channel permissions, or delete the channel.
 * Mention @everyone, @here, and All Roles - Allows users to mention all users
   on the server, all online users with access to the channel in which the
   message is sent, or all users in a specific role even if that role’s “allow
   anyone to mention this role” permission is disabled.
 * Send TTS Messages: Allows users to start a message with /tts to cause Discord
   to read their message out loud to everyone currently focused on the channel.
 * Priority Speaker: While users with this permission are talking in a voice
   channel, the volume of other users will be lowered.

While some advanced Discord server configurations may require otherwise, the
following permissions are generally good to give to everyone:

 * Change Nickname: Allows users to set their own nickname on the server,
   different from their username.
 * Read Text Channels & See Voice Channels: Allows users to see channels and
   read messages.
 * Embed Links: Allows for a preview of links sent by users. If disabled, users
   can still post links but they will no longer preview.
 * Attach Files: Allows users to upload images and files to the Discord server.
 * Read Message History: Allows users to read messages sent in the channel while
   they were away. With this permission off, users are unable to read messages
   in the channel if they go to another channel or quit Discord.
 * Use External Emoji: Allows members with Discord Nitro to use emoji from
   their other Discord servers. Discord bots may specifically need this to
   implement some of their functionality.
 * Add Reactions: Allows users to add new reactions to a message. Members do not
   need this permission to click on an existing message reaction.
 * Connect: Allows users to join voice channels.
 * Speak: Allows users to speak in voice channels.
 * Video: Allows users to use Discord Video.
 * Use Voice Activity: User voice will be automatically detected when they are
   speaking. With this permission disabled, members must use push to talk to be
   heard in a voice channel.


CHANNEL OVERRIDES

If you have the Manage Channel Permissions permission, you can also set
channel-level permission overrides for both roles and individual members. By
changing one of the permissions to a checkmark or an X, you can define specific
permission for that channel for specific members or roles that will be used
instead of the permissions that are set on the Roles management screen. The
permissions for channel overrides are similar to their role-level counterparts,
but the descriptions will provide you with additional information as to exactly
what they do. You can learn more about navigating the channel permission
interface here.

However, it’s not enough just to know what buttons to click. It is vital that
you understand how Discord determines what permissions a user actually has when
you start involving channel overrides. Discord checks each permission in the
following order and stops if there are any overrides set at that step.

 1. Channel overrides for a user.
 2. Channel overrides for a custom role (not @everyone).
 3. Channel overrides for the @everyone role.
 4. Otherwise, use the role permissions.

This is fairly logical, but there is one important thing to note: if there are
conflicting channel overrides on different custom roles and a user has two or
more of these roles, Discord resolves this by allowing the permission.

For example, let’s say you run a server for World of Warcraft with a
#guild-recruitment channel, where people can advertise their guild. You set a
channel override to deny Send Messages to @everyone, then add another channel
override for your Recruiter role to allow Send Messages. This way, only people
with the Recruiter role can send messages in the channel. However, you also set
a channel override to deny Send Messages for your Muted User role. If you have
to mute one of the people with the Recruiter role by giving them the Muted User
role, they will still be able to send messages in the #guild-recruitment channel
because they have the Recruiter role. In this case, you have three options:

 1. Set channel permission for the Muted User role to deny View Channel, so that
    the member can’t access the channel to send messages in the first place.
 2. Set a user-level channel override to deny that specific user Send Messages.
 3. Remove their Recruiter role for the duration of their mute and give it back
    to them when it’s over.

You can use whichever method is easiest for you depending on any Discord bots
you are using and how your server is set up. However, this example is just one
way that channel overrides can conflict with each other. Be mindful of the way
you set your role permissions and channel overrides, using as few channel
overrides as you can, so that you avoid unexpected conflicts.
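The resolution order above, including the allow-wins rule for conflicting role overrides, can be sketched as a simplified model. Real Discord permissions are bitfields and overrides track allow/deny/inherit separately; here each override is simply True (allow), False (deny), or None (not set), and the function is my own illustration:

```python
# Simplified sketch of Discord's per-channel permission resolution.
# Each override is True (allow), False (deny), or None (not set).

def resolve(perm, base_role_perms, everyone_override=None,
            role_overrides=(), user_override=None):
    """Decide whether a member has `perm` in a channel."""
    # 1. A channel override for the specific member wins outright.
    if user_override is not None:
        return user_override
    # 2. Channel overrides on the member's custom roles; if they conflict,
    #    an allow beats a deny.
    set_overrides = [o for o in role_overrides if o is not None]
    if set_overrides:
        return any(set_overrides)
    # 3. The channel override for the @everyone role.
    if everyone_override is not None:
        return everyone_override
    # 4. Otherwise, fall back to the member's role permissions.
    return perm in base_role_perms
```

In the #guild-recruitment example, a muted Recruiter carries role overrides of both allow and deny for Send Messages, so step 2 resolves to allow and they can still post; a user-level deny takes precedence at step 1 and fixes it.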


OTHER TECHNICAL CONSIDERATIONS

“CHAT MUTING” USERS

As mentioned in the channel override section, one common use of permissions is
to prevent certain users from sending messages in your server by giving them a
Muted User (or similarly named) role. This is most easily accomplished by using
a Discord bot to administer the role. The bot first creates a “mute role” and
then, for every channel in the server, sets the channel permissions for that
role so that users with it cannot send messages or add reactions. When you mute
a user through the bot, it assigns them that role and thus prevents them from
interacting in the server’s channels.

It is also possible to set this up yourself and then manually assign the mute
role to users that need to be muted from chatting.
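For the manual route, the idea is simply to attach the same pair of denies to every channel. A tiny sketch of that mapping (illustrative only; the names are mine):

```python
# Illustrative sketch: the "Muted" role gets the same deny override
# (no sending messages, no adding reactions) in every channel.
MUTE_DENIES = {"send_messages": False, "add_reactions": False}

def build_mute_overrides(channel_names):
    """Map each channel name to the override the Muted role should get."""
    return {name: dict(MUTE_DENIES) for name in channel_names}
```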

MODERATION BOTS

A lot of moderation on Discord is done using bots. You can find plenty of them
by doing some research online. Some options include MEE6, CarlBot, Zeppelin,
Dyno, GiselleBot, Gaius, and more. 

INVITING BOTS

To invite a bot to a server, you must have either the Administrator or the
Manage Server permission.


SUMMARY

The goal of this article is to familiarize you with Discord permissions and
moderation actions so that you can more effectively moderate or manage your own
server. Now that you’ve finished reading this, hopefully, you have a better idea
of how to navigate Discord’s menus and manage your members and messages.


HANDLING DIFFICULT SCENARIOS AS AN ADMIN



This article will go over a few things relating to how to take action as a
moderator. We’ll talk about the things you should consider in situations that
require a moderator, the first steps you should take in common scenarios, and
touch on what punishments are appropriate. If you’re unsure of the tools
available to you and how they work, consider reading Permissions on Discord
first.


STEPS TO RESOLUTION

There are a few “genres” of things you tend to see on a daily basis, depending
on your community. Keep in mind that every situation is unique. Because of this,
you may find it difficult to understand what exactly you should do in each
different scenario. If you find yourself in one of those situations, here are
some good points to follow for nearly every situation:

 1. Situation Identification
    * Is something happening?
    * Does this need a moderator?
 2. Information Gathering
    * Context
    * Motives
 3. Initial Response
    * De-escalation
    * Proportional Response
 4. Situation Closure
    * Informing other staff
    * Stating a message where the issue occurred, e.g. “Sorry about that,
      situation handled!”

In some scenarios, steps 2 and 3 can be interchangeable or simultaneous.
Sometimes the context and motives are immediately apparent with the action, such
as a user’s intent to cause disruption by spamming gore in your server. You can
see right away that no additional context is needed and that their motives are
demonstrated clearly, so you can go right to proportional response. In this
case, the user is typically banned and reported to Discord’s Trust & Safety
team.


SITUATION IDENTIFICATION

There are two questions you should ask yourself whenever something catches your
attention:

 * Is something happening?
 * Does this need a moderator?

These questions are rather straightforward, but sometimes the answer may be a
little unclear. Typically a member’s disruption in the chat will catch your eye.
This disruption may be a variety of different things: they might be explicitly
breaking your server’s defined rules, treating other members harshly, bringing
the quality of your chat down through their behavior, or perhaps just a small
yet visible disagreement. If you confirm that something like this is happening,
you can then ask yourself the next question: Do I need to intervene?

When a member begins to disrupt your server, this member may need intervention
from a moderator to prevent the situation from escalating. However, while it may
be your first instinct to step in as a moderator when something happens, take a
step back and evaluate if that’s necessary. If two members have a disagreement
on a subject, it doesn’t always mean the situation will become heated and
require your intervention. Disagreements are common not only on Discord but in
any sort of open forum where everyone can voice their opinion on whatever
anyone else says. They are a natural part of conversation and can encourage
healthy discourse; as long as a disagreement does not turn into a heated
argument, it tends to be mostly benign.

There are, however, also cases that will require a moderator’s intervention. If
a situation seems to be escalating into harassment rather than simple
disagreement, or if members are posting things that break your server’s rules,
you can determine that it’s appropriate for you to intervene.


INFORMATION GATHERING

After you’ve confirmed to yourself that something needs your attention, you
should begin the next step of gathering information.

Before we get into that, though, it’s good to note that there are certain
scenarios in which you would entirely skip this step and immediately move on to
the third step, involving de-escalation or handing down a corrective action.
These are situations in which you can tell right away that additional context
is unnecessary and that something needs to be done, typically immediately.
Examples include:

 * Posting NSFW content in channels not marked as age-restricted
 * Posting gore
 * Mass spamming
 * Calls to arms (raiding threats, posting IPs and asking for DDoS, etc.)

In cases like these, additional deliberation is unnecessary as the violations
are obvious. For more ambiguous cases however, you should consider the context
of the situation and the motives of the user.

CONSIDERING CONTEXT

Context is the surrounding circumstances of each situation. This includes the
events that happened before the incident, the interaction history of those
involved, the infraction history of those involved, and even how long they’ve
been in your server.

Consider the scenario where a user uses a racial slur. Some may think that the
user should immediately have corrective action taken against them, but that may
not be the case. This user could have been explaining an issue they run into in
the real world, or they could be asking someone else not to use the word. With
additional information at hand, it may become evident that the transgression is
less severe than initially thought, or perhaps even a non-violation at all. The
exact action taken will depend on your rules, but it’s clear that understanding
all of the relevant information is key to ensuring you take appropriate and
proportional action.

MOTIVES

Another thing to consider when you first approach a scenario is the underlying
motives of those involved. What are they trying to achieve? What is their goal
by doing what they’re doing?

For example, if two users are trading mild insults, it is possible to interpret
this as friendly banter if you know these two people are good friends.
Conversely, if you know these people dislike each other, then their motives may
be less than friendly. Knowing your members well will therefore help you better
to assess when a situation that needs intervention is occurring.


SITUATION CLOSURE

After you’ve dealt with a scenario, it may be appropriate to take action in
other places as well. Questions may arise from other members, your staff may
need to know about this incident in the future, or tensions may remain high
where the incident occurred.

INFORMING STAFF

It is important to log this incident with the other members of your staff for
future reference. There are many ways to do this, whether that be sending a
message in your private staff channel, logging it within a bot, or maybe posting
about it in your moderation log. These all provide you with a means to go back
and check the history of these users and their run-ins with staff. It is
important that you’re diligent about keeping these records. Other staff might
not know about the incident and similarly you may not be aware of other
incidents handled by your fellow staff members. If you find yourself in a
situation where the problem user causes issues in the future, you will be able
to quickly access the infraction history. This will allow you to appropriately
adjust your response to the situation and emphasizes the importance of context
when taking action.

TENSION RESOLUTION

Tensions may linger where the incident occurred. Other members may see what
happened and feel second-hand discomfort or anger depending on the situation. It
may be necessary to resolve this tension by thanking the other members of chat
for their patience and/or bringing it to your attention and stating that it was
solved. This has the side effect of answering where the users went and why it
happened.

For example, if two users had a heated argument in your chat and you ended up
muting them, third-party observers may see this argument in chat and react
negatively to the comments made during the argument. You can resolve this by
stating something along the lines of “Sorry about that everyone. Situation
resolved, users will be muted for a time to cool down.” This statement has the
effect of stating what you did and why you did it. Acknowledging the situation
as well as detailing that it’s been handled is an effective means to ease
tensions and bring healthy discussion back to your chat. Keep in mind though, if
the conversation has already moved on by the time you’ve dealt with the
incident, this step may not be necessary. Bringing the conversation back to this
issue may have the opposite effect and remind people of the uncomfortable
situation.


MOTIVES

Another thing to consider when you first approach a scenario is the underlying
motives of those involved. What are they trying to achieve? What is their goal
by doing what they’re doing?

For example, if two users are trading mild insults, it is possible to interpret
this as friendly banter if you know these two people are good friends.
Conversely, if you know these people dislike each other, then their motives may
be less than friendly. Knowing your members well will therefore help you better
to assess when a situation that needs intervention is occurring.


INITIAL RESPONSE

Now that you’ve confirmed both the context of the situation and the underlying
motives of the individual(s), you can decide what action you should take. Unless
you deem the conduct of a user to be notably severe, a typical initial response
is to de-escalate or defuse the situation. This means you attempt to solve the
situation by verbal communication rather than moderation action, such as an
official warning, a mute, or a ban.

DE-ESCALATION

When it comes to de-escalation, you should remember that the members involved
are typically going to be annoyed or upset at that moment due to the situation
at hand. If you approach the situation from a stern and strict stance
immediately, you could upset the members further and fan the flames, so to
speak.

An example of verbally mitigating an argument that's turning too heated would be
to say “Hey folks! While we appreciate discussion and think disagreement is
healthy for promoting productive discourse, we think this particular discussion
may have gone a little too far. Could we please change the subject and talk
about something else? Thanks!”

Now, consider what this statement aims to accomplish. It starts positive and
friendly, thanking the users for their participation on the server. Showing this
appreciation can help to calm the members involved. The message then states the
reason for the intervention. Doing this respectfully is important, because if
you aren’t respectful to your members, they aren’t going to be respectful to
you. This effect is amplified on community servers where you are going to be
interacting with the same active members on a regular basis.

After clarifying the reason for intervention, you should make the request on
what you expect to happen going forward. In this situation, this is asking the
members to move on. It’s important to note that phrasing the request as a
question rather than an order is a deliberate choice. The message thanks them
one more time as a way to end it on a positive note. Your goal here is to defuse
the situation so things don’t get worse. Keeping all of these things in mind
when you phrase your communications is important.

De-escalation is a skill that you may struggle with initially. Being comfortable
with it requires many different interactions and experiences with many different
moderation scenarios. Don’t be discouraged if you can’t do it immediately.
You’re going to run into scenarios where you simply aren’t able to effectively
defuse the situation and may have to rely on a corrective action instead. It is
still a very good idea to generally approach these situations without the intent
of punishing someone. Not every situation needs to end with a punishment. The
one skill that can take you from a good mod to an outstanding mod is the ability
to defuse situations swiftly and efficiently.

PROPORTIONAL RESPONSE

If you’ve tried to defuse a situation but the users involved fail to listen, or
continue to escalate, your next step is deciding what other effective means you
have to end the situation at hand. So, what exactly should you do?

Most servers tend to follow a proportional response system. This means that
members tend to receive corrective action proportional to the acts they commit.
If we think about our situation where an argument got too heated and
de-escalation techniques were ineffective, we may want to consider restricting
the privileges of the members involved. This serves as a punishment that is
appropriate for the scenario while also allowing them the time they need to cool
down and move on. Other examples of where a mute may be appropriate include
minor spam, a user who is clearly inebriated, someone being a little too harsh,
or someone who needs time to cool off. An official warning, typically issued
through a moderation bot, can also be given as an alternative.

After you apply this mute, it is worth looking at the history of the members
involved in the incident to determine if the mute is all you need. If these
members have a history of being problematic in chat, you may consider removing
them from your community.
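If it helps to keep responses consistent across your team, a proportional response system can be written down explicitly. The sketch below is purely illustrative; the action names and thresholds are hypothetical and should follow your own server's rules:

```python
# Hypothetical escalation ladder: index = number of prior infractions.
LADDER = ["verbal reminder", "official warning", "mute", "ban"]

def next_action(prior_infractions: int) -> str:
    """Pick the rung matching a member's infraction history, capped at the top."""
    return LADDER[min(prior_infractions, len(LADDER) - 1)]
```

A first-time offender gets a verbal reminder, while a member with a long history lands on the most severe rung; your team can still override the ladder when context warrants it.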

It’s important to remember that the goal of the moderation team is to promote
healthy activity in our communities. With this in mind, it’s also good to
remember that moderators and members are ultimately a part of that same
community and that you don’t want to intimidate the people that rely on you. If
you react too harshly, you run the risk of establishing a negative relationship
between you and your community. People in your community should feel safe
approaching you about an issue and, just like in the real world, they want to be
confident that if they are ever reported, they’ll be treated fairly. Members who
are scared of being banned over a small disagreement tend not to want to engage
with the server at all.

Conversely, if you don’t react strongly enough, you give those who wish to
disrupt your community more time and opportunity to do so, and your community
may not trust you to handle situations.


SITUATION CLOSURE

After you’ve dealt with a scenario, it may be appropriate to take action in
other places as well. Questions may arise from other members, your staff may
need to know about this incident in the future, or tensions may remain high
where the incident occurred.

INFORMING STAFF

It is important to log this incident with the other members of your staff for
future reference. There are many ways to do this, whether that be sending a
message in your private staff channel, logging it within a bot, or maybe posting
about it in your moderation log. These all provide you with a means to go back
and check the history of these users and their run-ins with staff. It is
important that you’re diligent about keeping these records. Other staff might
not know about the incident and similarly you may not be aware of other
incidents handled by your fellow staff members. If you find yourself in a
situation where the problem user causes issues in the future, you will be able
to quickly access the infraction history. This will allow you to appropriately
adjust your response to the situation and emphasizes the importance of context
when taking action.
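Whatever logging method you choose, the record itself tends to contain the same few fields. As a hypothetical sketch of what a bot-side moderation log might store (field names are examples, not any particular bot's schema):

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Infraction:
    user_id: int        # who the action was taken against
    moderator_id: int   # who handled it
    action: str         # e.g. "warn", "mute", "ban"
    reason: str
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def history_for(log: list[Infraction], user_id: int) -> list[Infraction]:
    """Pull up a member's past run-ins with staff before deciding on a response."""
    return [entry for entry in log if entry.user_id == user_id]
```

Being able to query a member's full history in seconds is exactly what makes the proportional-response approach above workable.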

TENSION RESOLUTION

Tensions may linger where the incident occurred. Other members may see what
happened and feel second-hand discomfort or anger depending on the situation. It
may be necessary to resolve this tension by thanking the other members of chat
for their patience and/or for bringing the issue to your attention, and stating
that it was resolved. This has the side effect of explaining where the users
went and why.

For example, if two users had a heated argument in your chat and you ended up
muting them, third-party observers may see this argument in chat and react
negatively to the comments made during the argument. You can resolve this by
stating something along the lines of “Sorry about that everyone. Situation
resolved, users will be muted for a time to cool down.” This statement has the
effect of stating what you did and why you did it. Acknowledging the situation
as well as detailing that it’s been handled is an effective means to ease
tensions and bring healthy discussion back to your chat. Keep in mind though, if
the conversation has already moved on by the time you’ve dealt with the
incident, this step may not be necessary. Bringing the conversation back to this
issue may have the opposite effect and remind people of the uncomfortable
situation.


SUMMARY

You should now be able to confidently approach each situation and determine what
the best way to handle it is. That being said, this is just a portion of your
foundation. First-hand experience is invaluable and necessary in order to be
more efficient and fluent in moderating.

One of the most undervalued tools in moderation is your voice as a person in a
position of power and your ability to defuse a situation, so don’t be afraid of
trying to mitigate a situation first. If you’re still in doubt about what to do,
never be afraid to ask your other staff members, especially those who may be
more experienced.

Remember: Situation identification, information gathering, initial response, and
situation closure. Keeping these steps in mind will help you stay on track to
becoming a better mod and better community lead.


MODERATING SAFELY AND SECURELY



Moderator: the title you give to people who have the responsibility of managing
your chat and community. Answering the call to protect your community and its
members, moderators are integral to any successful Discord server. But it’s
important to remember that moderators have to stay safe online, just like the
users they work to protect. The first step is to secure your account with a
strong password and backup login methods, all of which you can learn more about
in this article on the importance of securing your Discord account.

In this article, we’ll explain how moderators can do their job safely and
securely, cover how to handle links, scams, and possible doxxing attempts, and
introduce some general best practices to keep you and your account safe.


NATIVE FEATURES TO FIGHT SPAM

Spam has historically plagued every online platform: it is a simple way to
troll, and it is easy for spammers to change and adapt to suit their needs.
Discord has begun to implement progressive changes to how it detects spam on
the platform, updating, tweaking, and fine-tuning its anti-spam systems daily
to catch more and more spammers and spam content.

Firstly, we’ve implemented the Malicious Link Blocker, a system that warns users
before they visit a suspicious site, similar to the warnings Chrome shows for
dangerous pages. It is meant to minimize exposure to malicious content, but it’s
important to remember
that it doesn’t catch everything. Keep in mind, just because a link does not
trigger the Malicious Link Blocker doesn’t mean that the link is safe! Always be
careful when clicking links from unknown users that may seem suspicious.




What the Malicious Link Blocker looks like in action.

Discord also introduced another anti-spam feature that auto-detects and hides
content from likely spammers in servers, reducing outright spam. These messages
are automatically hidden by the Server Channel Spam Redaction system.


HOW TO HANDLE MALICIOUS CONTENT 

When you take on the title of community moderator, you become a front-facing
member of your server. As a result, you have to be up-to-date on the newest
methods on how to moderate safely and securely to not only keep you and your
team safe but also to help educate your community. This includes knowing how to
spot and handle malicious links, files, scams, and phishing attempts. It also
helps to know how to deal with threats to your community members and doxxing
concerns. 

Now we’ll explore how to safeguard against these types of risks.

SPOTTING PROBLEMATIC LINKS AND FILES 

As a moderator, you might come across malicious content shared in your server in
the form of links and files. Malicious links and files come in all shapes and
sizes. Some try to get ahold of your account credentials, such as your login
information or token, while others try to have you download malware that can
harm your computer. 

If you do not recognize the domain, try doing a Google search to find out more
information about the link before clicking on it. Some links try to imitate real
websites to trick the users into thinking it is safe to click on when, in fact,
it is a malicious link. Be sure to double-check the spelling of common domains
so that you aren’t tricked into thinking a link goes to YouTube instead of
“YouTbue”, for example. A more subtle way you might encounter malicious links is
through embedded messages from bots or webhooks. Unlike normal users, bots and
webhooks can hyperlink text in a message, so be sure to double-check where a
link leads before clicking on it. 

For example, you might encounter an embedded message whose visible text reads
https://discord.com, but which is hyperlinked to another page entirely. This is
one way attackers can mask their malicious URLs. 

Another thing to keep an eye out for when looking for malicious links is the
use of URL shorteners, which can hide a malicious domain name behind a short,
innocuous-looking link.

Although URL shorteners are a convenient way to make links more compact and
easier to read, they also hide the final destination of the URL, which could be
a malicious website. When dealing with these types of shortened URLs, you should
first prioritize determining where it leads. You can use a URL expander such as
URLScan or Redirect-Checker to do this. Once you have a better idea of what is
on the other side of the URL, you can decide whether it is safe or not to visit
the site and remove and/or report the message if need be. 
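If your server sees a lot of shortened links, a bot can flag them for manual expansion before anyone clicks. This is a minimal sketch; the shortener list is a small illustrative sample, not an exhaustive one:

```python
from urllib.parse import urlparse

# Small illustrative sample of shortener domains; a real filter needs a fuller list.
SHORTENERS = {"bit.ly", "tinyurl.com", "t.co", "goo.gl", "is.gd"}

def needs_expansion(url: str) -> bool:
    """True if the link should go through a URL expander before anyone clicks it."""
    host = (urlparse(url).hostname or "").lower().removeprefix("www.")
    return host in SHORTENERS
```

Flagged links can then be pasted into an expander such as URLScan or Redirect-Checker as described above.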

As a rule of thumb, it is never a good idea to follow links sent by strangers!
If you still are unsure about the destination of a link, you can use sites like
VirusTotal to check for any potential malware or illicit content.

You should always exercise caution when downloading files from anyone on
Discord, whether it’s from a stranger or someone you think you know. Among the
most dangerous files are “.exe” files. These can execute arbitrary code on your
computer, potentially leaking information to the sender or causing other serious
harm. 

In some cases, downloading a malicious file won’t affect your computer until the
file or program is actually run or opened. This is important to keep in mind,
since downloading a file can create a false sense of security: “nothing bad
happened” only because you haven’t yet run what you downloaded!

If you do decide to trust the download, take the extra precaution of running it
through VirusTotal or a similar website to search for potential dangers. It’s
also a good idea to scan these files with your anti-malware software. To grab
the link for one of these checks without clicking anything illicit, right-click
the message on Discord and choose “Copy Link” from the dropdown.

If you encounter misspelled or otherwise sketchy-looking links, it might be a
good idea to add them to a text filter or to your moderation bot’s banlist. If
you are sure that a link sent on your server is malicious or dangerous, remove
that user from your server so they cannot privately spread these links to other
users, and make sure to report it to Discord using the online form.


RECOGNIZING SCAMMING AND PHISHING ATTEMPTS

Scammers use many different techniques to trick you into giving them your
personal information. They may try to steal your Discord login credentials, user
token, or private information through carefully crafted scam attempts, thus
giving them access to your account for problematic purposes.

SOCIAL ENGINEERING 

Phishing is when a scammer convinces you to do something that gives them access
to your device, accounts, or personal information. By impersonating people or
organizations who would plausibly need this information, they can more easily
infect you with malware or steal your personal information. An example of this
is a scammer claiming to be a Discord staff member, or claiming to be from
official Discord programs such as Partners or HypeSquad Events. Some more
ambitious scammers may even claim to be from local law enforcement. 

[Image: example of a Discord System DM]

It is important to know that Discord Staff will only ever communicate through
accounts with a staff badge or through System DMs. We will never ask you for
your password. A Discord System DM will look exactly like the photo above in
your direct message inbox. Check out the Discord System Messages blog post for
more information about how Discord sends direct messages.

These social engineering tactics "bait" you with a trusted looking icon or name
to obtain your personal information. These schemes may persuade you to open an
attachment, click on a link, complete a form, or respond with personal
information to make it easier for them to steal your account.

COMMON SCAMS AND RED FLAGS 

Scams are constantly evolving and changing, but they do tend to follow similar
patterns. Remember, Discord will never ask you for your password, even through
official means of support, nor will we give away free Discord Nitro through
bots. Some common scams on the platform that are combatted every day are as
follows: 

Prize Scams. If it’s too good to be true, it probably is. Scammers might try to
get your information through empty promises of fake prizes. A common prize scam
is a random bot sending you a message that you’ve won a month of Discord Nitro.
If the bot is not directly connected to a server giveaway you took part in, the
giveaway is likely fake and the links it sent are malicious. Discord would never
use a bot to send this information to you directly, and even verified bots can
be compromised to share these malicious links.

Steam Scams. Has someone ever sent you a message apologizing for “accidentally
reporting you” on Steam? This is yet another way scammers try to infiltrate your
accounts. They refer you to someone who can supposedly fix the issue, along with
a link that looks like Steam’s website but is, in truth, a phishing link. If you
look closely, you can spot typos in the domain name such as “steamcomnmunity,”
“sleamcommunity,” and many others. 

Most companies usually handle support issues on their websites, so be on the
lookout for anyone claiming to want to help you through Discord representing a
company or a service. Regarding the above example, Steam will always use its
platform to resolve relevant issues and never reach out through Discord to
settle problems with your account.

Game Scams. Be aware of random users who message you asking if you want to test
their new game. This is another attempt to compromise your account and unlock
your private information through phishing. Requests from strangers or friends to
try their game usually mean that their account has been compromised, and they
are now attempting to gain access to yours. If you have other means of
contacting this user off-platform, it is good to alert them to the fact that
their account has been compromised to see if they can regain control of it or
contact Discord Support about the issue.

Discord Recruitment Scams. Another type of scam is where external individuals or
companies pretend to represent Discord and offer fictitious job opportunities.
The scammer will try to impersonate a Discord employee, either on Discord itself
or via external sites. This is a serious security concern, so there is a whole
page about this scam that you can read here: Discord Recruitment Scams. You can
only apply for a job at Discord through the official careers website, and all
communication from Discord regarding hiring will come from discord.com or
discordapp.com email addresses. Discord will not use the platform to recruit
you.


DEALING WITH THREATS & DOXXING ATTEMPTS 

With the vast array of search tools and information readily available online,
almost anyone can be a doxxing victim. If you have ever posted in an online
forum, signed an online petition, or purchased a property, your information is
publicly available. Through public records, databases, and other repositories,
large amounts of data are readily available to anyone who searches for it.

Cybercriminals and trolls can be incredibly inventive in how they doxx you. They
might start with a single clue and follow it until your online persona is
progressively unraveled and your identity is revealed. You must be hyper-aware
of what personal information you share online and be cautious when divulging
information about yourself. 

If private information about you is revealed online through Google searches and
you happen to live in the EU or Argentina, you have the right to be forgotten.
Similar rights are given to people in the United States, although not to the
same extent. We generally encourage you to check resources such as
HaveIBeenPwned to see whether or not your data has been a part of any big
leaks. 

If you want content about you to be removed from Google, refer to this Google
Troubleshooter. Sharing these resources or posting them within your Discord
server can prove to be a valuable asset to your members, forestalling possible
doxxing attempts or threats. Another great resource is the COACH tool, which
helps you lock down your identity by breaking the basics of online security
into bite-sized, interactive, easy-to-follow guides.

If you are concerned you are at a high risk of being doxxed, you can consider
setting up Google Alerts to monitor possible doxxing attempts. If sensitive or
private information has been leaked online, you can submit requests to have that
content removed by using the following guides: Removing Content From Google or
Remove Your Personal Information From Google.

BEST PRACTICES OF CYBERSECURITY ON DISCORD

Keeping your Discord login credentials and account token safe and secure is
vitally important to ensure your own account safety when moderating an online
community. Even with proactive measures such as 2-Factor-Authentication (2FA) in
place, scammers can still get access to your account with your account token, so
evading common phishing attempts and utilizing the vast amount of resources
available to spot scams becomes increasingly important for moderators. Discord
released an article about keeping your account safe and sound with 2FA, which is
an excellent resource to read or refer to.

Ensuring that server permissions are set up correctly is also essential to
combat illicit links and other variations of phishing and scamming attempts. It
is important to double-check your permissions when making new categories or
channels inside your server, as moderators discuss sensitive and private
information inside locked moderation channels. If you need a refresher on how
permissions work, check out this article here.

Bots are potent tools that moderators and community builders use daily to help
moderate and spark community interest via events and games. However, bot
accounts differ from regular user accounts: they can perform actions much faster
than a regular user and can quickly obtain message and user data from your
server.

Knowing what permissions bots and bot roles are given is essential to developing
a safe community, helping ensure the safety of all its members and its
moderators. A malicious bot can wreak havoc within servers very quickly by
mass-deleting categories, exposing private channels, sending scam messages, and
abusing webhooks. We heavily recommend researching any bot before adding it to
your server.


CONCLUSION

When reporting content to Discord, you might hesitate and think to yourself: is
this worth reporting? Please know that all reports made in good faith are worth
sending to Discord. Moderating on Discord is an integral part of the platform.
User safety is Discord’s number one priority, and moderators play a vital part
in it by helping keep their communities safe. 

There are a lot of resources to draw from to ensure you moderate safely and
securely. Practice good cybersecurity by having good antivirus and malware
detection programs and strong passwords. Differentiate between your “real” self
and online persona to minimize doxxing opportunities. Check suspicious links and
websites through online tools to make sure they aren’t malicious. If you or one
of your community members are doxxed online, there are proactive and reactive
measures that can be taken to ensure your account security. Figure out what sort
of content was leaked, report it to Discord’s Trust & Safety teams, and submit
relevant removal requests such as Google’s removal tools.

We hope these tips help you in your moderator journey!


HOW TO MODERATE VOICE CHANNELS



Voice channels are a great way for your server members to engage with each other
and can be utilized in many different ways. You might have a system for users to
find other users to play games with or maybe you just have some general chat
channels to hang out and listen to music. Regardless of the types of voice
channels you have, you're going to need to moderate these voice channels, which
can reveal some interesting challenges. In this article, we'll identify these
challenges and go over some things that can help you to overcome them.


WHY MODERATE VOICE CHANNELS?

This article will focus on how to moderate voice channels in servers where voice
channels are used frequently and without moderators present. However, the
information in this article is still useful to any server with voice channels.
Although these situations may be rarer for your server, it doesn't hurt to be
prepared if they do indeed occur.


WHAT TO LOOK OUT FOR

Many of the moderation issues you will encounter while moderating voice channels
will be the spoken equivalent of situations you would encounter in text
channels. However, you can't keep records of what is said in voice channels
without recording everything with a bot, which is not easy to do, nor is it
something your server members will likely be comfortable with. This means that
if no moderator is present in the voice channel at the time of a user being
troublesome, you will likely hear about the situation from a user who was
present for it. We will discuss best practices in handling these user reports in
the next section of this article. There are also a few situations specific to
voice channels to be aware of.

Common situations that would require moderator intervention that might occur in
voice channels are as follows:

 * A user is being rude or disrespectful to a specific user
 * A user is saying discriminatory phrases or slurs in a voice channel
 * A user is playing audio or video of Not Safe For Work content
 * A user is rapidly joining and leaving a voice channel or multiple voice
   channels (voice hopping)
 * A user is producing audio shock content (loud noises intended to startle or
   harm a listener’s ears)


RISK MANAGEMENT FOR VOICE CHANNELS

Before we even consider how we plan to moderate situations that may arise in
voice channels, let's discuss ways to prevent them from happening in the first
place, or at least make it easier for us to deal with them later. One of the
easiest and most useful things you can do is set up voice channel logging.
Specifically, you can log when a user joins, leaves or moves between voice
channels. Many moderation bots support this type of logging.
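As an illustration, libraries such as discord.py expose a voice state update event that reports a member's channel before and after each change; classifying the change from those two values is straightforward. The helper below is a generic sketch that assumes nothing beyond the before/after channel values:

```python
def classify_voice_event(before_channel, after_channel):
    """Classify a voice-state change from the channels before and after it."""
    if before_channel is None and after_channel is not None:
        return "join"
    if before_channel is not None and after_channel is None:
        return "leave"
    if before_channel != after_channel:
        return "move"
    return "other"  # e.g. mute/deafen toggles within the same channel
```

In discord.py this would be called from the on_voice_state_update event with before.channel and after.channel; a burst of "join"/"leave" results for one member in a short window is the voice-hopping pattern described above.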

Having voice logs will allow you to catch voice hoppers without having to be
present in a voice channel. It will also prove useful in verifying reports from
server members and ensuring users can't make a false report about other users
who weren't actually present in a voice channel at the time. Another thing you
can do to prevent trolls from infiltrating voice channels (and every other part
of your server) is having a good verification gate.


HANDLING REPORTS FROM YOUR SERVER MEMBERS

What do we do when a member of the server reports rule-breaking behavior in a
voice channel, but no moderator was there to witness it? If we believe them and
treat their word as fact, we can take care of the situation accordingly.
While this may work for certain situations, there is the possibility that
troublesome users may realize that the moderators are acting on all reports in
good faith and begin to try to take advantage of this policy and create false
reports. This is obviously very problematic, so let's now consider the opposite
scenario. If a moderation team doesn't believe any reports and moderates
situations only when a moderator is present, troublesome users will likely
keep getting away with their rule-breaking behavior. In some cases,
even if a moderator is available to join a voice channel when they receive a
report, they might find that the troublesome user stops their behavior when the
moderator joins, thus making it impossible to verify the report. This can be
partially mitigated by moderators using alternate accounts to join the voice
channel and appear as a user, but ultimately there will be situations where mods
aren't available and reports will need to be considered.

In general, no single report should be believed on its own. When a user
reports a situation in a voice channel, the following
questions should be asked:

 * Has the user made reports in the past? Were they legitimate?

Active users who make many legitimate reports can likely be trusted. The more
legitimate reports a user has made, the more likely it is that they can be
trusted. Even if a trusted user makes a false report at some point, it is often
easy to undo any false actions taken.

 * How long has the user been a part of the server and how much have they
   contributed?

Positive contributions in your server (such as quality conversations or being
welcoming and supportive of other server members) are also a way that members
might gain trust. This trust is something that can be handled the same way you
handle trust gained from legitimate reports.

 * Did multiple users report the situation or can others who were present
   confirm the report? If so, do these users have any sort of connection to each
   other? Could they be the same person on multiple accounts?

If multiple users report the same issue, and you know they are not connected,
the report can safely be trusted as long as the information in the reports is
consistent with each other. Knowing when users are connected can be difficult in
some cases, but some signs you can look for are: users who joined the server at
the same time, users with IDs close to each other (signifying similar account
creation times), similar behavior or patterns of talking, or interactions with
each other which signify the users know each other from somewhere else. It is
important to ensure that this isn’t an organized effort from a group of friends
or alternate accounts targeting another user.
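The "IDs close to each other" signal works because Discord IDs are snowflakes: the bits above bit 22 encode the account's creation time in milliseconds since the Discord epoch (2015-01-01 UTC), a format documented by Discord. A small sketch for comparing creation times (the `created_close_together` helper and the one-hour threshold are illustrative assumptions, not a standard rule):

```python
from datetime import datetime, timedelta, timezone

DISCORD_EPOCH_MS = 1_420_070_400_000  # 2015-01-01T00:00:00Z in Unix milliseconds

def creation_time(snowflake: int) -> datetime:
    """Extract the creation timestamp embedded in a Discord snowflake ID."""
    ms_since_epoch = (snowflake >> 22) + DISCORD_EPOCH_MS
    return datetime.fromtimestamp(ms_since_epoch / 1000, tz=timezone.utc)

def created_close_together(id_a: int, id_b: int,
                           max_gap: timedelta = timedelta(hours=1)) -> bool:
    """True if two accounts were created within max_gap of each other."""
    return abs(creation_time(id_a) - creation_time(id_b)) <= max_gap
```

Accounts created minutes apart are not proof of a connection on their own, but combined with the other signs listed above they can strengthen or weaken your confidence in a set of reports.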

 * Does the report seem plausible? Is the person being reported someone you
   expect would cause trouble?

There are many things you can look at when examining the user being reported.
Some things to look out for are offensive or inappropriate usernames, profile
pictures, or statuses, and any inappropriate messages sent by the user in the
past. Inappropriate messaging can be anything from spam to rude or offensive
behavior or even odd or confusing behavior.

If the answers to these questions lead to more skepticism and questioning of
the legitimacy of the report, then it may be the case that the report can't be
trusted. If you are not confident in the report's legitimacy, you should still
make a note of the report, in case similar reports are made in the future.
Repeated reports are one of the most common ways to discover that a report is
likely legitimate and allow you to make a better informed decision later on. It
may also be the case that the answer to these questions reveals that the report
is likely illegitimate, in which case you may punish the reporter(s)
accordingly. Intentional false reporting is a breach of trust with a moderation
team, but reports made in good faith that are unactionable should not result in
punishment.


HANDLING SEVERE SITUATIONS

Occasionally, you might find yourself faced with a difficult, time-sensitive
situation where someone is at risk of being harmed, or harming themselves.
Because nothing is recorded in voice channels, it is not possible to report
these types of situations to Discord. If you witness situations such as these,
or if you receive reports of them, you should reach out to those involved in DMs
or a text channel in the server. You can also attempt to defuse the situation
in the voice channel if you are confident in your ability to do so, but without
evidence of the situation in a DM or text channel, it may be harder to get the
authorities involved if that becomes necessary. Once you have done that,
report the user(s) to Discord. Whether or not you are able to move the situation
to a DM or text channel, call your local authorities if you believe someone is
in imminent danger. If you know the area where the person in danger lives, you
may also want to call the authorities in their area.


CONCLUSION

Handling situations in voice channels can be difficult, but with the right tools
and protocols in place, your server’s moderation team can be prepared for
anything. After reading this article, you should have a good understanding of
when voice moderation is needed and how to properly handle voice moderation
scenarios. This article outlined some of the most common situations that you
should look out for, as well as how to prepare for some situations proactively.
It also showed how you can handle user reports in a way that minimizes the
possibility of actioning false reports. Finally, you learned how to handle
severe, time-sensitive situations where someone’s life may be in danger.
There's a lot to consider when you're moderating voice channels, but by
following these tips, you should be well-equipped to moderate voice channels in
your server!


MODERATING SENSITIVE TOPICS



Permitting the discussion of sensitive topics on your server can allow users to
feel more at home and engage with their trusted peers on topics they may not
feel comfortable discussing with others. This can encompass subjects like
politics, mental health, or maybe even their own personal struggles. Having
dedicated channels can keep these topics as opt-in and in a dedicated space so
that people who do not want to see this content can avoid it. This can also
allow you to role-gate the channel, making it opt-in, level-gated,
activity-gated, by request only, or gated by some other requirement to keep
trolls or irresponsible users out.


ALLOWING SENSITIVE TOPICS IN YOUR COMMUNITY

Establishing channels dedicated to sensitive topics can also be an exhausting
drain on your moderation team and can invite unwanted content into your server.
These channels can quickly get out of hand if they are not set up mindfully and
moderated carefully and will often require their own sets of rules and
permissions to be run effectively and safely. Whether you want these discussions
to occur in your space at all is up to you and your team. Having channels for
these topics takes a lot of work and special consideration for you to determine
if it’s the right fit for your server.

In short: this article will help you decide whether you want these kinds of
channels, whether that’s a channel for venting, serious topics, or a real-world
event. Keep in mind that no matter what topics (if any) you decide to allow in
your server, all content needs to stay within Discord’s Terms of Service and
Community Guidelines.


DETERMINING WHAT IS A SENSITIVE TOPIC

The first step to determining whether to have sensitive topics channels in your
server is to define what is considered a sensitive topic for your community. If
you are running a server for people from a specific country, a discussion of
that country's conflicts with other countries may be a sensitive topic.
Conversely, if you are running something like a political debate server, that
same topic can be relatively non-problematic and not upsetting to the members of
the server.

There are two main types of sensitive topics: triggering topics and contentious
topics. A triggering topic is a topic or word that can prompt an increase or
return of symptoms of a mental illness or mental distress due to trauma. A
contentious topic is a controversial one that has the potential to provoke
heated arguments.

While sensitive topics can vary depending on what kind of server you own (e.g. a
mental health server vs. a gaming server), keep in mind that there are topics
that can be considered triggering and topics that can be considered contentious
in most, if not all public spaces.


TRIGGERING TOPICS

Triggering topics can vary wildly from community to community depending on what
the focus of the community is. For instance, in a community for transgender
people, in-depth descriptions of a body or the discomfort some people experience
because of their body is likely to be a triggering topic. There are some
triggers that are very common and should be handled with the assumption that
they will cause multiple people in your community to feel uncomfortable or even
traumatized regardless of what type of community it is. This would include
things like sexual assault, graphic depictions of violence, other traumatic
experiences, suicide and self harm, eating disorders, parental abuse or neglect,
etc. These more sensitive topics should likely be separated out from more
community-specific topics that have the potential to invoke trauma, such as
transitioning or coming out in a server for LGBTQ+ people.


CHANNEL NAMES

Channel names indicate to users the intended purpose of the channel. Carefully
choosing your name can have a large impact on what the channel ends up being
used for. For example, #personal-questions-and-advice versus 
#tw-emotional-support-and-venting give users very different impressions of what
the channel is for. If you want a channel where someone can ask “What are some
ways to distract myself if I feel like hurting myself?” or “My teacher is being
homophobic, what should I do?” but not graphic descriptions of the symptoms of
trauma or vice versa, make sure the name of the channel reflects that. Including
tw (trigger warning) or cw (content warning) in your channel name will give the
impression that the latter is allowed and is what the channel is intended to be
used for.


CHANNEL RULES

Channels that have the potential to bring crisis situations into a server or
cause distress to other members of the community should have specific rules to
minimize the potential harm. These rules could be pinned in the channel, have
their own channel within a category that houses sensitive topics channels, or be
included in the server’s rules. The example list of rules below includes some
harm mitigation strategies, as well as the potential downsides of each.

 * Consider asking that anything extremely upsetting utilize a trigger warning
   (tw) or content warning (cw). A trigger/content warning gives users a heads
   up that the content they are about to look at has the potential to invoke
   mental distress or trauma. This allows the user to decide whether to look at
   it or avoid it. This can be done through adding spacing to messages to push
   content off screen so the warning is seen first, or utilizing spoiler tags to
   hide content from anyone who does not want to see it. Including an emoji with
   a spoiler tag can also help the message stand out. For example:
   ⚠️ TW: self-harm ||spoilered description goes here||

 * Consider removing permissions for images/link embedding. Sometimes users will
   post images of self harm or weapons they intend to use on themselves in
   emotional support channels.
 * This can prevent users from uploading advice in the form of screenshots from
   other online resources.
 * Consider banning certain topics. If your moderators and server members aren’t
   professionally qualified to offer sound advice on a topic, it may be worth
   disallowing discussion of those topics. Saying the wrong thing to a person in
   crisis can make the situation worse. If a topic arises, the conversation can
   be ended at “We’re not qualified, here are ways to contact people who are.”
 * This can be seen as callous or upsetting to other server members, or the
   member in crisis. While putting an end to a discussion related to one of your
   members in a crisis will be uncomfortable, it may still be an effective and
   safe way to deal with situations that require expertise beyond that of your
   mod team or other server members.
 * Consider creating a role or using channel permissions to allow you to ban
   people from the channel entirely if they seem to be incapable of following
   the channel rules but are otherwise a positive member of the community and
   follow the server wide rules.
 * Adding a role to a user can draw attention to and further punish the user.
   Other users may see and inquire about the unfamiliar role. Adding channel
   permissions allows more privacy but will clutter your permissions and likely
   not be very scalable if this option is used frequently.
 * Keep a list of resources on hand that can be shared if someone mentions that
   they are in the midst of a crisis situation. This should include abuse
   hotlines, self harm/suicide hotlines and hotlines for parental abuse. Be sure
   to seek out hotlines for the country that a majority of your users are from,
   or international hotlines that anyone can use.


MODERATION CONCERNS

 * Emotional burnout from dealing with users in crisis or users asking for
   advice about upsetting personal issues can be detrimental to moderators.
   Whether moderators are actively engaging with users in the chat or just
   reading the chat to ensure it is not getting out of hand, the emotional toll
   is high. Moderators who engage with and moderate these spaces should
   understand their limits and how to manage burnout.
 * Moderating users who are in a crisis or otherwise struggling is unpleasant
   and can make the staff team look harsh or uncaring to other users, regardless
   of how egregious the moderated user’s behavior is.
 * The chance for abuse in these channels is higher than the average channel.
   Users who overuse the channel and are in constant need of support/advice can
   quickly become a drain on the emotional well-being of everyone involved with
   the channel. Know the red flags for emotional abuse and keep an eye out for
   users who are constantly in crisis and trying to manipulate users into doing
   things for them.
 * Trolls and malicious attention seekers will target this channel. They will
   come in with extremely upsetting sob stories and fake mental/physical health
   crises to make users panic, upset people, or just generally disturb the
   peace. Allowing them a space to soapbox makes these sorts of users more
   difficult to pick out and remove before they can start doing damage.
 * Some users will intentionally seek out content that they know will trigger
   them as a form of emotional self harm. It can be difficult to know whether
   this is happening in your server unless someone explicitly mentions that they
   are doing it.
 * If there is a separate team in charge of moderating or overseeing these
   channels, they will need to closely communicate with the rest of the
   moderation team about problem users or concerning behavior.

CONCLUSIONS

Channels focused on sensitive topics can provide users with a comfortable space
to discuss personal issues of varying severity and build closeness and trust
between members of your community. These channels also have very specific risks
and required mitigation strategies that will vary depending on the nature of the
specific channel. If you are running a channel on transition advice for
transgender users, your main concern will likely be fake advice about foods that
change hormone levels or dangerous advice regarding illegally acquiring
hormones. If you run a channel for sexual assault victims, your main concern
will likely be victim blaming and ensuring that users reach out to professionals
when needed. You have to consider what the specific risks in your channel are
and ensure that you are writing policies that are specific to your needs and
finding moderators that are knowledgeable and comfortable with those topics.


CONTENTIOUS TOPICS

There may be contentious topics for your community in particular, but in general
politics, economics, law, current events, and morality are contentious topics
for most servers. These topics are likely to cause disagreements as a lot of
users will have very varied and very firm opinions on the topics.

CHANNEL NAMES

A channel named #discussion-of-current-events and a channel named
#political-debates-to-the-death are going to yield very different types of
interactions. If you want a channel where people can mention politics and
current events and discuss things like the stock market or a new law that was
passed, but don’t want discussions about whether specific world leaders are good
or bad, or what economic model is the best one, make sure your channel name
reflects that. Most users won’t read anything but the channel name, so your
channel name needs to set the correct expectation for the content individuals
will find inside.

CHANNEL RULES

Channels that have the potential to get heated and cause arguments that lead to
negative feedback loops should have specific rules to minimize the potential
harm. These rules could be pinned in the channel, have their own channel within
a category that houses contentious topics channels, or be included in the
server’s rules. The list of rules below includes some harm mitigation strategies,
as well as the potential downsides of each.

 * Explicitly disallow bigotry and hate speech of any form. Remember that this
   behavior is also against Discord’s Terms of Service and Community Guidelines
   and shouldn’t be allowed in any situation on your server, sensitive topics or
   not.
 * Consider requiring that users provide sources to back up claims. This
   prevents users from trying to make others argue against complete nonsense
   that is only supported by fringe conspiracy theorists, or is demonstrably
   false.

 * Users who are acting in bad faith can abuse this by demanding a credible
   source for every single statement, bogging down people whose sources are
   credible but not on hand at the time.

 * Consider banning some topics. Some topics just shouldn’t be allowed to be
   debated. This could include dangerous conspiracy theories, triggering topics,
   nsfw topics, or topics that are particularly harmful to debate amongst the
   membership of your type of community.

 * Users may complain about free speech or that some of the banned topics
   shouldn’t be banned. They may also ask you to ban other topics that you may
   not think are really a problem.

 * Consider disallowing “devil’s advocate” style statements. If your channel
   doesn’t exist for arguing for the sake of arguing, don’t allow people to make
   arguments that they don’t believe in for the sake of stirring up more
   discussion.

 * Users may complain about free speech, or claim that they really do believe
   the arguments they’re making. There’s no way to confirm whether people
   believe what they’re saying or not.

 * Consider setting up permissions to enforce a slower chat. Things like slow
   mode and temporary channel locks for “cool off” periods can help to keep
   things calm in the chat.
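The slow-mode suggestion above can even be semi-automated: measure the recent message rate and pick a slowmode delay accordingly. This is an illustrative sketch only; the thresholds and the idea of auto-tuning are assumptions, not a Discord feature, and a bot would still need to apply the chosen delay through its own channel-settings API.

```python
def suggest_slowmode(message_timestamps, now, window=60):
    """Suggest a slowmode delay in seconds based on the recent message rate.

    message_timestamps: Unix timestamps of recent messages in the channel.
    """
    recent = [t for t in message_timestamps if now - t <= window]
    rate = len(recent) / window  # messages per second over the window
    if rate > 2.0:
        return 30  # very heated: strong cool-off
    if rate > 1.0:
        return 10
    if rate > 0.5:
        return 5
    return 0       # calm: no slowmode needed
```

For example, 90 messages in the last 60 seconds (1.5 messages per second) would suggest a 10-second slowmode. A human moderator should still review before locking a channel outright, since a fast chat isn't necessarily a hostile one.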

MODERATION CONCERNS

 * Moderation actions can look biased if a moderator is engaging in a
   conversation, disagrees with a user and then needs to moderate them for
   behavior in the same channel.
 * Moderators can be upset by the content of the channel, or the
   opinions/conduct of another user even if no rules are being broken and
   respond inappropriately.
 * Trolls and malicious users will target this channel. People will come in to
   spout stupid or offensive opinions to start arguments. They will also ping
   pong between extremely contentious topics until someone takes the bait. If
   the comments they are making and the general behavior of trying to start a
   debate about anything are allowed on the server, it will be more difficult to
   remove them before they get the chance to be disruptive.
 * Misinformation can spread in the channel. Moderators must have a good
   understanding of current events in order to prevent dangerous misinformation
   or conspiracy theories from proliferating in their communities.
 * If there is a separate team in charge of moderating or overseeing these
   channels, they will need to closely communicate with the rest of the
   moderation team about problem users or concerning behavior.

CONCLUSIONS

Channels focused around contentious topics can provide users with an engaging
space to discuss topics with people from varied backgrounds and explore other
perspectives. These channels also have very specific risks and required
mitigation strategies that will vary depending on the nature of the specific
channel. For example, if you are running a channel on COVID-19, your main concern
will likely be dangerous misinformation and conspiracy theories. If you run a
channel for the 2020 US Presidential Election, your main concern may be things
getting too heated or insult-flinging. You have to consider what the specific
risks in your channel are and ensure that you are writing policies that are
specific to your needs and finding moderators that are knowledgeable and
comfortable with the topics.


CURRENT EVENTS

Current event channels are for a single and specific current event, such as
COVID-19 or mourning a beloved server member. Depending on the topic of the
channel, it may also be a contentious or sensitive topic, but differs because it
is a narrowly focused and usually temporary space. These channels can be useful
to have if a topic either isn’t allowed per your server’s rules, or is allowed
but is overwhelming conversations in other channels (but is otherwise not
something you want to outright ban the discussion of).

CHANNEL NAMES

Channel names for specific current events should be as clear as possible. For
instance, making a channel for the COVID-19 Pandemic and then naming the channel
“Diseases” makes little sense. Instead, you would want to be specific and name
it something like “COVID-19 Pandemic” or “COVID-19”. This ensures that your
users will at-a-glance understand what the channel is for. You should also have
a topic for the channel that helps inform users of its purpose. You may also
want to have something like “Read The Rules” in the topic, so users know to read
any additional rules. These rules could be pinned in the channel, have their own
channel within a category that houses current event channels (if you have
multiple), or be included in the server’s rules. Also keep in mind that users may
not always read the channel topic or pinned messages.

CHANNEL RULES

Channels covering current events should have rules that help promote healthy
discussion of these topics. While each real world event may be different, there
are some baseline rules/guidelines that we believe these channels should have.

 * Do not spread misinformation. A rule like this is incredibly important,
   especially for topics that relate to public safety (such as COVID-19).
 * Keep conversations on-topic. Making sure that conversations do not go too
   off-topic will let others jump in and give their own insight. If a
   conversation becomes too meme-y or off-topic, then it will be harder for
   others to jump in and it could turn the channel into an off-topic channel.
 * Be respectful and try to keep arguments to a minimum. Arguments in these
   kinds of channels will flare up, but it is important as a moderator to ensure
   they do not devolve into name-calling or personal attacks. If an argument
   does flare up, try to ensure that users tackle the arguments and ideas
   presented and not the user that presented them.
 * Encourage others to jump in and give their thoughts. There are usually many
   different viewpoints when it comes to real-world events. Especially ones that
   warrant their own channel. So it is good to encourage those with different
   viewpoints to chime in with their own points of view.

CONCLUSION

Channels like these can be difficult to manage. On one hand, you want things to
be contained and on-topic. On the other hand, you may want to allow for other
kinds of discussion that relate to the topic at-hand. Ultimately it is up to you
to decide how to best implement these channels. Whether the channel is for a
global pandemic, a friend passing away, a game releasing, or anything
in-between, these channels will require a special finesse that other channels
may not. It is our hope that these example rules and channel names can help you
create a space that adheres to a specific topic and creates an atmosphere that
is both respectful and engaging.


FOSTERING HEALTHY COMMUNITIES



While it can be easy to just decide to kick out rulebreakers and let everyone
else do what they want, moderation comes with some extra responsibilities that
lie outside the purview of upholding the rules. The actions you take can (and
will) shape the atmosphere of your community as a whole. Knowing how to maintain
a positive environment is extremely important. Think of a community as a garden
- it will blossom into beautiful colors if nurtured correctly, but it can
quickly shrivel away if you’re only focusing on getting rid of the weeds. This
article will go over how to maintain a healthy community atmosphere.


WHAT DOES “HEALTHY” EVEN MEAN?

The meaning of a “healthy” community differs widely from server to server, and
even from person to person. What’s healthy for one server may not actually be
healthy for another! For example, it wouldn’t be the best idea to run a
wholesome Animal Crossing fan group the same way as a Doom Eternal server.

Despite this, there are still many things that all healthy communities share.
Most notably, they foster meaningful conversations through a positive, welcoming
environment, all while maintaining a balance between fun and safety.

A NOTE ON QUALITY VS. QUANTITY

Many people assume that a community is “healthy” based on how many members it
has and how active it is. While that can be a factor, those numbers alone can’t
describe the quality of a community. The amount of activity may provide some
insight but without looking deeper, there isn’t a way to know for sure. A
massive, 500,000 member server might be flooded with multiple messages per
second, but this provides little information about the quality or atmosphere of
the conversations it contains.

Don’t be discouraged if the server doesn’t have a massive amount of members!
Small communities thrive just as well as large ones, and as a bonus, they’re
easier to moderate. The goal is to maintain a great place for like-minded people
to hang out, not to be the biggest or the most popular.


HOW CAN I TELL IF A COMMUNITY IS HEALTHY?

Many factors that indicate the health of a community can’t easily be put into
numbers or displayed as data, so they may be difficult to understand without
taking the time to really observe your server in an in-depth way.

OBSERVE BEHAVIOR, CONVERSATIONS, AND INTERACTIONS

The very core of a community is based on how people interact with each other.
Observing members and participating in conversations frequently - which you
should already be doing as a moderator - should give you a good idea of the
general atmosphere.

Here are some questions to ask yourself:

 * Are these conversations frequently positive or negative?
 * Are people being respectful to each other and in general?
 * Are these conversations actually meaningful?
 * Do members often complain about things, even other than the server itself?
 * Do members feel included, or do they often struggle to be acknowledged?
 * Are members generally aware of the rules, and do they help each other follow
   them?
 * Do they usually contact moderators if someone is breaking those rules?

It is essential to make sure that conversations have the potential to go deeper
than just “Hi, how are you” and other small talk. Deeper conversations foster
more friendships and make the community a comfortable place that people want to
come back to.

SERVER INSIGHTS

Discord provides all community servers with 500 or more members with an insights
panel, which has tons of valuable stats. The activation, retention, and
engagement charts it provides are awesome indicators of how things are going.
Note that they are based on percentages, not just raw amounts.

If your server is actively growing, these stats (located in the Growth and
Activation tab) will help you understand how new members are interpreting and
reacting to the community.

 * First day activation is when a new user participates or looks around on their
   first day.
 * Retention is when a new user continues to participate (or at least comes
   back) during the next week after they joined.
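Server Insights reports both of these as percentages of a new-member cohort. If you ever export your own join data, the same metrics can be approximated like this (a rough sketch using the definitions above; the field names are made up for illustration, not an Insights export format):

```python
def activation_rate(new_members):
    """Fraction of new members who participated or looked around on day one."""
    if not new_members:
        return 0.0
    return sum(m["active_first_day"] for m in new_members) / len(new_members)

def retention_rate(new_members):
    """Fraction of new members who came back in the week after joining."""
    if not new_members:
        return 0.0
    return sum(m["returned_within_week"] for m in new_members) / len(new_members)

cohort = [
    {"active_first_day": True,  "returned_within_week": True},
    {"active_first_day": True,  "returned_within_week": False},
    {"active_first_day": False, "returned_within_week": True},
    {"active_first_day": False, "returned_within_week": False},
]
# activation_rate(cohort) and retention_rate(cohort) are both 0.5 here
```

Because these are rates rather than raw counts, they stay comparable as your server grows, which is why Insights uses percentages rather than totals.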



The orange lines crossing both charts indicate benchmarks that communities
should strive to surpass. They’re based on data from some of the best servers on
Discord, and are usually a good target no matter how many members your community
has!





CREATING AND MAINTAINING A HEALTHY ATMOSPHERE

Now that you know how to identify community health, it’s important to know how
to grow and maintain that health.

BE AN EXAMPLE FOR OTHERS

People generally notice moderators more easily than other members, especially if
they have a different role color that stands out. They will take note of how you
act, and since you’re in a place of authority, what you do indirectly affects
what they do. This shouldn’t need to be said, but a community cannot be healthy
when there are moderators who are disrespectful, break their own rules, or
encourage toxicity.

GUIDE CONVERSATIONS

Whether it’s bringing up a cool topic or steering away from a dark one, you have
many opportunities to gently guide conversations in meaningful directions. When
a subject is becoming sensitive, argumentative, or just downright wrong, try to
change the topic. If it continues, you should politely ask the people involved
to drop it, or to move to a different channel if one exists for it.

DON’T TOLERATE TOXICITY

Stomping out useless negativity, rudeness, and other toxicity is one of the most
important things moderators must do. This can be easier said than done,
especially if someone is being borderline but not quite breaking any rules. Many
moderators get confused and wonder what they should do in these kinds of
situations, and they may be afraid to do anything, as they think there’s nothing
to justify it. The truth is, if someone is knowingly making others
uncomfortable, even if they aren’t breaking any rules, it’s still the right
thing to take action on them - especially if they’ve already been warned
multiple times.


BALANCING FUN AND FREEDOM WITH SAFETY

Don’t go overboard and just ban sadness, though. Your community members should
be able to be themselves (within reason). Let them joke around, have fun, make
memes, make friends, support each other, develop inside jokes, etc... without
constantly breathing down their neck. Participate with them! While you should
always step in as a moderator as needed, that doesn’t mean that you have to
alienate yourself from the community.

This balance between being strict and lenient is important. There are many
things that must be enforced, but doing so without hurting community health can
be difficult. On the other hand, you don’t want to let your server get out of
control. You should definitely be strict with the most important boundaries
(such as disallowing spam, slurs, and the like), but the rest, depending on the
behavior of the community, are up to you.


WATCH OUT FOR ATTENTION-SEEKING BEHAVIOR

Occasionally, someone may join the server and constantly attempt to redirect all
attention to themselves. They interrupt other conversations with their own,
become overly upset if they’re even slightly ignored, or tire everyone out
(whether intentionally or not). In some cases, they might
frequently tell others that they are depressed. While depression is a real
issue, it can be hard to tell whether someone is simply attention-seeking or if
they’re genuinely depressed.

The following warning signs can help you identify this behavior:

 * They frequently point out that they feel terrible, making sure that everyone
   knows about it.
 * They might not accept help, or they believe that help is useless and won’t
   work.
 * They may force others to be friends with them by threatening to be sad or to
   hurt themselves.

While receiving support from others is great, this kind of relationship is
extremely unhealthy for everyone involved. Consider gently approaching this
person and telling them that they’re not in trouble, but they need professional
support that a Discord server can’t always provide, like a therapist.

If someone shows signs of being suicidal, please spend time with them, and urge
them to contact a hotline. You should also contact Discord, who may be able to
send help their way if you can’t.

 * Suicide prevention hotline (1-800-273-8255)
 * List of other suicide crisis lines


SUMMARY

Community health is a tough thing to get right. It requires a lot of
understanding of your server’s members and what they enjoy (or don’t enjoy)
about being there. The actions you take can influence so much about the general
atmosphere, how welcoming the server is, and whether people want to spend their
time there. As a guardian of the community, your job is not only to kick out
trolls and toxicity, but also to nurture kindness, listen to feedback, and make
sure that everyone is having a good time.


MANAGING EXPONENTIAL MEMBERSHIP GROWTH IN YOUR SERVER



Servers tend to have an ebb and flow in the rate of incoming new users. External
factors affect the rate of new joins for all servers, including holidays, school
breaks, and times of year when Discord itself picks up new users at a higher
rate. There are also times when specific communities will see a rapid increase
in new joins. This can look different for different types of servers.
For verified servers, this could be immediately following the content creator or
game developer that the server is based around plugging the server on Twitter,
hosting an AMA, or hosting events or tournaments within the server; for LGBTQ+
servers, this could be during and preceding Pride Month; and for gaming servers
this can come from new updates being pushed or leaks being revealed.

In addition, any servers that are on Discord’s Server Discovery will potentially
see a huge increase in their join rates depending on their focus and the
keywords they have in Discovery, especially if they are featured on the front
page by Discord. Popup event servers that are being set up temporarily or
servers that are mentioned by people with large followings will also see a huge
influx of users joining very rapidly from whatever avenues the server was
advertised through.  

Growth can be difficult to manage at normal rates, and explosive growth can be
especially difficult. Teams will need to prepare for influxes of users, manage
their entry into the space, ensure that they can integrate with and navigate the
different parts of the community during a more turbulent or hectic time, and
maintain the purpose and mission of the community with minimal disruption.
Throughout all of this, moderators will also have to deal with an increase in
the volume of bad actors joining and the increase in required moderation actions
to keep things running smoothly.  

This article will serve to educate you on how best to handle and prepare for a
significant increase in your server growth over a short period of time, whether
it be for a temporary event space or for an existing server that’s getting a
boost.


INTERNAL COORDINATION

You should check in with your moderation team before large events to ensure they
are still on board and understand that they will be experiencing an increased
demand for their time and attention. Avoiding burnout should be one of your main
goals in evaluating your staffing; if you are already short-staffed or your
staff members are juggling just as much as they can handle, you need to be aware
before you are dealing with thousands of new users.

Doing this is equally important for both long-term and short-term team health.
After an extended period of dealing with the frustrating issues that come with
waves of new users, moderators and other team members may find themselves less
engaged with the community and exhausted in the long run, even if they are
capable of shouldering the short-term stress and increased demand. This can be
mitigated by moving some work to auto-moderation and by bringing in short-term
helpers who can step away to take care of themselves once things settle down.
Using tools like modmail snippets, and setting up text-replacement rules or
custom bot commands that send a template response to common questions, can save
a lot of time and energy.
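The template-response idea can be sketched in a few lines. This is a minimal,
hypothetical example, not tied to any real bot framework; the command names and
reply text are placeholders you would adapt to your own server.

```python
from typing import Optional

# Hypothetical canned responses for common questions.
# Command names and reply text are placeholders, not from any real bot.
TEMPLATES = {
    "!faq": "Please check the pinned FAQ in #welcome before asking.",
    "!invite": "Our permanent invite link is posted in #announcements.",
    "!roles": "React in #get-roles to pick up your roles.",
}

def auto_reply(message: str) -> Optional[str]:
    """Return a template response if the message starts with a known command."""
    if not message.strip():
        return None
    command = message.split(maxsplit=1)[0].lower()
    return TEMPLATES.get(command)
```

A real bot would hook this into its message handler; the point is simply that
one lookup table replaces dozens of hand-typed answers.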

Times of increased workload can also add strain and exacerbate any existing
internal issues. It is important to evaluate whether there are any
important issues that you need to address before preparing for the influx of
users. If your team does not think that the way you are operating is sustainable
for them, that should be handled before any extra stress is added into the mix.


ADVERTISING

Regardless of the kind of community you’re leading, if you are expecting a huge
influx of users, you may want to consider staggering your advertising efforts.
If the company behind your verified game server is planning to put your invite
into the game, on Twitter, Instagram, email, etc., it may be best to launch
these announcements at different times so that there is a more manageable
trickle of new users coming in. Typically, minimizing disruption and keeping
advertising ethical is the name of the game when it comes to moderation, but
there are unique instances where it can be used as a marketing tool to create a
measured amount of chaos with the purpose of building a feeling of hype and
excitement in the server.

If you are gaining an unexpected or potentially unwelcome boost from being
mentioned by a famous person or some other extraordinary situation, it may make
sense to take down your regular advertising, remove your cross-linking between
platforms, disable certain invites, or take any other actions to generally
reduce the flow of new users. This may not help if the vast majority of your new
users are coming from one source that, for example, posted a vanity invite link
that you don’t want to remove, since another server could potentially claim it.


POOLING RESOURCES/GETTING HELP

Unless you have a very large team with a lot of free time, it may be necessary
to reach out for help beyond your moderation team. You can add moderators, or
explore adding temporary helpers and junior moderators with reduced permissions
to assist with the easy-to-tackle, high-volume issues. They can handle incoming
trolls and userbots, or help manage any new-user onboarding process the server
employs, while experienced moderators take care of trickier interpersonal
situations. It can also help to reach out to servers that have experience with
similar events and ask for context-specific advice on preparing. This could
mean sharing auto-moderation settings and configurations, getting bot
recommendations, lending and borrowing staff members, or just sharing any
particular concerns you may have. Moderation hubs can be a useful place to
solicit help and advice, as can servers within your network if you have any
partnerships or servers you collaborate with in some capacity.


AUTO MODERATION

Auto moderation tools can greatly reduce the workload that your moderation team
experiences during a stressful period for your server. It can help in a variety
of ways including but not limited to:

 * Utilizing slow mode in certain busier channels
 * Adding word filters or autoresponders for specific terms related to your
   event
 * Setting timed reminders for commonly broken rules to appear in the server
 * Implementing automatic punishments for certain usernames/nicknames
 * Adding anti-spambot protection
 * Increasing existing punishment flows (i.e., switching to an automatic mute
   where you would previously have issued an automatic warning)
 * Automatically filtering links or media types that can be spammy or are
   flagged for concerning usage
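Two of the items above, word filters and escalating punishment flows, can be
combined into one simple rule. The sketch below is illustrative only: the
blocklist and the warn/mute/ban ladder are example choices, not settings from
any particular moderation bot.

```python
# Example blocklist and escalation ladder; tune both for your community.
BLOCKLIST = {"badword1", "badword2"}
ACTIONS = ["warn", "mute", "ban"]  # first, second, third+ offense

offense_counts = {}  # user_id -> number of prior offenses

def moderate(user_id, message):
    """Return the action to take for this message, or None if it's clean."""
    words = set(message.lower().split())
    if not words & BLOCKLIST:
        return None
    count = offense_counts.get(user_id, 0)
    offense_counts[user_id] = count + 1
    # Escalate with each offense, capping at the most severe action.
    return ACTIONS[min(count, len(ACTIONS) - 1)]
```

During a busy event you could shift the ladder one step, for example starting
at "mute", which is exactly the kind of tightening described above.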

It is important to note that some of the scenarios being covered by this
article, like being featured on the front page of Server Discovery, will cause a
marked increase in the rate of new joins, but will not cause hundreds of people
to join your server at once. If you are preparing for a content creator to
announce a giveaway in their Discord to thousands of people watching their
stream, for example, it may make sense to actually decrease your auto moderation
settings to prevent anti-raid bots from being triggered by real people joining
to participate in a giveaway. If you can, you should ask people who have
experience with similar communities or events exactly how they modify their
auto-moderation settings to most effectively manage the situation you are in.
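The trade-off described above, raid detection versus legitimate surges, comes
down to a threshold on the join rate. Here is a rough sketch of the
sliding-window logic an anti-raid bot might use; the window and threshold
numbers are made up for illustration.

```python
from collections import deque

class JoinRateMonitor:
    """Flag a possible raid when too many joins land inside a time window."""

    def __init__(self, window_seconds=60, threshold=30):
        self.window = window_seconds
        self.threshold = threshold
        self.joins = deque()  # timestamps of recent joins

    def record_join(self, timestamp):
        """Record a join; return True if the recent rate exceeds the threshold."""
        self.joins.append(timestamp)
        # Discard joins that have fallen outside the window.
        while self.joins and timestamp - self.joins[0] > self.window:
            self.joins.popleft()
        return len(self.joins) > self.threshold
```

Before a planned giveaway you would raise the threshold (or disable the check)
so real participants aren’t flagged, then restore it afterward.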


VERIFICATION LEVELS AND GATING

If your server does not utilize verification levels or verification gates, it
may make sense to temporarily increase your server’s security using one or both
of these tools. Verification levels can help you prevent some of the
lower effort trolls and self bot accounts that are targeting your server from
doing any damage. You can access verification levels in the moderation settings
of the server.



The verification settings page describes what each level requires before users
can begin interacting. Increasing the verification level can also generally slow
down the rate of new users speaking in your server. If you raise it to the
highest setting, users without phone-verified accounts may simply leave the
server, and it’s more likely that only high-intent users will stick around and
wait for the verification level to be lowered.

Verification gates can also slow down joins and deter some of the lowest-intent
joiners. People who are unwilling to read instructions or take any additional
steps to join are likely not the users you most want to retain. Gates can allow
you to more easily pick off bad actors before they can disrupt community
members during an already turbulent time. Low-friction gates, like
reaction-based or command-based verification, are probably ideal if you did not
have a gate before preparing for this event. Other types of gating can require
manual responses from moderators and put a lot of additional time requirements
and stress onto a team that is already juggling a lot. If you already have a
manual verification gate, especially an interview-style one, it may make sense
to pull in additional help to manage this system.
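A low-friction command-based gate amounts to very little logic. The sketch
below is hypothetical; a real server would implement it with a bot granting a
role, but the flow is the same: no gate passage, no posting. The command word
is a made-up example.

```python
AGREE_COMMAND = "!agree"  # hypothetical command new members must type

class VerificationGate:
    """Track which users have passed a simple command-based gate."""

    def __init__(self):
        self.verified = set()

    def handle_message(self, user_id, content):
        """Mark the user verified if they typed the agreement command."""
        if content.strip().lower() == AGREE_COMMAND:
            self.verified.add(user_id)
            return True
        return False

    def can_speak(self, user_id):
        return user_id in self.verified
```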


PERMISSIONS

The permissions that you give new members have a large effect on how much damage
bad actors can do. Limiting certain permissions such as global emoji
permissions, media upload permissions, and link embed permissions can prevent
users from bringing the nastiest content they could into your server. 

Depending on how large the influx of new users is, and whether this is a new
server or an existing one, it may make sense to restrict the ability to send
messages to a select group of users or just restrict those permissions only to
staff while you wait for the people joining to acclimate. This can give you time
to assess how many people joined, and whether your setup is prepared to handle
that kind of growth or if you need to do additional preparation before opening
things up. This can also give you time to remove any new users that you already
know will not be contributing members of the group (i.e., people with hateful
profile pictures or usernames) before they get the chance to integrate
themselves into the server.
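One way to think about these restrictions is as a tier that new joiners grow
out of. The following is a schematic sketch, not Discord’s actual permission
model; the permission names and the 24-hour threshold are illustrative.

```python
# Permissions withheld from brand-new members in this sketch.
RESTRICTED = {"embed_links", "attach_files", "use_external_emojis"}

def allowed_permissions(base_perms, hours_in_server, trust_after_hours=24):
    """Return the member's effective permissions as a set of names."""
    if hours_in_server >= trust_after_hours:
        return set(base_perms)
    # New members lose the permissions most often abused by bad actors.
    return set(base_perms) - RESTRICTED
```

In practice you would express the same idea with role-based permission
overwrites, tightening them before the influx and relaxing them once newcomers
have acclimated.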


LANDING PAGES

If your community is seeing an influx of new users who are either new to Discord
or joining for a very specific reason like event participation or large scale
giveaways, it may make sense to set up a new landing page just for these users.
It can get exhausting to explain the same basic information repeatedly no matter
how many volunteers you have involved. Having an additional FAQ that covers
information that new users will be looking for and trims out excess from your
regular FAQ that doesn’t pertain to them can make their integration into the
server and understanding of the information they need more straightforward and
accessible for everyone involved.


SEPARATING USERS

In the most extreme cases, it may make sense to set up channels, or even a
separate server linked to your official accounts/pages on other platforms,
where the low-intent users riding the hype train will be deposited. This
prevents new joins from disrupting your existing community while still giving
the people who are excited a space related to your community, even if that
space is a bit chaotic. If you choose this approach, whether or not you are
upfront about having set your main community to private and standing up a
temporary space for the hype, you should consider explaining the arrangement
through your typical advertising channels as time goes on.

Anything you say during the explosive growth and attention is likely to get
drowned out by other information, and you may have more pressing announcements
to make. Explaining the setup later can help you retain the high-intent members
who only heard about you via the news about your group but are seriously
interested in becoming permanent community members, without the entire second
server flooding into the first. You can advertise your main server in the
secondary server, but it’s advisable to wait until the immediate interest in
the second server has waned. By that point, spammers and other trolls have
likely gotten bored, and the event that spurred the growth is largely in the
past.

It’s important to note that the overflow-server approach demands a significant
increase in the moderator time needed to manage the community. It will likely
require an entirely new team to manage the chaos of the overflow server, on top
of the uptick in discussion in the main server. This is something you should
only consider in truly extraordinary circumstances. The team for the overflow
server likely will
not need as much training, qualification, or people skills since they will
mostly be managing spam/advertising/hate speech over a short but busy period of
time. They won’t likely need to talk to users about how to improve their
behavior or handle tricky situations like long-term harassment or bullying, but
it is still important that they be amicable as they may end up being someone's
only interaction with your community or the subject of your community in their
life. At the end of the event, the server can either be merged with your main
community, or deleted; if it is deleted, members who are particularly interested
in your community will likely seek your space out after losing touch.


CONCLUSIONS

What is best for your server in times of exponential membership growth is up to
you. Luckily, even if you decide not to implement something and later realize
that you need it, it is relatively easy to make changes on short notice, or to
scale back changes if you over-prepared and over-engineered your system. Better
to be over-prepared than under-prepared!

This article should give you an idea of the main concepts to consider and keep
an eye out for while preparing for and working through large increases in your
server’s membership. These considerations do not all need to be implemented to
successfully manage rapid growth, but they should give you a good framework for
the types of changes and preparations you may need, depending on how extreme
the growth you are expecting is. Have fun, and good luck taking on this new
moderation challenge!

