dis.openai.one
154.12.35.206
Public Scan
Submitted URL: http://dis.openai.one/safety/360044159011-what-actions-we-take
Effective URL: https://dis.openai.one/safety/360044159011-what-actions-we-take
Submission: On July 22 via api from US — Scanned from CA
Form analysis
0 forms found in the DOM

Text Content
How We Enforce Rules
May 12, 2022

WHAT ACTIONS WE TAKE

When our Trust & Safety team confirms that there has been a violation of our Community Guidelines, the team takes immediate steps to mitigate the harm and, wherever possible, help users avoid breaking the rules in the future.
The following are actions we may take against users, servers, or both:

* Removing the violating content
* Temporarily limiting access to posting or other Discord features
* Warning users about what rule they broke
* Temporarily suspending users as a "cool-down" period
* Disabling a server's ability to invite new users
* Removing a server from Discord
* Permanently suspending a user from Discord due to severe or repeated violations

Discord also works with law enforcement agencies in cases of immediate danger and/or self-harm. We swiftly report child abuse material and the users responsible to the National Center for Missing and Exploited Children.

Tags: Reporting, Transparency, User Safety