Chat for school leaders: Moderate your chat feature

There are multiple levels of moderation in place to provide a safe and supportive environment in your chat feature. This article describes the chat moderation area of the chat feature and shows you how to customize it to work the way you want for your district or school.

This is a Chat for School Leaders article, helping district and school admins with everything they need to manage chat setups for all Chat Managers and Chat Participants at their district or school. School leaders typically hold either the District Admin or School Admin role.

Not a school leader? Check out Chat for Parents & Guardians or Chat for Teachers for more personalized help!  

Security and privacy in Finalsite Chat

At Finalsite, ensuring your privacy and security is our top priority. Robust privacy settings are in place to ensure that your community's data is protected with care and integrity. Please view our Privacy Policy to learn more, and contact us at privacy@finalsite.com with any questions or concerns. We strive to maintain a safe chat environment for productive engagement between users and teachers.

HIPAA and health security

It is our policy that no protected health information (PHI) subject to the Health Insurance Portability and Accountability Act (HIPAA) be disclosed through our platform. For transmitting medical documents, we recommend the following:

  • Utilize a HIPAA-compliant database to ensure the highest level of security.
  • Request that parents and guardians submit information, including any protected health information, via secure email.

How content moderation works

In Finalsite Chat, moderation happens on several levels: district and school administrators can actively monitor conversations, users can report messages for review, and immediate blocking filters are in place for a wide variety of inappropriate words and scenarios. A rough sketch of how flagging and blocking differ follows the list below.

  • Active Monitoring: Continuous oversight by district and school administrators.
  • Flagging and Blocking: We utilize filters and blocklists to manage different types of content:
    • Flagging for Review: Content flagged by our AI model for toxicity, threats, or inappropriate language is reviewed by administrators.
    • Immediate Blocking: Messages containing profanity are blocked before sending, with the sender notified of the violation.
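To make the difference between flagging and blocking concrete, here is a minimal, hypothetical sketch of that two-step check. The word list, the toxicity score, the threshold, and the function name are illustrative assumptions, not Finalsite's actual moderation code.

```python
# Illustrative sketch only -- not Finalsite's actual moderation code.
# The word list, the toxicity score, and the threshold are placeholders.

PROFANITY_BLOCKLIST = {"badword1", "badword2"}  # stand-in entries
FLAG_THRESHOLD = 0.7                            # assumed review threshold


def moderate_message(text: str, toxicity_score: float) -> str:
    """Classify an outgoing message as 'blocked', 'flagged', or 'allowed'."""
    words = {w.strip(".,!?").lower() for w in text.split()}

    # Immediate blocking: profanity stops the message before it is sent,
    # and the sender is notified of the violation.
    if words & PROFANITY_BLOCKLIST:
        return "blocked"

    # Flagging for review: content the AI model scores as toxic, threatening,
    # or otherwise inappropriate is surfaced to administrators for review.
    if toxicity_score >= FLAG_THRESHOLD:
        return "flagged"

    return "allowed"


# Example: a clean, low-toxicity message goes through untouched.
print(moderate_message("See everyone at the game tonight!", 0.05))  # allowed
```

In practice, the real filters cover far more words and scenarios than this example, and flagged items surface in the Message moderation tab described later in this article.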

Adjust Posting permissions

Want to change who can post messages in a certain chat room? Posting permissions allow you to do just that! A chat room can be set up so that only school moderators can post, which is great for rooms used to post announcements.

Here's how to adjust Posting permissions in a chat room:

Step 1: Open up the chat room whose permissions you want to adjust.

Step 2: Hold down briefly on the room label at the top of the chat room to open up your chat room settings.

Step 3: In the Settings section, tap on Posting permissions.

Step 4: By default, each chat room is set to Everyone. Tap the radio button next to School moderators only. This means that anyone at the Chat Participant level will not be able to post; anyone with the Chat Manager, School Admin, or District Admin role can still post in a chat room set to School moderators only.

Quick Tutorial: Adjust Posting permissions

5 STEPS

1. Want to change who can post messages in a certain chat room? Here's how to manage Posting Permissions in your chat room settings.

Tap to open up the chat room whose permissions you want to adjust.

Step 1 image

2. Hold down briefly on the room label at the top of the chat room to open up your chat room settings.

Step 2 image

3. In the Settings section, tap to open up Posting permissions.

Step 3 image

4. By default, each chat room is set to Everyone. Tap the radio button next to School moderators only.

This means that anyone at the Chat Participant level will not be allowed to post. Only those with higher roles will be allowed to post.

Step 4 image

5. Now you can see it is set to School moderators only. Anytime you need to adjust this, return to your chat room settings.

Now you know how to limit who can post in a certain room by adjusting Posting permissions.

Step 5 image

** Best experienced in Full Screen (click the icon in the top right corner before you begin) **

https://www.iorad.com/player/2428411/Adjust-posting-permissions

The Message moderation tab

In the Chat feature within your Mobile Apps module, you will find a Message moderation tab where you can manage all chat communication that has been reported, reviewed, or blocked.

Message moderation panel image

There are three tabs within Message moderation: Reported, Reviewed, and Blocked. Here are the different actions that can be taken within each area. 

Reported

When any communication is reported, this is where it will first display, along with a number representing how many items are ready to be managed.

Reported tab image

Take the following actions: 

  • Assign a 24-hour chat room block.
  • Unreport an item if made by mistake or overturned.
  • Click the 3-dot menu to view other actions such as: 
    • 24-hour chat room block
    • 24-hour global block
    • Permanent chat room block
    • Permanent global block

Reported actions image

Filter your results

Within each of the three tabs above, you can also filter and narrow down your results for quick and seamless moderation!

  • Click into the All times field to view a calendar and select a range of time to repopulate the results below.  
  • Click All categories to filter by moderation category: Bullying or harassment, inappropriate language, discrimination or hate speech, threats of violence, sharing of personal information, or inappropriate content. 

What is moderated? 

There are 4 areas that are continually moderated:

  • Commercial spam: Spam content is flagged and appears in the moderation panel, where certain roles can decide to block or remove it.
  • Platform circumvention: Efforts to direct members off-platform by requesting personally identifiable information and sensitive data such as phone numbers, usernames, and/or payment details. These messages are flagged and appear in the moderation panel.
  • Toxicity filter: Through the use of AI (artificial intelligence), six different types of harm can be identified in messages: toxicity/profanity, obscenity, threats, insults, identity attacks, and sexually explicit material.
  • Blocklists: Lists of up to 10,000 words that are automatically blocked by moderation and not allowed to appear in messages, as shown in the sketch after this list.
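As a rough illustration of how a blocklist works, the hypothetical sketch below checks a message word by word against the list and blocks it on any match. The word list and function name are placeholders for illustration, not Finalsite's implementation.

```python
# Illustrative blocklist check -- not Finalsite's actual implementation.
# A real blocklist can hold up to 10,000 words; this tiny set is a stand-in.

blocklist = {"placeholderword", "anotherword"}


def is_blocked(message: str, blocked_words: set) -> bool:
    """Return True if any word in the message appears on the blocklist."""
    words = (w.strip(".,!?").lower() for w in message.split())
    return any(w in blocked_words for w in words)


# A message containing a blocklisted word never appears in the chat room.
print(is_blocked("This contains placeholderword.", blocklist))  # True
```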

 
