
Commit 6080a70

committed
chat: moderation introduction
1 parent fae3551 commit 6080a70

File tree

1 file changed: +21 −1 lines changed


content/chat/rooms/moderation/index.textile

@@ -9,4 +9,24 @@ languages:
- kotlin
---

-Moderation content to go here.

Moderation is a crucial feature for chat rooms and online communities to maintain a safe, respectful, and engaging environment for all participants. Moderators help enforce community guidelines and remove potentially harmful content that can drive users away from an online experience.

Moderation strategies can take many forms. Human moderators can participate in the chat room directly, using community guidelines to judge chat content and taking action, such as deleting a message, when it is found to violate those standards. Many modern approaches instead use moderation engines and Artificial Intelligence models, which screen content to filter out harmful messages before they are allowed into the chat room, without the need for human moderators. Many of these engines are highly configurable, allowing you to screen content across multiple categories to suit your needs. Hybrid approaches combine the best of both worlds, employing AI to pre-screen messages, with human moderators making judgement calls on edge cases or responding to user feedback.

Ably Chat supports a variety of moderation options in chat rooms to help you keep your participants safe and engaged.

h2(#types). Types of moderation

Moderation with Ably falls into two flavours: before publish and after publish.

h3(#before-publish). Before publish

When using before publish moderation, a message is reviewed by an automated moderation engine (such as an AI model) before it is published to the chat room. This is helpful in sensitive scenarios, such as schools, where inappropriate content being visible in the chat room for even a second is unacceptable.

This approach provides additional safety guarantees, but may come at the cost of a small amount of latency, as messages must be vetted prior to being published.

h3(#after-publish). After publish

When using after publish moderation, a message is published as normal, but is forwarded to a moderation engine after the fact. This avoids the latency penalty of vetting content prior to publish, at the expense of bad content being visible in the chat room, at least briefly. Many automated moderation solutions can process and delete offending messages within a few seconds of publication.

Note that message deletion is currently performed as a soft delete, meaning that your application will need to filter out any deleted messages that it sees.
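Because deletion is soft, clients typically hide deleted messages when rendering history or live updates. The sketch below shows one way to do this; the message shape it assumes (an @action@ field that becomes @"message.delete"@ once a moderation engine removes the message) is an illustration, not a confirmed API.

```javascript
// Minimal sketch of client-side filtering for soft-deleted messages.
// The message shape here (an "action" field set to "message.delete" once
// a moderation engine removes a message) is an assumption for
// illustration, not a confirmed Ably Chat API.
function visibleMessages(messages) {
  return messages.filter((message) => message.action !== 'message.delete');
}

// Example history containing one soft-deleted message.
const history = [
  { serial: '01', text: 'Hello everyone!', action: 'message.create' },
  { serial: '02', text: '<offending content>', action: 'message.delete' },
  { serial: '03', text: 'Welcome to the room.', action: 'message.create' },
];

console.log(visibleMessages(history).map((message) => message.text));
// → [ 'Hello everyone!', 'Welcome to the room.' ]
```

Applying the filter at render time, rather than mutating stored history, keeps the underlying message list intact in case your application later needs to audit what was removed.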
