Moderation Policy

Grumle uses moderation to protect people and the space.

It is not about control or censorship.

How moderation works

Content is reviewed using:

  • AI-assisted checks
  • human judgement for edge cases
  • final decisions made by humans

Moderation is about safety and calm. AI helps with consistency, but final decisions are always human.
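
As an illustration only, that flow can be sketched as a small pipeline. Everything below is hypothetical: the names ai_flags, human_review and moderate are stand-ins, and the detail that unflagged content publishes directly is an assumption, not a statement of Grumle's actual behaviour.

    def ai_flags(content: str) -> bool:
        """Placeholder for the AI-assisted consistency check."""
        return False  # stub

    def human_review(content: str) -> str:
        """Placeholder for a human moderator's call: 'allow', 'hold' or 'block'."""
        return "allow"  # stub

    def moderate(content: str) -> str:
        # The AI check helps with consistency; anything flagged or
        # unclear goes to a person, who makes the final decision.
        if ai_flags(content):
            return human_review(content)
        return "allow"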

Possible outcomes

Allow

The content is published.

Hold

The content is paused for review or clarification.

This is not a penalty.

Block

The content is not published because it clearly breaks the rules.

This is rare and non-negotiable.
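
For illustration, the three outcomes map naturally onto a small, closed set of states. The names below are hypothetical, not taken from Grumle's code.

    from enum import Enum

    class Outcome(Enum):
        """The three moderation outcomes described above (illustrative names)."""
        ALLOW = "allow"  # published
        HOLD = "hold"    # paused for review or clarification; not a penalty
        BLOCK = "block"  # not published; a clear rule break, and rare

Modelling Hold as its own state, rather than as a soft Block, matches the text: held content can still end in Allow.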

What moderation looks for

Moderation focuses on:

  • privacy breaches
  • harm to others
  • safety concerns
  • harassment or intimidation

Strong emotions are allowed.

Attacks on people are not.

Moderation also looks for attempts to move conversations off the platform, share unsafe links, or organise external contact.
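
Purely as a sketch, the focus areas above can be written down as data. The category names and the trivial lookup are assumptions, not Grumle's actual schema.

    # Illustrative only: the focus areas as a fixed set of categories.
    REVIEW_CATEGORIES = {
        "privacy_breach",
        "harm_to_others",
        "safety_concern",
        "harassment_or_intimidation",
        "off_platform_contact",  # moving conversations off the platform
        "unsafe_link",
    }

    def triggered(signals: set[str]) -> set[str]:
        """Which review categories a piece of content touches."""
        return signals & REVIEW_CATEGORIES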

Crisis and safety

Grumle is not a crisis service.

If content suggests immediate danger, it may be paused and support resources shared.

This is done for care, not discipline.
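
A minimal sketch of that path, assuming a hypothetical handle_possible_crisis helper; the specific resources shared are deliberately not listed here.

    SUPPORT_RESOURCES = ["..."]  # placeholder; real resources intentionally omitted

    def handle_possible_crisis(content_id: str) -> dict:
        """Pause the content and share support resources."""
        return {
            "content": content_id,
            "outcome": "hold",           # paused, not penalised
            "share": SUPPORT_RESOURCES,  # offered as care, not discipline
        }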

Transparency

Rules are public.

Decisions are explainable.

Appeals are possible.

The aim is fairness and safety, not perfection.