Roro Momo — Child Sexual Abuse & Exploitation (CSAE) Safety Standards

Last updated: November 30, 2025

1. Purpose

Roro Momo is committed to protecting children and young people from sexual abuse and exploitation. These standards explain the measures we take to prevent CSAE, how we respond to reports, and how we cooperate with law enforcement and child-protection organizations.

2. Scope

This policy applies to:

  • All user-generated content (images, videos, memes, text, private messages, profiles).

  • All users, contractors, moderators, and third-party services used by Roro Momo.

  • All interactive features (comments, direct messages, live-streams).

3. Zero-tolerance principle

Roro Momo has zero tolerance for content that sexualizes, exploits, or abuses children or young persons (anyone under the age of 18). Such content will be removed immediately and may be referred to law enforcement.

4. Definitions

  • Child / Young Person: anyone under 18 years old.

  • CSAE content: any depiction, request, or facilitation of sexual activity involving a child; sexualized images of a child; or material used to groom, exploit, or traffic minors.

  • CSAM: Child Sexual Abuse Material — images, videos, or audio depicting the sexual abuse of a child.

5. Proactive prevention measures

We implement the following technical and operational measures:

  • Content filters and machine-learning models to flag probable CSAE content for human review.

  • Known-hash detection: we check uploaded media against databases of known CSAE/harmful content hashes (where available) to block and escalate matches; a minimal sketch of this check appears after this list.

  • Upload restrictions: image/video uploads in certain areas are restricted or must pass automated checks before publishing.

  • Age-gating: features that could expose minors to adult content are blocked or require stricter verification.

  • Rate limits and behavioral signals to detect grooming patterns (e.g., excessive messaging or bulk friend requests directed at minors) and surface them for review.
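
A minimal sketch of the known-hash check described above, assuming a vendor-supplied set of known-bad digests. The names KNOWN_BAD_HASHES and check_upload are illustrative, not part of Roro Momo's actual systems, and production systems typically use perceptual hashing rather than exact cryptographic hashes:

    import hashlib

    # Illustrative in-memory blocklist of known-bad SHA-256 digests.
    # A real deployment would query a vendor-maintained hash database.
    KNOWN_BAD_HASHES: set[str] = set()

    def check_upload(media_bytes: bytes) -> str:
        """Return a moderation decision for an uploaded file."""
        digest = hashlib.sha256(media_bytes).hexdigest()
        if digest in KNOWN_BAD_HASHES:
            # Exact match against a known hash: block and escalate.
            return "block_and_escalate"
        # No match: the upload still passes through the other automated checks.
        return "allow_pending_review"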

6. Reporting, review, and takedown workflow

  1. User reporting: Any user can report content or accounts via the in-app “Report” button or by emailing info@roromomo.com.

  2. Automated flags: Content flagged automatically is placed into a priority review queue (a minimal sketch of such a queue follows this list).

  3. Human review: Trained moderators review all CSAE reports and automated matches, around the clock where staffing allows, following clear triage guidelines.

  4. Immediate removal: Content that meets our CSAE criteria is removed immediately from public view pending further action.

  5. Account actions: Accounts involved are suspended or terminated depending on severity, from temporary suspension up to a permanent ban.

  6. Evidence preservation: When removing content, we preserve metadata and non-public copies as required for lawful requests and investigations.

  7. Law enforcement referral: If content suggests a child is being exploited, we promptly refer the case to the appropriate law enforcement agency and, where applicable, to recognized hotlines/child-protection organizations.

  8. Notification: Where legally permitted, we notify the reporting user about actions taken.
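
A minimal sketch of how a priority review queue could order flagged items, assuming three illustrative flag sources. The tiers and the names enqueue and next_for_review are hypothetical, not Roro Momo's actual triage implementation:

    import heapq
    import itertools

    # Lower number = higher priority; these tiers are illustrative only.
    PRIORITY = {"hash_match": 0, "ml_flag": 1, "user_report": 2}

    _counter = itertools.count()  # tie-breaker: preserves FIFO order within a tier
    _queue: list[tuple[int, int, dict]] = []

    def enqueue(report: dict, source: str) -> None:
        """Place a flagged item into the review queue by source priority."""
        heapq.heappush(_queue, (PRIORITY[source], next(_counter), report))

    def next_for_review() -> dict | None:
        """Pop the highest-priority item for a human moderator, or None if empty."""
        return heapq.heappop(_queue)[2] if _queue else None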

7. Cooperation with authorities & hotlines

We cooperate with law enforcement and child protection NGOs. Where applicable, we will:

  • Respond to lawful requests and emergency preservation requests.

  • Refer suspected CSAM to specialized reporting centers and hotlines (for example, national child-protection hotlines; in the U.S., the National Center for Missing & Exploited Children (NCMEC)).

  • Work with international bodies and trusted third-party networks for cross-border cases.

8. Data retention & evidence handling

  • We retain removed content and relevant metadata for a limited period to support investigations and lawful requests. Retention follows legal requirements and our internal retention schedule.

  • Access to preserved data is logged and restricted to authorized personnel only; a minimal sketch of such gated, audited access follows.
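
A minimal sketch of logged, role-restricted access to preserved evidence. The role list, role names, and the function access_preserved_item are hypothetical illustrations, not Roro Momo's actual access-control system:

    import logging
    from datetime import datetime, timezone

    audit_log = logging.getLogger("evidence_audit")

    # Illustrative role allow-list; a real system would use a proper IAM service.
    AUTHORIZED_ROLES = {"trust_and_safety_lead", "legal_counsel"}

    def access_preserved_item(item_id: str, requester_role: str) -> bool:
        """Gate access to a preserved evidence item and write an audit entry."""
        allowed = requester_role in AUTHORIZED_ROLES
        # Every attempt is logged, whether or not it is allowed.
        audit_log.info(
            "evidence_access item=%s role=%s allowed=%s at=%s",
            item_id, requester_role, allowed,
            datetime.now(timezone.utc).isoformat(),
        )
        return allowed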

9. Moderator training & safety staffing

  • Moderators undergo regular training on CSAE recognition, trauma-informed review, legal obligations, and safe-handling of sensitive material.

  • Moderators have access to support, including counseling resources, because reviewing CSAE content can be traumatic.

10. User safety features & education

  • Clear reporting tools on every post/profile.

  • Safety tips and guidance for parents and young users, including how to report and block users.

  • Age-appropriate design and moderation for spaces likely to attract minors.

11. Appeals & transparency

  • Users can appeal moderation decisions through our appeals process; contact info@roromomo.com.

  • We publish transparency reports twice a year, summarizing takedowns, law-enforcement requests, and safety improvements.

12. Third-party partners and providers

  • We require third-party vendors and integrations (CDNs, moderation tools, analytics) to comply with our CSAE policy and applicable law.

  • Contracts require rapid cooperation in investigations and data preservation.

13. Legal compliance

We comply with applicable laws related to CSAE, mandated reporting, and data preservation. Where local law requires different handling, we follow the stricter standard that best protects children.

14. Contact & reporting

  • In-app reporting: use the Report button on content or profiles.

  • Email (safety team): info@roromomo.com

  • Emergency / law enforcement matters: contact your local authorities and then email us the case/reference number so we can assist.

  • Publishing & transparency inquiries: info@roromomo.com

15. How to report suspected CSAE to us (quick)

  1. Click Report on the content/profile.

  2. Provide the post URL, user handle, and a short description.

  3. If it’s an immediate danger to a child, contact local law enforcement first, then notify us.

16. Review & update

We review this policy at least annually, or sooner when regulations change. The date of the last update is shown at the top of this page.