Child Safety and CSAE Policy

Last Updated: March 18, 2026

1. Our Commitment to Child Safety

Hivemind is deeply committed to protecting children from child sexual abuse and exploitation (CSAE). We maintain zero tolerance for child sexual abuse material (CSAM), child exploitation, and any content that sexualizes, grooms, or endangers minors. This policy reflects our values and our legal obligations under applicable law, including federal CSAM reporting requirements.

We recognize our responsibility to create a safe environment for all users and take proactive measures to detect, prevent, and report child exploitation.

2. Prohibited Content and Conduct

We strictly prohibit any content or conduct that exploits, endangers, or sexualizes minors, including but not limited to:

  • Child sexual abuse material (CSAM) in any form
  • Grooming or solicitation of minors
  • Sexually explicit content involving minors
  • Child trafficking or exploitation content
  • Links to external CSAE material
  • Coordination or facilitation of child exploitation
  • Any content designed to normalize or promote child sexual abuse

Any user who posts, shares, or distributes such content will be permanently banned, and appropriate legal authorities will be notified immediately.

3. Age Verification and Service Restrictions

Age Requirement: Our Service is restricted to users 18 years of age and older. By using Hivemind, you represent and warrant that you are at least 18 years old.

Identity Verification: All users who wish to post content must complete identity verification through our third-party verification service to confirm their age and identity. This verification requirement serves as an additional safeguard against underage access to posting features.

We employ technical and operational measures to prevent minors from accessing our posting features.

4. Detection and Reporting of CSAE

4.1 Automated Detection Systems

Hivemind employs multiple automated detection systems to identify potential CSAE content:

  • Image Recognition Technology: All images uploaded to the platform are scanned using industry-standard CSAM detection technology before they are posted. This technology is designed to identify known CSAM and similar imagery.
  • Content Filtering: Automated systems scan text content for language and patterns consistent with grooming, child exploitation, or CSAM-related activity.
  • User Behavior Analysis: Our systems monitor for behavioral patterns indicative of child exploitation activity, such as attempts to solicit contact information from users or coordinate offline meetings with minors.
  • Hash-Based Matching: We use industry hash-matching technologies such as PhotoDNA to compare uploaded images against hash lists of known CSAM maintained by the National Center for Missing and Exploited Children (NCMEC).
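Hash-based matching works by reducing each image to a compact perceptual fingerprint and comparing it against a blocklist of fingerprints derived from known material, so near-duplicates match even after re-encoding. The following is a rough illustration only: PhotoDNA itself is proprietary, and the `average_hash` function, the 8x8 grayscale grid, and the distance threshold below are simplified stand-ins, not the technology this policy describes.

```python
# Simplified sketch of perceptual hash matching (illustrative only; real
# systems such as PhotoDNA use far more robust proprietary fingerprints).

def average_hash(pixels):
    """pixels: an 8x8 grid of 0-255 grayscale values -> 64-bit integer hash.
    Each bit records whether that pixel is at or above the grid's mean."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= avg else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, threshold=5):
    """True if the image's hash is within `threshold` bits of any known hash."""
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocklist)
```

A small Hamming-distance threshold (rather than exact equality) is what lets hash matching tolerate minor alterations such as recompression or resizing.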

4.2 User Reporting

Users can report suspected CSAE content directly through the app:

  • Each post and comment includes a "Report" button accessible from the context menu
  • Users can select "Child Exploitation" or related categories when reporting
  • Reports are prioritized and reviewed by our moderation team within 24 hours
  • Users can report accounts suspected of grooming or exploitation through their profile page

4.3 Legal Reporting Obligations

Hivemind complies with all applicable legal reporting requirements:

  • NCMEC CyberTipline: When CSAM is detected or reported, we immediately file a report with NCMEC's CyberTipline.
  • Law Enforcement: In addition to NCMEC reporting, we report suspected CSAE to relevant law enforcement agencies, including the FBI and local authorities when appropriate.
  • Legal Preservation: When we become aware of CSAE, we preserve all relevant evidence and data for law enforcement investigation.
  • International Cooperation: For suspected CSAE occurring in other jurisdictions, we report to appropriate international authorities and INTERPOL as required by law.

4.4 Immediate Content Removal

Upon detection or confirmation of CSAE content:

  • Content is removed from public view within hours (or immediately in emergency cases)
  • The content is archived in a restricted state, inaccessible to all users
  • All metadata and copies are preserved for law enforcement
  • The account associated with the content faces immediate suspension and investigation

5. Account Suspension and Termination

Accounts engaged in CSAE will be:

  • Permanently Terminated: Without warning or opportunity for appeal
  • Reported to Authorities: Information forwarded to law enforcement and NCMEC
  • Device Banned: The user's device and associated accounts may be banned from the platform
  • IP Blocked: The user's IP address may be blocked from accessing our service

We maintain records of terminated accounts for law enforcement purposes and to prevent re-registration.

6. Moderation Team Training

Our content moderation team receives specialized training on:

  • Recognizing CSAM and grooming behavior
  • Appropriate response protocols to suspected exploitation
  • Trauma-informed handling of sensitive content
  • Legal obligations and reporting requirements
  • Evidence preservation procedures
  • Supporting reporting users and managing emotional impact

All moderators complete annual refresher training on child safety policies and procedures.

7. Cooperation with Law Enforcement

Hivemind actively cooperates with law enforcement and child safety organizations:

  • NCMEC Relationship: We maintain an active reporting relationship with NCMEC
  • FBI Partnership: We cooperate with the FBI's Internet Crime Complaint Center and criminal investigations
  • Legal Requests: We respond promptly to lawful law enforcement requests for information, subpoenas, and search warrants
  • Data Preservation: We preserve data for 180 days or longer when requested by law enforcement or required by law
  • Expert Testimony: Our team is available to provide expert testimony regarding CSAE cases when required

8. Technical Safeguards

We have implemented the following technical measures to prevent CSAE:

  • No End-to-End Encryption on Private Messages: Private messages are not end-to-end encrypted, which allows our abuse-detection systems to scan them and enables cooperation with law enforcement
  • Rate Limiting: We limit rapid account creation and access attempts to prevent automated exploitation
  • File Type Restrictions: We restrict file uploads to safe formats and scan all files for malicious or exploit code
  • Metadata Removal: User-uploaded images are stripped of metadata to prevent location tracking and harassment
  • Ban Evasion Prevention: We detect and link related accounts so that users cannot create multiple accounts to evade bans or restrictions
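As a concrete illustration of one safeguard above, metadata removal for JPEG uploads can be implemented by dropping the APPn and COM marker segments, which is where EXIF data (including GPS coordinates) lives. This is a minimal, stdlib-only sketch under simplified assumptions, not Hivemind's actual pipeline; the function name and segment choices are illustrative.

```python
# Illustrative sketch: strip metadata-bearing segments (APP1-APP15, COM)
# from a JPEG byte stream, keeping APP0/JFIF and image data intact.
# Simplified: assumes no standalone markers (e.g. RSTn) appear before SOS.

def strip_jpeg_metadata(data: bytes) -> bytes:
    if data[:2] != b"\xff\xd8":  # SOI marker
        raise ValueError("not a JPEG")
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(data):
        if data[i] != 0xFF:
            out += data[i:]  # unexpected byte; copy remainder verbatim
            break
        marker = data[i + 1]
        if marker == 0xDA:  # SOS: entropy-coded image data follows
            out += data[i:]
            break
        length = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i:i + 2 + length]
        # Drop APP1..APP15 (0xE1-0xEF, carries EXIF/XMP) and COM (0xFE).
        if not (0xE1 <= marker <= 0xEF or marker == 0xFE):
            out += segment
        i += 2 + length
    return bytes(out)
```

Stripping happens server-side on upload, so location data never reaches storage even if a client fails to remove it.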

9. How to Report CSAE

In the App: Use the Report button on any post, comment, or account and select "Child Exploitation"

Direct Report to Hivemind: Email reports to safety@gethivemind.net with details and any available screenshots

Anonymous Report: Contact the National Center for Missing and Exploited Children CyberTipline at www.cybertipline.org

Emergency: If you witness a child in immediate danger, contact local law enforcement or the FBI immediately

10. Support Resources

For individuals concerned about exploitation or grooming, support is available through the NCMEC CyberTipline and the NCMEC hotline at 1-800-THE-LOST (1-800-843-5678).

11. Policy Updates

Hivemind regularly updates this policy to reflect evolving threats, new technologies, and legal requirements. We will notify users of material changes through the app and on this page. Your continued use of the Service constitutes acceptance of any updates to this policy.

12. Contact

For questions about this Child Safety Policy, please contact safety@gethivemind.net.

For urgent CSAE reports, email safety@gethivemind.net with "URGENT: CSAE REPORT" in the subject line.