Child Safety Standards
Our commitment
Agyata has zero tolerance for child sexual abuse and exploitation (CSAE), including child sexual abuse material (CSAM), grooming, solicitation of minors, and any content that sexualises or endangers anyone under 18. This page sets out the standards we hold ourselves to, the technical controls that enforce them, and how to reach us when something is wrong. We publish this document because Google Play requires it of every app in our category; we also believe that operating a random-chat product places a heightened duty of care on the operator, and that the best way to honour that duty is to be explicit about what we do.
Agyata is 18+ only
Agyata is strictly for adults. On first visit, the session is blocked by an 18+ age gate that the user must affirmatively confirm before any feature is usable. Age confirmation is recorded with a hashed IP and timestamp as compliance evidence. We do not have, and never plan to have, a separate experience for minors. If we learn a user is under 18, we terminate the session, remove any content they posted, and apply a permanent ban. If you believe a user on Agyata is under 18, email safety@agyata.com immediately — that address is monitored 24 hours a day.
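As a minimal sketch of the compliance record described above — storing only a hash of the IP, never the raw address, alongside the confirmation timestamp — the following illustrates the idea. The function and field names are assumptions for illustration, not the actual implementation:

```typescript
import { createHash } from "node:crypto";

// Hypothetical shape of the age-gate compliance record:
// a hashed IP (never the raw address) plus a confirmation timestamp.
interface AgeGateRecord {
  ipHash: string;      // SHA-256 of the client IP, hex-encoded
  confirmedAt: string; // ISO-8601 timestamp of the affirmative confirmation
}

// Record an 18+ confirmation without retaining the raw IP.
function recordAgeConfirmation(clientIp: string, now: Date = new Date()): AgeGateRecord {
  const ipHash = createHash("sha256").update(clientIp).digest("hex");
  return { ipHash, confirmedAt: now.toISOString() };
}
```

Storing only the hash means the record can later prove that a confirmation occurred from a given address without the operator retaining the address itself.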
Prevention — how we make CSAE less likely
No accounts, no profiles, no avatars: there is no persistent identity for a minor to impersonate or a predator to cultivate. No file uploads anywhere in the app; there is no surface to share images, videos, or documents outside a live video call. No contact list, address book, or location sharing. No direct-message inbox outside an active random pairing; chats exist only while both parties are live in a room, and evaporate when either leaves. Strangers cannot select a target — pairings are random and mode-locked, and the matchmaker DO does not accept preference parameters. These are product choices, not settings.
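The pairing property above — random, mode-locked, no preference parameters — can be sketched as follows. This is an illustrative toy, assuming sessions simply queue per mode; the class and method names are hypothetical and do not reflect the actual matchmaker's code:

```typescript
// Sketch of a preference-free matchmaker: callers can choose a mode,
// never a partner. Names are illustrative only.
type Mode = "text" | "video";

class Matchmaker {
  private waiting: Map<Mode, string[]> = new Map([["text", []], ["video", []]]);

  // Enqueue a session; return a paired session id if one is waiting.
  // Deliberately no preference parameters: strangers cannot select a target.
  enqueue(sessionId: string, mode: Mode): string | null {
    const queue = this.waiting.get(mode)!;
    if (queue.length > 0) {
      // Pick a random waiting session, not the caller's choice.
      const i = Math.floor(Math.random() * queue.length);
      return queue.splice(i, 1)[0];
    }
    queue.push(sessionId);
    return null;
  }
}
```

Because the only input is the mode, there is no API surface through which a caller could steer who they are paired with.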
Detection — how we find CSAE content and behaviour
Every confession post is screened before publication by Meta LlamaGuard 2 plus a rule-based classifier that flags self-harm, sexual content involving minors, grooming-style language, and solicitation patterns. Posts that match a hard rule are rejected outright; borderline posts are soft-hidden pending human review. Every text-chat message is screened the same way in-room; three soft-hides in a session force-end the room and auto-file a report. Video chat is peer-to-peer, so real-time CSAM screening happens on the device: when a user reports a frame, the browser computes perceptual hashes (PHashes) of the last 4 seconds of the remote video locally and sends only those hashes to our servers. The hashes are matched against a CSAM hash database; a match triggers the P0 response flow below. Cloudflare CSAM Scanning is enabled zone-wide as a further safety net on any cached content.
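To make the hashing step concrete, here is a simple average-hash (aHash) sketch over an 8×8 grayscale downsample, with a Hamming-distance comparison. This is an assumption-laden illustration of the general perceptual-hashing technique, not the actual contents of apps/web/src/phash.ts; only hashes, never pixels, would leave the device:

```typescript
// Illustrative average hash (aHash): one bit per pixel, set when the
// pixel is brighter than the frame mean. Real perceptual hashes vary.
function averageHash(gray8x8: number[]): bigint {
  // gray8x8: 64 grayscale values (0–255), already downsampled from a frame.
  if (gray8x8.length !== 64) throw new Error("expected 64 pixels");
  const mean = gray8x8.reduce((a, b) => a + b, 0) / 64;
  let hash = 0n;
  for (const v of gray8x8) {
    hash = (hash << 1n) | (v > mean ? 1n : 0n);
  }
  return hash;
}

// Hamming distance between two 64-bit hashes; a small distance means
// the frames are perceptually similar, even if not byte-identical.
function hammingDistance(a: bigint, b: bigint): number {
  let x = a ^ b, count = 0;
  while (x > 0n) { count += Number(x & 1n); x >>= 1n; }
  return count;
}
```

The design point is that a perceptual hash tolerates re-encoding and minor pixel noise: near-duplicate frames land a small Hamming distance apart, so database matching works even when the bytes differ.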
Response — what happens when CSAE is detected
Verified CSAM matches trigger a P0 incident: the originating session is banned permanently across session id and hashed IP; device fingerprint evasion is tracked via the ban registry (documented in docs/RUNBOOK.md §2). Evidence (the PHash match set, not any media) is preserved in a dedicated R2 bucket. Within 5 minutes, ops receives an email alert at alerts@agyata.com and a phone page via BetterStack. Within 24 hours, a CyberTipline report is filed with NCMEC via Cloudflare's built-in integration; for non-US incidents the equivalent national authority is notified (IWF in the UK, the Indian Cyber Crime Portal / POCSO channel for India). The offending content is removed immediately; session bans are immediate and non-appealable. We cooperate fully with law-enforcement preservation requests.
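The dual-key ban described above — blocking on either the session id or the hashed IP, so a fresh session from the same address is still caught — can be sketched like this. The class and storage shape are assumptions for illustration; the shipping registry in services/api-gateway/src/bans.ts may look quite different:

```typescript
import { createHash } from "node:crypto";

// Illustrative ban registry keyed on both session id and hashed IP.
class BanRegistry {
  private banned = new Set<string>();

  ban(sessionId: string, clientIp: string): void {
    this.banned.add(`sid:${sessionId}`);
    this.banned.add(`ip:${createHash("sha256").update(clientIp).digest("hex")}`);
  }

  // A session is blocked if either its id or its hashed IP is listed,
  // so opening a new session from the same address does not evade the ban.
  isBanned(sessionId: string, clientIp: string): boolean {
    const ipKey = `ip:${createHash("sha256").update(clientIp).digest("hex")}`;
    return this.banned.has(`sid:${sessionId}`) || this.banned.has(ipKey);
  }
}
```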
In-app reporting — how users report CSAE
Every user-to-user interaction has a report control reachable in under two taps. On confession cards the report button is on each post. On text chat there is a flag icon in the controls bar above the composer. On video chat there is a flag icon inside the floating controls pill, available at any point during the call — including in the first moments after the remote video appears, when the user may have only a second to react. The report form accepts a reason (harassment, self-harm, sexual, minors, spam, other) and an optional free-text note; for video reports the browser automatically attaches PHashes of the last 4 seconds of the remote feed. We read every report within 24 hours; minors-related reports are escalated immediately regardless of queue depth.
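A hedged sketch of the report payload such a form might submit, and of the escalation rule for minors-related reports; the field names here are assumptions, not the actual API contract:

```typescript
// Illustrative report payload; field names are hypothetical.
type ReportReason = "harassment" | "self-harm" | "sexual" | "minors" | "spam" | "other";

interface ReportPayload {
  reason: ReportReason;
  note?: string;          // optional free-text note
  frameHashes?: string[]; // video reports only: PHashes of the last 4 seconds
}

// Minors-related reports jump the queue regardless of depth.
function isEscalated(report: ReportPayload): boolean {
  return report.reason === "minors";
}
```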
Compliance with law
United States: we report verified CSAM to the National Center for Missing & Exploited Children (NCMEC) via the CyberTipline, in accordance with 18 U.S.C. §2258A. United Kingdom: we accept reports from and cooperate with the Internet Watch Foundation (IWF). European Union: we comply with the Digital Services Act, including its CSAM-related articles and, where our size triggers them, its independent audit obligations. India (our operating jurisdiction): we comply with the Protection of Children from Sexual Offences Act, 2012 (POCSO), the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, and related regulations; we maintain a designated Grievance Officer whose contact details are on our Contact page at /contact. Where national law requires it, we preserve data relevant to a reported incident until the applicable investigation concludes.
Contact for law enforcement and child safety organisations
Law-enforcement preservation requests, subpoenas, and child-safety organisation inquiries: le@agyata.com (monitored; 24-hour response target). Legal escalation: legal@agyata.com. User reports of suspected child-safety violations on Agyata: safety@agyata.com. Our designated child-safety contact (also our Grievance Officer under the Indian IT Rules, 2021): Agyata Technologies Pvt Ltd, Telangana, India. We do not charge any fee for law-enforcement requests relating to child safety.
Technical controls referenced on this page
The specific implementations described above live in the following parts of our codebase and runbook: (a) 18+ age gate — the /session/age endpoint in services/api-gateway; (b) LlamaGuard-based text moderation — services/moderation-svc; (c) frame-PHash reporter — apps/web/src/phash.ts and the /moderate/frame-hash endpoint; (d) session + IP ban registry — services/api-gateway/src/bans.ts; (e) P0 CSAM response flow — docs/RUNBOOK.md §2. Our internal code remains internal; these references exist so a Play reviewer can verify that the controls named above correspond to actual shipping code, not to marketing claims.
Changes to these standards
These standards change when the underlying product or law changes — never in isolation. Material changes are versioned at the top of this page and announced to all active sessions via a site-wide banner for at least seven days before taking effect. Historical versions of this document are available by emailing safety@agyata.com.
Contact
Child-safety concerns: safety@agyata.com. Law-enforcement: le@agyata.com. Legal: legal@agyata.com. General: hello@agyata.com.