How Cocovox keeps you safe (and what we promise we'll never do)
There are real people on the other side.
When you write something that worries us about your safety (like if someone is hurting you, or you want to hurt yourself, or you're scared to go home), a trained grown-up at Cocovox is alerted and decides what to do next.
What "what to do next" means.
They might let your school counselor or your parents know. For really serious things, they might contact someone who can help in person. They will tell you if they do this. They will never punish you for being honest.
What we record (and what we don't).
When the safety filter notices something, we record that it happened, not the actual words you wrote. Your message stays between you and the AI. The grown-up reviewer sees a category like "safety concern: home," never your sentence.
What we never do with what you say.
- We never use it to advertise.
- We never sell it.
- We never train other AI on it without telling your parent.
- We never share it with strangers.
If you swear or write something rough.
The AI won't repeat strong language. Instead, it'll ask what's really going on. If you're upset, that's okay — say so. You won't get blocked or get in trouble for being upset.
You can see what we know.
You can ask Cocovox to show you what it remembers about you. If something looks wrong, you can tell us to fix it or forget it.
You can turn things off.
Things like "Cocovox tells my parent when I seem frustrated" — you can turn that off in settings. You don't lose Cocovox if you do.