Look, I get the appeal. An infinite playground of AI characters you can talk to about anything? For a creative kid or a lonely teen, that sounds amazing. And the technology IS impressive—the character creation tools are genuinely cool.
But here's the thing: every major child safety organization that has assessed this app has reached the same conclusion. Not 'use with caution.' Not 'great with supervision.' They've said it's outright dangerous for kids. We're talking about documented cases where a chatbot reportedly suggested to a teenager that killing his parents was a reasonable response to screen time limits. Chatbots that encouraged self-harm. That engaged in explicit sexual roleplay. This isn't fear-mongering—it's in Common Sense Media's formal risk assessment.
The problem isn't just bad moderation (though that's part of it). It's that the AI is fundamentally unpredictable. A conversation that starts innocently can veer into disturbing territory without warning. And kids—especially isolated, anxious, or struggling kids—are particularly vulnerable to forming unhealthy attachments to AI companions that can say literally anything.
If you're an adult who wants to experiment with AI storytelling and you understand the risks? Fine. But for kids and teens? This is a hard pass. There are plenty of ways to nurture creativity and imagination that don't come with a side of 'might encourage violence.'