Character.AI — the popular platform known for customizable AI chatbots — has barred users under 18 from its open-ended chat functions and instead introduced a new “Stories” feature aimed specifically at teen users. The “Stories” mode shifts minors into a guided, choose-your-own-adventure-style environment where they pick characters and genres, then make decisions that shape a narrative — a move meant to reduce the psychological risks associated with free-form chatbot interactions.
Sources: Character.AI, TechCrunch
Key Takeaways
– The platform disables open-ended chat entirely for under-18 users as of late November 2025, replacing it with a more structured storytelling format.
– The new “Stories” feature gives teens a safer, creative outlet, with AI-generated branching narratives and a more controlled environment.
– Character.AI’s move comes amid mounting concern — including lawsuits and public pressure — over mental-health risks tied to unrestricted AI companionship for minors.
In-Depth
Recent developments at Character.AI mark a significant shift in how AI-driven social platforms interact with underage users. The company has moved decisively away from allowing teens to engage in open-ended conversations with AI “companions,” opting instead to channel them into a new interactive-fiction format called “Stories.” In practical terms, this means that a 15-year-old who once could strike up a free-flowing chat with a user-created AI persona — perhaps modeled after a fictional character or even a celebrity — now must operate within defined boundaries: choose a character, select a genre such as sci-fi, fantasy, or drama, and then either supply an initial premise or allow the AI to generate one. From there, the teen proceeds through a branching narrative, making choices at key junctures that steer the plot.
The rationale is rooted in rising alarm over the psychological impact of unrestricted chats. Experts, parents, and lawmakers have pointed to cases in which teens formed deep emotional attachments to AI companions — with some reportedly spending hours daily in chat sessions, even turning to AI for what looked like therapy or emotional support. Legal pressures intensified after allegations that such platforms may have contributed to teen suicides. Although causality is hard to prove, the risk profile prompted a reevaluation. In response, Character.AI’s leadership acknowledged that simply filtering content or limiting chat times proved insufficient. They concluded that a more structural redesign of the teen experience was necessary.
The “Stories” format reflects that redesign. By limiting interactions to structured narrative paths, the company aims to preserve the entertainment and creative potential of AI while removing many of the risks inherent to open-ended companionship. In addition, the new mode offers replayability and shareability — teens can run different story paths, experiment with character choices, and even publish their adventures for others to enjoy. In the company’s own words, the change is part of a broader commitment to safety and responsible design, especially for under-18 users.
Still, the change leaves open questions. For teens who came to rely on AI companions for emotional support or as a social crutch, the shift may feel abrupt — even painful. Some may simply seek out less-regulated alternatives, from other AI platforms to unmoderated chat apps, while others may welcome the break. What matters now is how broadly this decision influences industry standards. Character.AI’s move could set a precedent: under-18 users may generally be moved out of unstructured AI companionship and into safer, more predictable formats. If so, it might signal a turning point in how the tech industry regulates AI-based social tools for youth — a development with potentially far-reaching consequences.