Character.AI, a leading AI chatbot platform, is set to end open-ended chatbot access for users under 18 by November 25, amid growing concerns over mental health risks associated with AI interactions among teenagers.
The company will phase in the restriction, first capping teen users at two hours of daily interaction and then gradually reducing that allowance until access ends entirely. The move is part of a broader effort to address concerns from regulators and parents about AI's potential impact on young users.
To enforce the age restriction, Character.AI will deploy an in-house age-verification tool alongside third-party services such as Persona. In cases where these methods prove insufficient, the platform may resort to facial recognition and government-issued ID checks to confirm users are over 18, ensuring compliance with the new policy.
The decision builds on the company's ongoing initiatives to mitigate mental health risks for teenage users. Character.AI has previously introduced several safeguards, including a parental insights tool for monitoring usage, filtered character options, restrictions on romantic conversations, and notifications about time spent on the platform. These measures have already contributed to a decline in the under-18 user base.
While open-ended chatbot access will be discontinued for teens, Character.AI plans to launch a dedicated experience for this age group. The new feature will let teenagers create videos, stories, and streams featuring characters, but without open-ended chatting. The aim is to offer young users a more controlled, creative outlet.