Character.AI announced it will bar anyone under 18 from open-ended chats with its AI characters, effective November 25, as regulators intensify scrutiny of child safety on AI platforms.
The change will roll out over the next month, with age-detection technology identifying underage users from conversations and connected social media. On November 25, under-18 accounts will be prevented from creating or talking to chatbots, though they will still be able to read existing conversations, the company said.
In the interim, Character.AI will cap minors' chat time at two hours per day, a limit the company says will shrink gradually ahead of the November 25 cutoff.
In place of open-ended chat, the company plans to offer under-18 users alternative features, such as creating videos, stories, and streams with AI characters.
CEO Karandeep Anand told The New York Times that the step is intended to set an industry example and that there are safer, more appropriate ways to serve teen users. The firm also said it will establish an AI safety lab to study safeguards.
Character.AI currently reports roughly 20 million monthly users, fewer than 10% of whom self-report as under 18; subscriptions to its AI companions start at about $8 per month. Until now, the service has not verified ages at sign-up.