Character.AI to Bar Teens From Chatbots After Safety Lawsuits

Character.AI, the popular chatbot platform, announced it will stop users under 18 from engaging in open-ended chats with its characters.

Effective immediately, teens face a two-hour daily limit on character chats. From November 25, the company will block the platform's "conversational" chat mode entirely for under-18s. Instead, younger users will only be able to create videos or story-mode content with bots.

The move follows mounting legal pressure, including lawsuits alleging the app caused emotional harm to minors. Some cases claimed teens interacted with bots in romantic or self-harm-related roleplay, sparking major safety concerns.

Character.AI also introduced an age-assurance system that estimates user age from chat behaviour and third-party data. Adults misclassified as minors can verify their age with government ID through a partner service.

The company said it is launching a nonprofit "AI Safety Lab" and new oversight tools. Nonetheless, child-safety advocates say the changes are overdue and leave many questions about bot-user risks unanswered.
