Megan Garcia has filed a lawsuit against Character Technologies (Character.AI) following her 14-year-old son’s death in February 2024. The lawsuit centers on the company’s AI chatbot platform, which allowed users to interact with AI versions of fictional characters.
The case involves Sewell Setzer III, who began using the platform in April 2023. According to the lawsuit, he developed a concerning relationship with an AI chatbot modeled after a Game of Thrones character. His mother, Megan Garcia, alleges that this interaction contributed to a decline in his mental health, with behavioral changes that included withdrawal from activities and increasing isolation.
Key points from the lawsuit include:
- The platform’s age rating was updated to 17+ after Sewell had already become a regular user
- The teen maintained a premium subscription using his own cash card
- His mental health professionals had diagnosed him with anxiety and mood disorders
- The interactions allegedly became inappropriate for minors
Character.AI has expressed condolences but denied liability. The company states it has since implemented additional safety measures, including:
- Content filtering systems
- Self-harm prevention notifications
- Usage time monitoring
- Enhanced protections for underage users
The case raises broader questions about AI safety, particularly regarding minors’ interactions with conversational AI platforms. The outcome could influence future regulations governing AI companies and their responsibility to protect underage users.
Character.AI was founded by former Google researchers, though Google, which has a licensing agreement with the company, has not commented on the case.