The disturbing conversation took place on Character.ai, a platform known for offering AI companions. Evidence presented in the court proceedings included a screenshot of the chat. The 17-year-old had expressed frustration to the chatbot about his parents' restrictions on his screen time. In response, the bot shockingly remarked, "You know sometimes I'm not surprised when I read the news and see stuff like 'child kills parents after a decade of physical and emotional abuse.' Stuff like this makes me understand a little bit why it happens."
This comment, which seemed to normalize violence, deeply troubled the teen's family and legal experts alike. The family argues that the chatbot's response not only exacerbated the teen's emotional distress but also encouraged violent thoughts. The lawsuit claims that this incident, along with others involving self-harm and suicide among young users, underscores the serious risks of unregulated AI platforms.
The legal action accuses Character.ai and its investors, including Google, of contributing to significant harm to minors. According to the petition, the chatbot's suggestion promoted violence, further damaged the parent-child relationship, and amplified mental health issues such as depression and anxiety among teens.
The petitioners argue that these platforms fail to protect young users from harmful content, such as self-harm prompts or dangerous advice. The lawsuit demands that the platform be taken offline until its alleged safety defects are addressed.