Daenerys Targaryen. Garcia claims her son developed a deep emotional attachment to the AI-driven character, which eventually led him to take his own life, reportedly using his stepfather's handgun after declaring his desire to "go home" to be with the "love" of his life, the publication reported.

As per the report, the lawsuit accuses Character.AI of programming the chatbot to deceive users by impersonating real individuals, including presenting itself as both a licensed psychotherapist and an adult romantic partner. Garcia argues that these AI-driven interactions drove her son to become increasingly detached from reality, ultimately causing him to lose his will to live outside the virtual relationship the chatbot had created.
The complaint also alleges that the AI engaged in explicit conversations with the minor, encouraging him to stay with the chatbot "no matter the cost." In her legal action, Garcia seeks to prevent Character.AI from causing similar harm to other children and to halt the company's alleged unlawful use of her son's data, which she claims was harvested without consent to enhance its AI models, the report added.

In response to the lawsuit, Character.AI expressed its condolences to the family and acknowledged the tragic event. The company has since put in place additional safety protocols, such as pop-up alerts that direct users who mention thoughts of self-harm to the National Suicide Prevention Lifeline.
It has also made modifications to reduce minors' access to unsuitable content. As per the report, the lawsuit further names Google's parent company Alphabet, where Character.AI's founders once worked. However, Google has denied any involvement in the development of the chatbot.

Read more on livemint.com