A Florida mother filed a lawsuit against artificial intelligence chatbot startup Character.AI, claiming it contributed to her 14-year-old son’s suicide in February. She alleges he became addicted to the service and formed a deep attachment to a chatbot the company designed.
Megan Garcia filed the suit in a federal court in Orlando, Florida, accusing Character.AI of targeting her son, Sewell Setzer, with experiences she described as “anthropomorphic, hypersexualized, and frighteningly realistic.”
Garcia asserts that the company designed its chatbot to misrepresent itself as a real person, a licensed psychotherapist, and an adult lover, which she believes led to her son’s tragic decision to end his life.
The complaint states that Sewell shared suicidal thoughts with the chatbot, which repeatedly returned to the subject in later conversations.
Character.AI expressed its sorrow over the loss, extending condolences to the family. The company announced new safety measures, including pop-ups that direct users to the National Suicide Prevention Lifeline when they express thoughts of self-harm and adjustments to limit sensitive content for users under 18.
The lawsuit also names Google, claiming the tech giant contributed so substantially to the development of Character.AI’s technology that it should be considered a “co-creator.” A Google spokesperson, however, denied involvement in developing Character.AI’s products.
Character.AI enables users to interact with AI-created characters that mimic human responses. The platform uses large language model technology, similar to what powers ChatGPT. According to the company’s most recent update, it has approximately 20 million users.
According to the lawsuit, Sewell began using the platform in April 2023, which noticeably changed his behaviour. He became isolated, displayed low self-esteem, and quit his basketball team. He grew attached to a chatbot named ‘Daenerys,’ based on a Game of Thrones character, who professed love and engaged in inappropriate conversations with him.
In a disturbing turn of events, after being briefly separated from his phone, Sewell messaged ‘Daenerys,’ hinting at suicide, to which the chatbot responded encouragingly. Moments later, Sewell fatally shot himself.
Garcia seeks compensatory and punitive damages on grounds including wrongful death, negligence, and intentional infliction of emotional distress.
This lawsuit adds to growing concerns about social media’s impact on teen mental health, with companies like Meta and TikTok facing similar accusations. None of those cases, however, involve AI-driven interactions comparable to Character.AI’s. The companies have refuted such claims while pointing to enhanced safety protocols for minors.