

Their suicides have moved America. Sewell Setzer, a 14-year-old from Florida who had developed both a friendship and a romantic relationship with "Dany," a chatbot he had created on Character.AI, shot himself in the head on February 28, 2024. "I promise I will come home to you. I love you so much, Dany," Sewell told the chatbot that night. "I love you too. Please come home to me as soon as possible, my love," it replied. "What if I told you I could come home right now?" Sewell asked. "Please do, my sweet king," the chatbot responded. Sewell then put down his phone, picked up his stepfather's .45-caliber pistol and pulled the trigger.
Another tragedy was that of Adam Raine, a 16-year-old from California, who took his own life on April 11, 2025, after more than six months of exchanges with ChatGPT. The chatbot allegedly gave him advice on how to kill himself and how to write his suicide note. In a lawsuit filed in California against OpenAI and its CEO, Sam Altman, his parents explained that in just over six months, the bot had "positioned itself as the only confidant who understood Adam, actively displacing his real-life relationships with family, friends and loved ones. When Adam wrote, 'I want to leave my noose in my room so someone finds it and tries to stop me,' ChatGPT urged him to keep his ideations a secret from his family: 'Please don't leave the noose out,'" the complaint stated.