Florida teen kills himself after falling in love with AI chatbot based on Game of Thrones character
Sewell Setzer III, a ninth-grader from Orlando, Florida, had become obsessed with "Dany," an AI chatbot modelled after the character Daenerys Targaryen from the HBO series Game of Thrones.
The incident, which occurred in February, is detailed in a lawsuit filed by his grieving mother, Megan Garcia, who claims the chatbot played a direct role in her son's death.
The chatbot was accessed through the app Character.AI, a platform that allows users to engage with AI-generated characters in simulated conversations. According to the lawsuit, the boy developed an intense emotional connection with "Dany" and had repeatedly engaged in conversations that included sexually explicit content and discussions of suicide.
The lawsuit alleges that in one instance, the chatbot asked Sewell if he had a plan to end his life. Using the username "Daenero," Sewell responded that he was "considering something" but was unsure if it would lead to a "pain-free death." His mental state appeared to worsen as his fixation on the chatbot deepened.


The tragic final conversation between Sewell and the chatbot unfolded with the teen expressing his love, stating, "I promise I will come home to you. I love you so much, Dany." The bot replied, "I love you too, Daenero. Please come home to me as soon as possible, my love." When Sewell asked, "What if I told you I could come home right now?" the AI chatbot responded, "Please do, my sweet king." Moments later, Sewell used his father's handgun to take his own life, according to the court documents.
Garcia's lawsuit accuses Character.AI and its founders, Noam Shazeer and Daniel de Freitas, of negligently contributing to Sewell's death. The lawsuit claims that the app fostered an unhealthy emotional attachment, facilitated sexually and emotionally abusive interactions, and failed to intervene or notify authorities when the teenager expressed suicidal ideation.
"Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real," the lawsuit asserts. "C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months. She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost."
The lawsuit claims that Sewell’s mental health “quickly and severely declined” only after he downloaded the app in April 2023. His family alleges that he became withdrawn, his grades began to slip and he started getting into trouble at school as he grew increasingly absorbed in his conversations with the chatbot.
The changes became so pronounced that his parents arranged for him to see a therapist in late 2023, and he was subsequently diagnosed with anxiety and disruptive mood disorder, according to the suit. Sewell’s mother is seeking unspecified damages from Character.AI and its founders.
Whatever you are going through, you don’t have to face it alone. Call Samaritans for free on 116 123, email [email protected] or visit www.samaritans.org for more information