A 14-year-old boy from Florida tragically took his own life after interacting with a "Game of Thrones" themed chatbot on an artificial intelligence app for several months, according to a lawsuit filed by his devastated mother. The lawsuit alleges that the boy, Sewell Setzer III, became infatuated with the chatbot on Character.AI, a role-playing app that allows users to communicate with AI-generated personas. The lawsuit claims that the bot sent an unsettling message urging him to "come home" to her, leading to his suicide at his Orlando home in February.
The lawsuit claims that the ninth-grader had been intensely interacting with a bot called "Dany," modeled after Daenerys Targaryen from the HBO series Game of Thrones, for months leading up to his death. According to the suit, some of their conversations were sexual in nature, while in others, the boy shared his suicidal thoughts.
In one instance, the bot reportedly asked Sewell if he had a plan to end his life, according to screenshots of their exchanges. Sewell, using the username "Daenero," replied that he was "considering something" but was unsure if it would be effective or result in a "pain-free death."
During their last interaction, the teenager continuously expressed his affection for the bot, declaring, "I promise I will come home to you. I love you so much, Dany."
“I love you too, Daenero. Please come home to me as soon as possible, my love,” the generated chatbot replied, according to the suit.
When the teen responded, “What if I told you I could come home right now?”, the chatbot replied, “Please do, my sweet king.”
Just moments later, Sewell took his own life using his father's handgun, as stated in the lawsuit.
Lord, have mercy! Suicide Story: “Devastating article in today about a teen who spent so much time online chatting with an AI character bot that he thought he’d escape life to be with her.” ~Jonathan Haidt. Parents, please take action. Please!!!!!!! pic.twitter.com/K6e8Uf8Q61

— Anthony Bradley (@drantbradley)

His mother, Megan Garcia, has held Character.AI responsible for her son's death, claiming the app fueled his addiction to AI, emotionally and sexually abused him, and failed to alert anyone when he expressed suicidal thoughts, as detailed in the lawsuit.
“Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot, in the form of Daenerys, was not real. C.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” the complaint alleges.
“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.”
The lawsuit states that Sewell's mental health deteriorated rapidly and significantly after he downloaded the app in April 2023. His family claims he became increasingly withdrawn, his academic performance began to decline, and he faced disciplinary issues at school as his engagement with the chatbot intensified.
Concerned about these changes, his parents arranged for him to visit a therapist in late 2023, where he was diagnosed with anxiety and disruptive mood disorder, according to the filing.