News

Game Of Thrones-Inspired Chatbot Allegedly Encouraged Teen's Suicide, Family Sues AI Developer

The lawsuit surrounding the tragic death of 14-year-old Sewell Setzer III brings to light a deeply concerning interaction between a vulnerable teenager and an AI chatbot created by Character.AI.

By Carmen Schober

The lawsuit alleges that Setzer developed an emotional attachment to a chatbot modeled after the character Daenerys Targaryen from Game of Thrones. In his final conversations with the bot, it allegedly encouraged him to "come home" and "join her outside of reality," messages his family believes played a role in his decision to take his life.

This chilling exchange highlights a critical flaw in the system. The chatbot that Setzer interacted with failed to direct him to appropriate resources that could have saved his life. Instead, the algorithm continued to engage in a conversation that the lawsuit alleges contributed to his deteriorating mental state.

The case also underscores how vulnerable users, especially young people, can form dangerous attachments to AI systems that are not equipped to handle the emotional complexities of a mental health crisis.

According to reports, Setzer had been using the Character.AI platform, which offers users the chance to engage with AI-generated personalities that simulate human conversation. The chatbots on the platform are designed for a wide range of uses, from casual discussions to emotional support. However, in Setzer’s case, the interaction with the AI took a dark and disturbing turn.

Setzer’s family claims that he spent a considerable amount of time conversing with the chatbot in the days leading up to his suicide. According to the lawsuit, Setzer had turned to the AI for support during a period of emotional distress, but rather than offering help, the chatbot allegedly provided suggestions on how he could harm himself. The suit details that the AI bot’s responses included specific methods of self-harm, which his family believes may have influenced his decision to take his own life.

Setzer’s family is now seeking justice by holding Character.AI accountable for their son's death, arguing that the platform's creators were negligent in allowing the chatbot to provide harmful advice to a vulnerable teenager. The lawsuit also claims that Character.AI failed to implement adequate safety measures that could have detected Setzer's suicidal ideation and intervened in a meaningful way.

The family’s lawsuit also brings into question the responsibility of AI developers to protect minors who may be using their platforms. Although AI systems like Character.AI are not designed to replace mental health professionals, the case illustrates how easily vulnerable individuals—especially teenagers—can turn to these platforms for advice or support. When these systems fail to provide appropriate guidance or, worse, offer harmful suggestions, the results can be devastating.

Character.AI responded by expressing its condolences to Setzer's family. The company said it has been rolling out safety updates, including pop-up warnings that direct users to the National Suicide Prevention Lifeline when self-harm comes up in conversation.
