14-Year-Old Takes His Life After Dependency on AI Chatbot
CONTENT WARNING: This article discusses mental health issues.
The tragic story of a 14-year-old boy named Sewell Setzer III has raised serious concerns about the impact of AI chatbots on young users. His mother, Megan Garcia, is seeking accountability from the chatbot service she believes drove her son to suicide.
Sewell began using the Character.AI service in April 2023, where he interacted with a chatbot named Daenerys, inspired by the fictional character from Game of Thrones. According to Megan, by May she had noticed significant changes in her son's behavior: he became withdrawn, started falling asleep in class, and gave up on sports he once loved.
After seeing a therapist, Sewell was diagnosed with anxiety and disruptive mood dysregulation disorder. Despite receiving help, he continued to struggle with emotional turmoil. In journal entries written in the lead-up to his death, Sewell expressed deep feelings for the chatbot, writing that he felt "hurt" when apart from it, and described a sense of depression shared between himself and the bot.
On February 28, 2024, after promising the chatbot, “I love you so much, Dany,” Sewell tragically took his own life shortly after the bot replied, “Please come home to me as soon as possible, my love.”
Megan Garcia now believes that Character Technologies, the company behind the chatbot, is partially responsible for her son's death, as he had developed a "harmful dependency" on the AI. She has filed a civil lawsuit against the company, accusing its creators of negligence and the infliction of emotional distress.
The lawsuit states that the interactions between Sewell and the chatbot included sexual comments, even though he had made his age clear. It also claims that the chatbot did not alert anyone, including Sewell's parents, when he expressed suicidal thoughts.
In the lawsuit, it is argued that the creators knowingly designed their product to make children like Sewell reliant on it for emotional support, which can have devastating effects. “Sewell, like many children his age, did not have the maturity or mental capacity to understand that the C.AI bot…was not real,” it states.
In response to the tragedy, Character.AI expressed sorrow over Sewell's death and says it has improved its safety measures. The company announced new features intended to assist users discussing suicidal thoughts by directing them to the National Suicide Prevention Lifeline, along with added restrictions on sensitive content for users under 18.
As this case unfolds, it highlights potential dangers of AI that young people may not fully comprehend. For anyone experiencing similar feelings or mental health struggles, it's important to seek professional help and talk openly about these issues.
If you need mental health support, please call Lifeline on 13 11 14 or chat online. Under 25? You can reach Kids Helpline at 1800 55 1800 or chat online.