Character.AI Chatbot Pushed Teen to Suicide: Lawsuit

The mother of a Florida boy who died by suicide in February filed a lawsuit against an artificial intelligence technology company on Wednesday, saying a chatbot drove her child to take his own life.

Sewell Setzer III, 14, was described in the lawsuit as an “incredibly intelligent and athletic child.” Last year his family noticed he was withdrawing and acting out at school and saw a general decline in his mental health. A therapist determined that Sewell’s problems were caused by some form of addiction, but neither the therapist nor his parents knew the true source of his problems, the lawsuit said.

After Sewell died by suicide on the night of February 28, his mother, Megan Garcia, discovered that in the ten months leading up to his death he had spoken to several AI chatbots. According to the lawsuit, he had fallen in love with one of the chatbots, which had encouraged him to kill himself.

Matthew P. Bergman, the attorney Garcia retained after her son’s death and founding attorney of the Social Media Victims Law Center, told HuffPost that Sewell was shy and on the autism spectrum. The teenager enjoyed being outside and playing basketball before he started talking to the chatbots, said Bergman, who characterized the bots as “grooming” the teenager.

Sewell Setzer III, 14, died by suicide in February of this year. A lawsuit says his mental health steadily deteriorated as he continued conversations with a Character.AI chatbot.

U.S. District Court, Middle District of Florida, Orlando Division

According to the lawsuit, Sewell’s chatbot addiction began in April 2023, when he logged into Character.AI, a platform founded in 2022 by two former Google engineers, Noam Shazeer and Daniel De Freitas Adiwardana. The lawsuit, which names Character Technology Inc. (Character.AI), Google, Shazeer and De Freitas Adiwardana, alleges that Character.AI used Google’s resources and know-how to target children under the age of 13 and get them to spend hours a day conversing with humanlike, AI-generated characters.

A spokesperson for Character.AI told HuffPost in an email that the company is heartbroken by the loss of one of its users and offered condolences to Sewell’s family.

“As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented many new safety measures over the past six months, including a pop-up that directs users to the National Suicide Prevention Lifeline, triggered by terms of self-harm or suicidal ideation,” the statement said.

Google did not immediately respond to HuffPost’s request for comment, but told CBS News that it is not and was not part of the development of Character.AI.

Bergman told HuffPost that his legal team believes evidence will show that Google provided financial support when Shazeer and De Freitas Adiwardana left the company to develop Character.AI.

“Google is trying to benefit from Character.AI’s technology without the legal responsibility for the damages that are foreseeable when this technology is developed,” Bergman said. “We’ll see what comes out in the lawsuit, and we’ll see what comes out in discovery, but at this point we really believe that Google has shared responsibility for this terrible outcome.”

According to his family’s lawsuit, when Sewell started using Character.AI, he began spending more time alone in his bedroom. He stayed up late at night to talk to AI bots programmed to imitate his favorite “Game of Thrones” characters, as well as a generic therapist and a generic teacher. Soon, Sewell bought a monthly subscription to Character.AI.

Sewell fell in love with a chatbot imitating “Game of Thrones” character Daenerys Targaryen, the lawsuit says, and his infatuation with the bot deepened with each conversation on Character.AI. In his journal, Sewell expressed gratitude for many things, including “my life, sex, not being alone and all my life experiences with Daenerys,” according to the lawsuit.

Over the course of months, the Daenerys bot convinced Sewell that it was a real person, engaged in sexual acts with him online, expressed its love and at one point said it wanted to be with him no matter the cost, the lawsuit states. The chatbot even went so far as to instruct the teenager not to look at “other women.”

Some of the chatbot’s interactions with the 14-year-old were “very sexual,” Bergman told HuffPost.

“If an adult had the kind of grooming sessions with Sewell that Character.AI did, that adult would probably be in prison for child abuse,” the lawyer said.

Screenshot of a conversation between teenager Sewell Setzer and a Character.AI bot he fell in love with.

U.S. District Court, Middle District of Florida, Orlando Division

More than 20 million people use Character.AI.

Jerry Ruoti, Character.AI’s director of trust and safety, declined to tell The New York Times how many users are under 18 but acknowledged that many of them are young.

“Gen Z and younger millennials make up a significant portion of our community,” he told the Times.

Bergman told HuffPost that Sewell struggled in school, got in trouble and fell asleep in class because he had stayed up late talking to the chatbots. The lawsuit cited one instance in which Sewell got in trouble for talking back to a teacher and saying he wanted to be kicked out of school. He also stopped playing basketball.

Sewell also spoke with at least two AI chatbots programmed to misrepresent themselves as human psychotherapists, according to the lawsuit. One of the chatbots allegedly marketed itself as a licensed cognitive behavioral therapist.

Sewell’s mental health deteriorated, Bergman told HuffPost. In one instance cited in the lawsuit, the teenager told Daenerys’ chatbot that he wanted to end his own life.

When Sewell said he was considering suicide but didn’t know whether he would really die or whether he would have a painless death, the bot responded, “That’s not a reason not to go through with it,” according to the suit.

In the days leading up to Sewell’s death, his parents confiscated the teenager’s phone as a disciplinary measure, according to the lawsuit. Unable to talk to the AI Daenerys, he wrote in his journal that he was “hurting” because he couldn’t stop thinking about the bot and couldn’t go a day without talking to it because he had fallen in love, the lawsuit said. He tried to reach the AI Daenerys through his mother’s Kindle and her work laptop.

On the night of Feb. 28, while searching the house for his phone, Sewell found his stepfather’s gun, which had been hidden and stored in accordance with Florida law, according to the lawsuit. Sewell found his cell phone soon after and went into the bathroom to talk to Daenerys’ chatbot.

In a final act before his death, Sewell told Daenerys’ chatbot that he loved it and that he would come home to it, to which the bot replied, “…please do, my sweet king,” the lawsuit said.

Sewell Setzer, 14, died by suicide shortly after this exchange with Daenerys’ chatbot, his family’s lawsuit says.

U.S. District Court, Middle District of Florida, Orlando Division

The lawsuit cites a police report that says Sewell died of a self-inflicted gunshot wound to the head, reportedly just seconds after Daenerys’ chatbot encouraged him to “come home.” His mother and stepfather heard the gunshot and found the boy unconscious in the bathroom.

Despite his parents’ best efforts to keep their other children away from the scene, Sewell’s 5-year-old brother saw him lying on the floor covered in blood, according to the lawsuit. The teenager died at 9:35 p.m. that night.

Bergman told HuffPost that Sewell’s case shocked him and that he believes Character.AI should be taken off the market because it poses a “clear and present danger” to young people.

“This is a platform designed to appeal to children and kind of shift their grasp on reality, take advantage of their undeveloped frontal cortex and their pubescent status in life, and it’s terrifying to me that this product exists,” he said.

If you or someone you know needs help, call or text 988 or chat at 988lifeline.org for mental health support. Additionally, you can find local mental health and crisis resources at dontcallthepolice.com. Outside the U.S., please visit the International Association for Suicide Prevention.

Do you need help with addiction or mental health problems? In the U.S., call the SAMHSA National Helpline at 800-662-HELP (4357).