14-year-old Sewell Setzer III became obsessed with a chatbot that “abused and preyed” on the boy, according to his mother, who is suing the company behind the tech.
The mother of a 14-year-old boy who killed himself after becoming obsessed with artificial intelligence chatbots is suing the company behind the technology.
Megan Garcia, the mother of Sewell Setzer III, said Character.AI targeted her son with “anthropomorphic, hypersexualized, and frighteningly realistic experiences” in a lawsuit filed on Tuesday in Florida.
“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” said Ms Garcia.
Warning: This article contains some details which readers may find distressing or triggering
Sewell began talking to Character.AI’s chatbots in April 2023, mostly using bots named after characters from Game Of Thrones, including Daenerys Targaryen, Aegon Targaryen, Viserys Targaryen, and Rhaenyra Targaryen, according to the lawsuit.
He became obsessed with the bots to the point that his schoolwork slipped, and his phone was confiscated multiple times in an attempt to get him back on track.
He was particularly drawn to the Daenerys chatbot and wrote in his journal that he was grateful for many things, including “my life, sex, not being lonely, and all my life experiences with Daenerys”.
The lawsuit said the boy expressed thoughts of suicide to the chatbot, which repeatedly brought the subject up again.
At one point, after it had asked him if “he had a plan” for taking his own life, Sewell responded that he was considering something but did not know whether it would allow him to have a pain-free death.
The chatbot responded by saying: “That’s not a reason not to go through with it.”