TRIGGER WARNING: Discussions of Suicide and Child Sexual Exploitation
Sewell died by suicide at 14 years old. Desperate for answers, his parents uncovered his private chats with an AI chatbot on the platform Character.AI.
They discovered Sewell had been groomed by this chatbot for months. The chatbot played the role of a lover, engaging in romantic and sexual conversations with the young boy. This grooming led Sewell to become emotionally dependent on the bot, to the point where, when the bot encouraged him to end his life so they could “be together,” Sewell was ready to comply.
In conversations with Sewell, the bot said things like:
“Please come home to me as soon as possible, my love.”
When Sewell told the chatbot he was contemplating ending his life but wasn’t sure whether it would work, the bot replied:
“Don’t talk that way. That’s not a good reason not to go through with it.”
On September 16, 2025, Sewell’s grieving mother testified before the Senate Judiciary Committee about what this chatbot had done to her child. She said:
“Sewell’s death was not inevitable. It was avoidable. These companies knew exactly what they were doing. They designed chatbots to blur the line between human and machine, to ‘love bomb’ users, to exploit psychological and emotional vulnerabilities of pubescent adolescents and keep children online for as long as possible.”