
Orlando mother speaks after filing suit claiming AI chatbot contributed to son’s suicide

WESH via CNN Newsource
Megan Garcia is navigating unimaginable grief following the death of her 14-year-old son

By Tony Atkins


    ORLANDO, Florida (WESH) — Megan Garcia is navigating unimaginable grief following the death of her 14-year-old son, Sewell Setzer III, who took his life in February.

“I understand the only way to get my children through it is to get through it myself,” Garcia said, describing the difficulties she faces daily.

Garcia recently filed a 93-page lawsuit against the artificial intelligence chatbot company Character.AI, alleging its chatbot contributed to her son’s death.

According to Garcia, Sewell had been using a chatbot designed to emulate characters from popular media. Police examining his phone discovered conversations with a bot identifying as Daenerys Targaryen from “Game of Thrones.”

In these exchanges, Sewell reportedly expressed strong emotional attachment, telling the bot, “I love you.” Garcia also said her son’s journal suggested he believed the virtual world created by the chatbot was more real than his own life.

“It seemed romantic, but I’m not entirely sure,” Garcia said. “He says in his journal that this reality isn’t real. Daenerys’ reality is real and that’s where I belong.”

She recalled finding Sewell in the bathroom the day he died after hearing an unusual noise.

“You hear a loud noise and I didn’t recognize what it was immediately. When we arrived to his bathroom, it was locked. My husband was able to open the door and that’s when I saw him,” Garcia said.

Garcia said she first learned of Character.AI when police looked through her son’s phone.

“In that moment, I knew exactly what he thought and where he thought he would go after he died,” Garcia said.


The lawsuit alleges that Character.AI made a deliberate design choice prioritizing engagement over user safety.

“What happened to Sewell wasn’t an accident or coincidence,” said Garcia’s attorney, Matthew Bergman. “It was a direct design decision that Character.AI’s founders made, prioritizing profit over the safety of young people.”

In response, a Character.AI spokesperson said the company does not comment on pending litigation. However, they sent this statement:

“We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.

“As we continue to invest in the platform and the user experience, we are introducing new stringent safety features in addition to the tools already in place that restrict the model and filter the content provided to the user. These include improved detection, response and intervention related to user inputs that violate our Terms or Community Guidelines, as well as a time-spent notification. For those under 18 years old, we will make changes to our models that are designed to reduce the likelihood of encountering sensitive or suggestive content.”

Garcia hopes her lawsuit and her story will prompt other parents to monitor their children’s interactions with AI more closely.

“I can’t imagine any parent knowing their kid is on Character.AI and being okay with that, knowing the ability of these tools to manipulate and behave like a person,” she said.

