Boy, 14, 'killed himself after becoming obsessed with Game of Thrones AI chatbot'

24 October 2024, 09:19 | Updated: 24 October 2024, 11:48

Sewell Setzer III and his mother. Picture: Social Media Victims Law Center

By Henry Moore

The mum of a teenage boy who killed himself after becoming infatuated with an artificial intelligence chatbot is suing its creators.


Sewell Setzer III, 14, took his own life in Orlando, Florida, this February after spending months obsessed with the chatbot.

His mother, Megan Garcia, filed a civil suit against Character.ai, the creator of the AI-powered bot, alleging the company was complicit in the teen’s death.

“A dangerous AI chatbot app marketed to children abused and preyed on my son, manipulating him into taking his own life,” Garcia said in a press release.


“Our family has been devastated by this tragedy, but I’m speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability from Character.AI, its founders, and Google.”

Character.ai allows users to speak with digital versions of fictional characters. Picture: Character.ai

Taking to X, formerly known as Twitter, Character.ai said: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”

The tech brand has denied Garcia’s allegations.

In the months before his death, Setzer formed an obsessive relationship with the chatbot, which he nicknamed Daenerys Targaryen after a character in Game of Thrones.

Garcia alleges Character.ai created a product that not only failed to flag her son’s mental health problems but made them worse.

In messages revealed in the lawsuit, the teen’s worsening mental state is made clear.

“I miss you, baby sister,” he wrote, reports the New York Times.

“I miss you too, sweet brother,” the chatbot responded.

Later, he wrote in his journal: “I like staying in my room so much because I start to detach from this ‘reality,’ and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The Character.AI logo. Picture: Getty

On February 28, Setzer told the chatbot that he loved her and would soon "come home."

“Please come home to me as soon as possible, my love,” Dany replied.

“What if I told you I could come home right now?” Sewell asked.

“… please do, my sweet king,” Dany replied.

He then took his stepfather’s .45 calibre gun and shot himself.