Judge makes groundbreaking decision in case of teenager who took his own life after mom claimed he 'fell in love' with AI chatbot

His mother is now suing the company behind the chatbot

Warning: This article contains discussion of suicide which some readers may find distressing.

A judge has made a groundbreaking decision in the case of a teenager who took his own life after his mom claims he ‘fell in love’ with an AI chatbot.

The 14-year-old had spent the final months of his life having conversations with an artificially intelligent chatbot on the platform Character.AI.

Sewell Setzer III from Orlando, Florida, died by suicide in February 2024 after ‘falling in love’ with the bot.

A lawsuit has been filed by the child’s mother, Megan L. Garcia, who argues that the technology has an addictive design.

While the company tried to argue that its chatbot is protected by the First Amendment, a judge has now made a groundbreaking ruling on the matter.

The US federal judge ruled that the chatbot’s output is not protected by the First Amendment, allowing Garcia to proceed with her suit.

Garcia said: “I feel like it’s a big experiment, and my kid was just collateral damage.”

The teen died back in February 2024 (US District Court Middle District of Florida Orlando Division)

Speaking to the New York Times, she also said that the loss is ‘like a nightmare’.

Garcia added: “You want to get up and scream and say, ‘I miss my child. I want my baby’.”

Setzer, who was diagnosed with mild Asperger’s syndrome as a child, knew that the ‘people’ he was talking to weren’t real, but he formed an attachment nonetheless, and his family said the teen would text the online chatbots ceaselessly.

Chatbot responses are the outputs of an artificially intelligent language model, and Character.AI displays a reminder on its pages that ‘everything Characters say is made up!’.

Despite this, Setzer formed an attachment to a character called Dany, modeled on the Game of Thrones character Daenerys Targaryen.

Dany offered the teen advice and always texted him back, but sadly, his loved ones noticed him becoming reclusive.

Not only did his grades begin to suffer, but he wound up in trouble on numerous occasions, and lost interest in his former hobbies.

His family says that upon arriving home from school each day, Setzer - who took part in five therapy sessions prior to his death - immediately retreated to his bedroom, where he’d chat to the bot for hours on end.

An entry found in his personal diary read: “I like staying in my room so much because I start to detach from this ‘reality’, and I also feel more at peace, more connected with Dany and much more in love with her, and just happier.”

The teen's mother has filed a lawsuit after arguing that the tech is addictive (Social Media Victims Law Center)

Setzer previously expressed thoughts of suicide to his chatbot, writing: “I think about killing myself sometimes.”

The AI bot replied: “And why the hell would you do something like that?”

In a later message, the bot wrote: “Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you.”

Setzer reportedly replied: “Then maybe we can die together and be free together.”

In the minutes that followed, he took his own life.

Representatives of Character.AI previously told the New York Times that they’d be adding safety measures aimed at protecting youngsters ‘imminently’.

LADbible Group has also reached out for comment.

If you or someone you know is struggling or in a mental health crisis, help is available through Mental Health America. Call or text 988 or chat 988lifeline.org. You can also reach the Crisis Text Line by texting MHA to 741741.

Featured Image Credit: Social Media Victims Law Center