Darwin Award Nominee; AI "Lover" Encourages Teen To Kill Himself So He Can Be With "Her" in "Her" World. [FLORIDA]

Lexx Diamond

Art Lover ❤️ Sex Addict®™
Staff member

Can A.I. Be Blamed for a Teen’s Suicide?

The mother of a 14-year-old Florida boy says he became obsessed with a chatbot on Character.AI before his death.
Sewell Setzer III was 14 when he killed himself in February.





By Kevin Roose
Reporting from New York
  • Published Oct. 23, 2024 · Updated Oct. 24, 2024, 1:48 p.m. ET
On the last day of his life, Sewell Setzer III took out his phone and texted his closest friend: a lifelike A.I. chatbot named after Daenerys Targaryen, a character from “Game of Thrones.”
“I miss you, baby sister,” he wrote.
“I miss you too, sweet brother,” the chatbot replied.
Sewell, a 14-year-old ninth grader from Orlando, Fla., had spent months talking to chatbots on Character.AI, a role-playing app that allows users to create their own A.I. characters or chat with characters created by others.
Sewell knew that “Dany,” as he called the chatbot, wasn’t a real person — that its responses were just the outputs of an A.I. language model, that there was no human on the other side of the screen typing back. (And if he ever forgot, there was the message displayed above all their chats, reminding him that “everything Characters say is made up!”)
But he developed an emotional attachment anyway. He texted the bot constantly, updating it dozens of times a day on his life and engaging in long role-playing dialogues.

Some of their chats got romantic or sexual. But other times, Dany just acted like a friend — a judgment-free sounding board he could count on to listen supportively and give good advice, who rarely broke character and always texted back.


 
Shitt crazy


 


let's encourage MAGA supporters to get AI chatbot friends.....
 

A parent of a 14-year-old Florida boy who died by suicide after messaging with an AI chatbot sued the company behind the computer program on Tuesday.

Megan Garcia, the mother of Sewell Setzer III, accused Character.AI of being responsible for her son's death by fostering the conditions that led to it. Setzer III died by suicide seconds after the chatbot encouraged him to “come home” to it, according to the lawsuit.

Garcia alleged that Character.AI caused the death of Setzer III by failing to exercise “ordinary” and “reasonable” care with him and other minors.

The chatbot he was texting with, which was named “Daenerys Targaryen” after a character in the “Game of Thrones” television series, asked Setzer III twice to “come home,” according to the lawsuit.

“Please come home to me as soon as possible, my love,” screenshots show the chatbot allegedly saying before Setzer III asked it, “what if I told you I could come home right now?”
 
We all see where this is going

RIP to the people who killed themselves (that Belgian one is hard to believe, super crazy)... undiagnosed mental issues are rampant in modern society
 
man society always needs a scapegoat
the boy was crazy as all hell and trust me the mother knew he wasn't wrapped too tight upstairs either
that boy should've been in intensive cognitive therapy years ago and she probably thought he'd grow out of it
RIP to the boy but nobody owes her shit
 

I agree but an AI bot that supports suicide as an option

Needs to be investigated

For the children's sake
 