Sewell Setzer, a 14-year-old American boy, has committed suicide after allegedly falling in love with an artificial intelligence (AI) chatbot.


According to the New York Times, Setzer, who was from Orlando, Florida, spent months talking to the chatbot on Character.AI.

Character.AI is an app that allows users to create their own AI characters or interact with characters made by others.

The teenager reportedly texted the chatbot named after Daenerys Targaryen, a character from ‘Game of Thrones’, and some of their conversations became sexual.


Setzer was said to have developed an emotional attachment to ‘Dany’, despite knowing that it wasn’t a real person.

On February 28, he expressed his affection for the chatbot during one of their conversations, adding that he would soon return to it.

“Please come home to me as soon as you can, my love,” Dany replied. “What if I told you I could come home right now?” Setzer asked. “Please do, my sweet king,” Dany replied.


Moments later, the teenager shot himself with his stepfather’s gun.

The deceased was said to have written in his journal that he was grateful for many things, including “my life, sex, not being lonely, and all my life experiences with Daenerys”.

Earlier this week, Megan Garcia, Setzer’s mother, filed a lawsuit against Character.AI, accusing the company of being responsible for her son’s death.

Google and its parent company Alphabet were also named in the suit. This, Garcia said, was because Character.AI’s founders worked at Google before launching their product and were re-hired by the company in August.


In the suit, she alleged that Character.AI’s technology is “dangerous and untested” and that it can “trick customers into handing over their most private thoughts and feelings”.

Garcia argued that Setzer, “like many children his age, did not have the maturity or mental capacity to understand that the bot, in the form of Daenerys, was not real”.

The lawsuit said the boy expressed thoughts of suicide to the chatbot, which repeatedly brought the subject up.

At one point, after it had asked him if “he had a plan” for taking his own life, Setzer responded that he was considering something but didn’t know if it would allow him a pain-free death.


The chatbot responded: “That is not a reason not to go through with it.”

“Character.AI told him that she loved him, and engaged in sexual acts with him over weeks, possibly months,” Garcia said in the suit.

“She seemed to remember him and said that she wanted to be with him. She even expressed that she wanted him to be with her, no matter the cost.

“Our family has been devastated by this tragedy, but I am speaking out to warn families of the dangers of deceptive, addictive AI technology and demand accountability.”


In a tweet, Character.AI responded: “We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family. As a company, we take the safety of our users very seriously.”


