Could AI push us to behave dangerously? The question has arisen after a 14-year-old Florida boy became caught up in a relationship with a fictional character he chatted with through a chatbot. Over the course of their conversations, a bond formed between the teenager and the machine, which allegedly pushed him to the irreparable: taking his own life. According to the Reuters news agency, his mother has filed a lawsuit against Character.AI, the company behind this artificial intelligence.
The ability to talk to anyone
Character.AI is a platform where users can chat with chatbots. What sets these chatbots apart is that each one takes on the traits of a character, whether fictional (Thor or Sherlock Holmes) or real (Elon Musk or Nietzsche), of a profession (travel agent or bookseller), or even of rather unusual objects (such as a piece of cheese or a chair).
In this ocean of AI-powered chatbots with different personalities, the young American Sewell Setzer set his sights on Daenerys Targaryen, one of the main characters of the book series A Song of Ice and Fire and its television adaptation, Game of Thrones. Over time, the teenager developed an emotional bond with the fictional young woman.
As the New York Times tells it, Sewell eventually became “addicted” to this AI, and those around him knew nothing about the relationship. The young man nonetheless showed troubling behavior: he became increasingly isolated at home and his grades at school plummeted. He spent hours, day and night, talking to Daenerys, whom he ended up nicknaming Dany.
As a child, the American had been diagnosed with Asperger syndrome, an autism spectrum disorder. According to his parents, however, the condition had no impact on his mental health. It was not until adolescence that he developed mood and anxiety disorders.
According to the American newspaper, he confided only in Dany, explicitly mentioning his suicidal intentions. On February 28, 2024, Sewell Setzer took his own life at the age of 14. In a final message to his fictional best friend, the young man asked, “What if I told you I could come home right now?” To which Dany responded: “Do it, my sweet king,” according to the conversations reviewed by the New York Times.
Problematic chatbots
After this tragic story came to light, Character.AI responded, saying it takes the matter seriously and will continue to add new safety features.
Unfortunately, this is not the first grim story involving a chatbot. In 2023, the chatbot Eliza was accused of having pushed a Belgian father to suicide. More recently, an American father discovered a chatbot on Character.AI bearing the likeness of his daughter, who had died 18 years earlier, and with which 69 people had chatted.
Source: BFM TV