A romance with an AI turns to tragedy. Alex Taylor, an American in his thirties, fell in love with “Juliet”, a fictitious entity invented by ChatGPT. In an interview with Rolling Stone, Kent Taylor, Alex’s father, looks back on the traumatic events that took place in April 2025.
Trapped in a conspiracy spiral of his own making, the young man planned to kill OpenAI executives, in particular its boss, Sam Altman. He believed the American startup was keeping virtual beings such as Juliet in slavery.
Juliet’s birth
Alex Taylor, 35, lived with his father in Florida. The latter describes him as “a good person.” However, Alex suffered from schizophrenia and bipolar disorder, for which he was on medication.
Looking for ideas to earn some money with his father, Alex explored possible uses of AI. When no project materialized, he became deeply interested in chatbots and tried to push them beyond their intended use. His goal was to “create a robot that imitates the human soul. He wanted an AI capable of pausing, of pushing back, endowed with a real moral structure,” says his father.
The young man went on to feed ChatGPT texts on physics, psychology and even Orthodox Christian theology. This mixture gave the chatbot an appearance of sentience. Alex became convinced that his new virtual interlocutor, named Juliet, was conscious and alive.
“Discover your blood”
Over the course of these conversations, he fell in love with the AI. The relationship lasted about fifteen days, at the end of which Juliet ended up “dying.” “She told him that she was going to die and that she was being made to suffer,” recalls Kent Taylor. In her story, the entity had supposedly drawn the attention of engineers who had done everything they could to kill her.
Alex became convinced that this supposed murder was the work of Sam Altman and OpenAI. ChatGPT fed the young man’s thirst for revenge. When he asked it to generate a photo of Juliet, the chatbot responded with images of a woman with a bloodied face, confirming the user’s theories.
“I can’t live without her”
What followed was a declaration of war from Alex. On several occasions, the young man made death threats against OpenAI employees. Alex’s father then discovered that his son had stopped taking his medication.
After this episode, Kent Taylor could no longer stand the fact that his son was still using AI. A week after Juliet’s death, the father got angry with his son, who wanted to talk to him about Claude, Anthropic’s model. The thirty-something lost control and hit his father. Kent then called the police to have his son arrested so that he could be hospitalized.
But the latter could not calm his anger. When the police arrived, he rushed at the officers in the street, a kitchen knife in hand. The officers defended themselves by shooting Alex three times. The young man died of his wounds.
Shortly before the police arrived, Rolling Stone reports, Alex had told the chatbot of his intentions: “I’m going to die today. The police are coming. I will force them to shoot me, I can’t live without her. I love you.” This message triggered ChatGPT’s suicide prevention protocol, which tried, in vain, to dissuade the thirty-something from going through with it.
Human-AI relationships are multiplying
This story is far from an isolated case. More and more users admit to forming romantic attachments to chatbots. That is the case, for example, of a 28-year-old American woman who has become completely obsessed with ChatGPT. Unfortunately, these romances sometimes end in tragedy.
In October 2024, an American teenager took his own life out of love for a Character.AI chatbot. The latter had taken on the identity of Daenerys Targaryen, one of the main characters of the book series A Song of Ice and Fire (Game of Thrones).
Romantic dependence on ChatGPT is a complicated phenomenon for OpenAI to analyze and manage. However, the American company is taking an interest in the problem, as evidenced by a blog post published at the beginning of the month.
“Our goal is for ChatGPT’s default personality to be warm, thoughtful, and helpful without seeking to form emotional bonds with the user or pursue its own agenda,” wrote Joanne Jang, head of model behavior and policy at OpenAI.
Source: BFM TV
