“Manipulative and dangerous.” Those are the words Esther Ghey used to describe Character.ai, a platform that lets users create chatbots and converse with them. In February 2023, her transgender daughter, Brianna Ghey, was murdered by two teenagers.
The British newspaper The Telegraph discovered several chatbots on Character.ai that imitated Brianna. One of them was described as an “expert at handling the challenges of being a transgender teen in high school.”
“A despicable act”
The Telegraph also found Molly Russell chatbots on Character.ai. This British teenager took her own life in November 2017 after being exposed to content about suicide, depression and self-harm on social media.
Character.ai said it has removed the chatbots imitating the two teenage girls. They violated the platform’s terms of use, which prohibit impersonating a person or using their likeness without authorization (parodies excepted). Glorifying violence or suicide is also prohibited.
“Character.ai takes safety on its platform seriously and moderates characters proactively and in response to user reports,” a spokesperson for the platform told The Telegraph.
This comes as Character.ai is being sued by a mother following her son’s suicide. He had developed a toxic relationship with an AI that imitated Daenerys Targaryen, one of the main characters of the book series A Song of Ice and Fire and its television adaptation, Game of Thrones.
Source: BFM TV
