California enacted legislation on Monday, October 13, regulating artificial intelligence (AI) conversational agents, or "chatbots," for the first time in the United States, following a series of suicides among teenagers who had formed simulated intimate relationships with these tools, Governor Gavin Newsom announced.
Defying pressure from the White House, which opposes AI regulation, the Democratic governor signed a series of laws on Monday that require, among other things, verifying users' ages, displaying periodic warning messages, and establishing suicide prevention protocols.
One of the key measures, SB 243, regulates companion chatbots, meaning those designed to act as a companion or confidant, such as the ones developed by platforms like Replika or Character.AI.
Character.AI was among the first companies to be sued, by the parents of Sewell, a 14-year-old American teenager who died by suicide in 2024 after forming a romantic relationship with a chatbot inspired by the series "Game of Thrones" that is suspected of having reinforced his suicidal thoughts.
"Guardrails"
"We can continue to be leaders in the field of AI (…) but we must act responsibly, protecting our children along the way," added the governor of California, home to Silicon Valley and the leading developers of AI models such as OpenAI (ChatGPT), Google (Gemini) and xAI (Grok).
In addition to age verification, the law requires periodic messages reminding users that they are talking to a machine (every three hours for minors). It also requires companies to implement protocols for detecting suicidal ideation, provide access to prevention programs, and report statistics on the subject to the authorities.
"Emerging technologies like chatbots and social media can inspire, educate and connect people, but without real safeguards, they can also exploit, mislead and endanger our children," the governor said. In the United States, there are no national rules limiting the risks associated with AI, and the White House is trying to prevent states from adopting their own.
3114: France's national suicide prevention number
3114 is the number to call if you are experiencing psychological distress or having suicidal thoughts. You can also call if someone close to you is in this situation. The line connects callers with specially trained health professionals and is available 24 hours a day, seven days a week.
Source: BFM TV
