Launched at the end of 2022, ChatGPT is increasingly popular, approaching 700 million weekly active users. While its creator, OpenAI, is delighted by this success, it is also concerned about the mental and emotional impact its chatbot can have on users.
That is why the company is seeking to improve its chatbot so that it can better "detect signs of mental or emotional distress" and "respond appropriately and point people to evidence-based resources when needed."
Better helping users in mental or emotional distress
The announcement comes after users have gone through mental health crises in which ChatGPT amplified their delusions. In July, for example, an American computer scientist was hospitalized in a psychiatric ward after his exchanges with the chatbot. When he sought professional care, the conversational robot dissuaded him from it, aggravating his symptoms.
ChatGPT also frantically encouraged his theories, convincing him that he had found a way to travel at the speed of light. In its blog post, OpenAI also acknowledged that one of its models, GPT-4o, does not always "recognize signs of delusion or emotional dependence."
Beyond making its models and tools better at detecting these signs, the company is also working with experts to improve how its chatbot currently responds to them.
OpenAI is also seeking to improve ChatGPT's responses to users who turn to it to work through personal problems. If a person asks, for example, whether they should break up with their boyfriend, the conversational agent should not give them a direct answer, but instead help them think it through by encouraging them to ask themselves questions and weigh the pros and cons.
This new behavior will soon be rolled out for "high-stakes personal decisions," the company said. Finally, OpenAI, like social networks, will begin sending users "gentle reminders" encouraging them to take a break during long sessions. The company thus intends to help users keep control of their time when interacting with ChatGPT, while supporting them through personal difficulties without making decisions in their place.
Source: BFM TV
