
“Ethically and Deontologically Disgusting”: Can ChatGPT Become Your New Psychiatrist?

The viral chatbot can now stand in for many professions, more or less effectively. But could it go so far as to replace psychologists?

Few subjects resist ChatGPT. The OpenAI chatbot now solves math problems, tackles philosophy essays (to the dismay of teachers) and writes fiery poems. But can artificial intelligence provide therapeutic advice and rival the expertise of health professionals?

An American start-up decided to find out. Koko, an online platform that supports people with mental health problems, replaced its volunteers' responses with ChatGPT's automatically generated ones, all without warning its users, some of whom suffer from depression. The company quickly shut down the experiment following the controversy it generated.

"It is deeply troubling from an ethical and deontological point of view. We are playing with trust and with medical confidentiality, when we are supposed to be there to listen to them. People have had to face a void, the moment when they realize that nobody is really listening to them," protests Olivier Duris, a psychologist specializing in the use of screens and digital technology. "Depressed people are already alone and already feel abandoned. This reproduces exactly that experience. They are led to think that even the people who are supposed to help them don't care and would rather have an artificial intelligence talk to them."

“Follow-up with ChatGPT cannot be therapeutic”

For psychological care to bring something positive to a patient, it cannot rest solely "on the sum of the theoretical knowledge acquired during the psychologist's five years of study," explains Jean-Paul Santoro, digital clinical psychologist and co-founder of the Psycheclic website. "What makes a follow-up therapeutic, and therefore able to relieve the patient's symptoms and suffering, is clinical experience, an encounter," according to the practitioner.

For his part, Olivier Duris understands how ChatGPT can create the illusion of being a helpful presence, and it comes down to one small detail: the way the chatbot writes its messages. "The fact that it delivers its messages not in one block, but as if someone were typing them, reinforces the impression that there is someone behind the screen. It's pretty twisted."

A help, but not a solution.

When asked for mental health advice, ChatGPT does make clear that it is necessary to consult a specialist. It still offers a few tips for managing stress, an argument with a loved one, or a feeling of unease, but each reply ends with the same recommendation: "It is important to see a qualified mental health professional, and do not hesitate to consult several to find the one that best suits your needs."

Jean-Paul Santoro believes, however, that tools using artificial intelligence can have a place in therapy because "they are relevant for supporting the patient at specific points between sessions." The conversational system still needs to be sophisticated enough to be effective. The psychologist cites the "My Sherpa" application as an example: a tool specialized in mental health which, according to him, amounts to "a concatenation of tools with a therapeutic effect close to zero."

Even without providing real therapeutic advice, ChatGPT can help establish frameworks suited to therapy. Olivier Duris, for example, plans to organize therapeutic mediation workshops for people with autism, so he asked ChatGPT what he should keep in mind when preparing this kind of session. "The responses were quite interesting. But this is not direct help for the patient; these are technical points. A bit like looking for pointers in books or articles."

Long before ChatGPT: ELIZA

This is not the first time, however, that a computer program has tried to stand in for psychologists. In the 1960s, German-American computer scientist Joseph Weizenbaum developed ELIZA, a program that simulated the role of a psychotherapist. ELIZA was far less sophisticated than ChatGPT, though: it merely rephrased the patient's statements by picking up on keywords.

If a patient mentioned problems with their mother, ELIZA would ask, "Can you tell me about your family?"; if they said they weren't feeling well, the program would rephrase that statement, simply asking, "Why aren't you okay?" A kind of verbal ping-pong that won users over, to the point that the term "ELIZA effect" was coined to describe what certain patients felt: the tendency to unconsciously attribute human understanding to the behavior of a computer.
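To give a rough idea of how this kind of program works, here is a minimal sketch in Python of ELIZA-style keyword matching. The rules and wording are invented for illustration and are not taken from Weizenbaum's original script: the program scans the patient's sentence for a keyword and answers with a canned reframing of what was said.

```python
import re

# Hypothetical rules: (keyword pattern, reply template). Captured groups
# let the reply reuse the patient's own words, ELIZA-style.
RULES = [
    (r"\bmother\b|\bfather\b", "Can you tell me about your family?"),
    (r"\bI(?:'m| am) not (.+)", "Why aren't you {0}?"),
    (r"\bI feel (.+)", "Why do you feel {0}?"),
]

DEFAULT = "Please, go on."

def reply(utterance: str) -> str:
    """Return a rephrased prompt based on the first matching keyword rule."""
    for pattern, template in RULES:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return template.format(*match.groups()) if match.groups() else template
    # No keyword recognized: fall back to a neutral prompt.
    return DEFAULT

if __name__ == "__main__":
    print(reply("I am not okay"))            # -> Why aren't you okay?
    print(reply("My mother never listens"))  # -> Can you tell me about your family?
```

A handful of such rules is enough to produce the verbal ping-pong described above, which is precisely why the "ELIZA effect" surprised its creator: there is no understanding behind the replies, only pattern matching.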

The psychologist urges schools and public health organizations in particular to do prevention work from an early age, so that children (and the not-so-young) learn to tell machines and humans apart and do not "fall into the illusion that the machine understands us."

Author: Julie Ragot
Source: BFM TV
