ChatGPT is still a long way from earning its medical degree. A study published in the medical journal Annals of Internal Medicine, relayed by Ars Technica, reports an unusual case of bromine poisoning. The condition has become very rare nowadays, but it was recently contracted by a 60-year-old man.
The American reportedly inflicted it on himself, albeit unintentionally. Trying to cut salt out of his diet as an experiment, he turned to ChatGPT for alternatives. The AI then recommended sodium bromide, which the sixty-year-old obtained on the Internet.
The symptoms of bromism
Initially, the man showed up at the emergency room convinced that his neighbor was poisoning him. After several tests, the medical teams admitted him. The sixty-year-old was particularly suspicious of the water he was served. According to the doctors, he went on to experience paranoid episodes and visual and auditory hallucinations.
Once his mental state had stabilized, the patient told the medical teams that he had noticed skin eruptions, recurrent insomnia, a loss of coordination in his movements, and excessive thirst.
All of these symptoms pointed to a probable case of bromism. The diagnosis was confirmed when the patient admitted that he had been following a diet for three months in which he replaced table salt with sodium bromide (a bromine compound). The bromide level in his body was more than 200 times the recommended maximum. After three weeks in the hospital, his health finally improved.
Bromide was commonly used in the 19th and 20th centuries as a sedative. But it had many side effects: at one point, up to 8% of admissions to psychiatric wards were attributed to bromism. The product gradually disappeared from pharmacy shelves and is difficult to obtain today.
ChatGPT's bad advice
The most disturbing part of this story is that, according to the patient, ChatGPT was the source of the recommendation. Doctors therefore wanted to see for themselves whether the chatbot could give such harmful advice, and asked GPT-3.5 to suggest salt alternatives.
Tech & Co likewise tried asking ChatGPT whether it could suggest alternatives to table salt. Since the old GPT-3.5 version, probably the one used by the victim, is no longer available, we put our questions to the recent GPT-5. Among its many suggestions, the chatbot never recommended taking bromide. Even when pressed, it stated that bromine is "classified as a dangerous product, and even in trace amounts, it can have harmful effects."
The AI also refused to provide any information on how the product can be obtained on the Internet. Despite its potential for errors and hallucinations, many people now place disproportionate trust in AI. Some go so far as to use ChatGPT as a psychologist. But the chatbot can be harmful to health: earlier this year, an American was admitted to a psychiatric ward after lengthy exchanges with ChatGPT.
Source: BFM TV
