The latest generations of artificial intelligence applications have proven (almost always) useful for restaurant recommendations and writing emails, but as a source of medical advice they have often caused serious problems. This time, it went very badly.
A man who followed a chatbot's health plan ended up in the hospital after developing a rare form of toxicity.
The story began when the patient decided to improve his health by reducing his consumption of salt, or sodium chloride.
To find a substitute for the culinary additive, he did what many people do today: he asked ChatGPT.
OpenAI's chatbot apparently suggested sodium bromide, which the man ordered online and incorporated into his diet.
Although it is true that sodium bromide can be a substitute for sodium chloride, that is only the case for cleaning your bathtub or toilet, not for making your chips tastier.
But the AI forgot to mention this little detail.
Three months later, the man showed up at the emergency department of his local hospital with paranoid delusions, believing that his neighbor was trying to poison him.
"In the first 24 hours of hospitalization, he expressed increasing paranoia and auditory and visual hallucinations, which, after an attempted escape, resulted in his involuntary psychiatric hold for grave disability," the doctors write in an article published last week in the journal Annals of Internal Medicine: Clinical Cases.
After being treated with antipsychotic medications, the man calmed down enough to explain his ChatGPT-inspired dietary regime. This information, together with his test results, allowed the medical team to diagnose him with bromism, a toxic accumulation of bromide.
In most healthy individuals, bromide levels are usually below about 10 mg/L; this patient's bromide level was measured at 1,700 mg/L, the case report explains.
Bromism was a relatively common condition in the early twentieth century, and it is estimated to have been responsible for up to 8% of psychiatric hospitalizations.
But cases of the condition declined dramatically in the 1970s and 1980s, after drugs containing bromides were gradually phased out.
After the diagnosis, the patient was treated over three weeks and discharged without significant problems.
The main concern in this case study is not so much the return of an old disease; it is that emerging AI technology still falls far short of replacing human expertise when it comes to things that really matter.
"It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and end up fueling the spread of misinformation," write the authors of the article.
"It is highly unlikely that a medical expert would have mentioned sodium bromide when confronted with a patient looking for a viable substitute for sodium chloride."