A US medical journal has warned against using ChatGPT for health information after a man developed a rare condition following an interaction with the chatbot about removing table salt from his diet.

An article in the Annals of Internal Medicine reported a case in which a 60-year-old man developed bromism, also known as bromide toxicity, after consulting ChatGPT.

The article described bromism as a “well-recognised” syndrome in the early 20th century that was thought to have contributed to almost one in 10 psychiatric admissions at the time.

Man eliminated salt from his diet

The patient told doctors that after reading about the negative effects of sodium chloride, or table salt, he consulted ChatGPT about eliminating chloride from his diet and started taking sodium bromide over a three-month period. This was despite reading that “chloride can be swapped with bromide, though likely for other purposes, such as cleaning”. 

The article’s authors, from the University of Washington, said the case highlighted “how the use of artificial intelligence can potentially contribute to the development of preventable adverse health outcomes”.

When they consulted ChatGPT themselves about what chloride could be replaced with, the response also included bromide, did not provide a specific health warning and did not ask why the authors were seeking such information — “as we presume a medical professional would do”, they wrote.

‘AI chatbots could fuel misinformation’ 

The authors warned that ChatGPT and other AI apps could “generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation”.

OpenAI, the company behind ChatGPT, announced an upgrade to the chatbot last week and claimed one of its biggest strengths was in health.

It said ChatGPT — now powered by the GPT-5 model — would be better at answering health-related questions and would also be more proactive at “flagging potential concerns”, such as serious physical or mental illness.

However, it emphasised that the chatbot was not a replacement for professional help. 

The journal article, which was published last week before the launch of GPT-5, said the patient appeared to have used an earlier version of ChatGPT.

The authors said the bromism patient presented at a hospital claiming his neighbour might be poisoning him. He also said he had multiple dietary restrictions. Despite being thirsty, he was noted to be paranoid about the water he was offered.

He tried to escape the hospital within 24 hours of being admitted and, after being sectioned, was treated for psychosis. Once he had stabilised, the patient reported several other symptoms consistent with bromism, such as facial acne, excessive thirst and insomnia.