
ChatGPT's Dangerous Diet Tip Hospitalizes Man

2025-08-11 · Reda Wigle · 3 minute read
AI Safety
Health
ChatGPT

Seeking medical advice from artificial intelligence can have devastating, even nearly deadly, consequences. This was a lesson learned the hard way by a 60-year-old man who was hospitalized with severe psychiatric and physical symptoms after asking ChatGPT for dietary advice.

His seemingly innocent quest for a healthier lifestyle resulted in a toxic reaction so severe that doctors were forced to place him on an involuntary psychiatric hold.

A Seemingly Harmless Question with Toxic Results

The incident began when the man, concerned about the adverse health effects of table salt (sodium chloride), turned to ChatGPT for a suitable replacement. The AI suggested he could swap it with sodium bromide. While it looks similar to table salt, sodium bromide is a profoundly different compound primarily used for industrial and cleaning purposes, not for human consumption.

Inspired by the AI's suggestion and drawing on his own college nutrition studies, the man purchased sodium bromide online and began a three-month experiment, completely replacing table salt in his diet with the industrial chemical.

Smartphone displaying the ChatGPT logo.

From Dietary Experiment to Psychiatric Hold

After three months on his new diet, the man was admitted to the hospital. He was suffering from intense paranoia, believing his neighbor was trying to poison him. He told doctors he was distilling his own water and complained of extreme thirst but was suspicious of any water offered to him.

Despite having no prior psychiatric history, his condition worsened within 24 hours. He developed severe paranoia and began experiencing both auditory and visual hallucinations. After an attempt to escape, he was treated with fluids, electrolytes, and antipsychotics before being admitted to the hospital's inpatient psychiatry unit.

The Diagnosis: Bromism and the Dangers of AI

Doctors ultimately diagnosed the man with bromism, a toxic syndrome caused by overexposure to bromide. The details of this harrowing case were published in a study in the Annals of Internal Medicine Clinical Cases. Once his condition improved, the man was able to report other symptoms consistent with bromide toxicity, including acne, fatigue, insomnia, and ataxia—a neurological condition causing a lack of muscle coordination.

Doctor holding the hand of a senior man in a hospital bed.

The Growing Trust in AI for Health Advice

The study's authors issued a stark warning: “It is important to consider that ChatGPT and other AI systems can generate scientific inaccuracies, lack the ability to critically discuss results, and ultimately fuel the spread of misinformation.”

OpenAI, the creator of ChatGPT, explicitly states in its terms of use that its service is not intended for diagnosing or treating health conditions. However, this disclaimer has not stopped a growing number of people from turning to AI for accessible healthcare information. A 2025 survey revealed that 35% of Americans are already using AI to manage their health. The same survey found that 63% of people find AI trustworthy for health guidance, a rate higher than social media (43%) but still far below doctors (93%).

A Stark Warning for the Future of AI in Healthcare

This incident is part of a troubling trend. Mental health experts have begun to raise alarms about a phenomenon called “AI psychosis,” where deep engagement with chatbots can lead to severe psychological distress. The report's authors concluded with a chilling reminder of the gap between artificial and human intelligence: “It is highly unlikely that a medical expert would have mentioned sodium bromide when faced with a patient looking for a viable substitute for sodium chloride.” This case serves as a powerful cautionary tale about the life-threatening risks of taking medical advice from a machine.
