
ChatGPT Diet Advice Induces Rare Psychiatric Condition

2025-08-10 · Pocharapon Neammanee · 3 minute read

AI · Health · Psychosis

An AI-Powered Diet With Dire Consequences

A startling case study published in an American College of Physicians journal details how a 60-year-old man developed a rare psychiatric disorder after following diet advice from ChatGPT. The man, who had no prior history of mental illness, landed in the hospital with severe paranoia and hallucinations after making a dangerous dietary substitution based on the AI's suggestions.

From Health Experiment to Medical Crisis

Inspired by nutrition courses he had taken in college, the man embarked on a personal experiment to completely eliminate sodium chloride, or common table salt, from his diet. When his own research turned up only advice on reducing salt intake, he turned to ChatGPT for guidance on eliminating it entirely. According to the researchers, the AI suggested he could replace sodium chloride with sodium bromide. The man purchased sodium bromide online and followed this advice for three months, even distilling his own water at home to avoid any trace of sodium chloride.

While cutting back on excess sodium is often recommended, sodium remains a nutrient essential for basic bodily functions. The AI's suggestion to swap table salt for sodium bromide, a compound that is toxic when ingested in sustained doses, proved perilous.

A man came down with a rare form of psychosis after ChatGPT gave him dangerously bad dietary advice.

The Onset of Severe Psychosis

The man was eventually hospitalized after developing extreme paranoia, believing his neighbor was trying to poison him. Despite being very thirsty, he was suspicious of the water he was offered. His condition deteriorated rapidly. “In the first 24 hours of admission, he expressed increasing paranoia and auditory and visual hallucinations, which, after attempting to escape, resulted in an involuntary psychiatric hold for grave disability,” the study reported.

A Diagnosis from the Past: Bromide Toxicity

Doctors ultimately diagnosed the patient with bromism, or bromide toxicity. This condition is exceptionally rare today but was far more prevalent in the early 20th century, when bromide compounds were used in several over-the-counter sedatives. At its peak, bromism accounted for up to 8% of psychiatric admissions. After weeks of treatment for psychosis, the man was discharged.

The Perils of AI Medical Advice

This case serves as a stark warning about the dangers of relying on AI for medical advice. Experts caution that AI models, while powerful, can lack the nuanced understanding and context required for safe medical guidance. Dr. Margaret Lozovatsky, a pediatrician, told the American Medical Association that AI tools can misinterpret information. “Even if the source is appropriate, when some of these tools are trying to combine everything into a summary, it’s often missing context clues, meaning it might forget a negative,” she explained. “So, it might forget the word ‘not’ and give you the opposite advice.” This case tragically illustrates that risk, turning a health-conscious experiment into a life-threatening ordeal.
