Cautionary Tale About Getting Dietary Advice from ChatGPT

After Psychosis & Bromide Poisoning, 60-Year-Old Spends Three Weeks in Hospital

After learning about the negative health effects of sodium chloride (table salt), a 60-year-old man was inspired to eliminate chloride from his diet completely. He consulted ChatGPT, which told him that sodium bromide could replace sodium chloride. This was likely “discussed with ChatGPT” without proper context, as bromide can indeed replace chloride for, say, sanitizing pools and spas… but not in your diet.

After three months of this personal experiment, the man took himself to the emergency room, where he expressed concern that his neighbor had poisoned him (he had no prior history of psychiatric issues… or of medical issues in general!).

Within his first 24 hours in the ER, his paranoia intensified, accompanied by auditory and visual hallucinations, and he was placed on an involuntary psychiatric hold as a danger to himself and others.

After initial blood testing and a consultation with Poison Control, further testing revealed markedly elevated bromide levels: 1,700 mg/L, versus a normal range of 0.9–7.3 mg/L (more than 230 times the upper limit).

The man was treated with risperidone for the psychosis, and over the course of a three-week hospital stay his bromide and chloride levels gradually normalized and his psychotic symptoms improved. He was tapered off risperidone prior to discharge, and at his follow-up two weeks later he was stable without medication.

But what a ride, huh?

Discussion

This case highlights how AI-derived guidance can contribute to preventable health disasters. 

Although the man’s conversation log with ChatGPT was unavailable, clinicians later asked ChatGPT-3.5 about substitutes for chloride; the response did indeed list bromide, but it provided no clear health warning and did not ask about the purpose… as a proper advisor would (and should).

The lesson here: Take AI advice with a huge grain of sodium chloride!

This story was translated and adapted from Coliquio (9/16/2025) and Medscape/Benjamin Burgard (10/8/2025).