The AI Told Him to Switch Salt… Three Weeks Later, He Was in a Psych Ward
- The Founders
- Aug 14
- 3 min read
So, picture this: you’re 60, you’re into healthy eating, maybe a bit obsessed with optimising your diet — the kind of person who distils their own drinking water because, you know, “tap water is full of stuff.” You decide salt is evil. Not just the sodium part, but the chloride too. And instead of asking your GP… you ask a chatbot.
That’s where it gets messy.

This guy — a real human being, not a cautionary tale from a health blog — ends up replacing regular table salt with sodium bromide. Bought it online, no questions asked. Why? Because ChatGPT (version 3.5 or 4.0, apparently) mentioned it as a possible alternative. Probably in a cleaning or chemical context, but still. No health warning. No “don’t eat this, mate.” Just… there you go.
From Kitchen Experiment to Hospital Ward
He turns up at a hospital in Seattle claiming his neighbour’s trying to poison him. That alone should’ve been a red flag — paranoia, confusion. But his physical exam? Fine. Blood tests, though, were screaming “something’s off.” His chloride levels were through the roof, but sodium? Totally normal.
Now, if you’ve ever worked in a lab (or binged too much House, M.D.), you’ll know that test machines can get tricked. Bromide is chemically similar enough to chloride that standard chloride assays count it as chloride, so the result comes back falsely high. That’s exactly what happened here: the “sky-high chloride” was largely bromide wearing a disguise.
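For the number-crunchers, here’s a minimal sketch of why the bloodwork looked so strange. The formula is the standard anion gap clinicians use to sanity-check electrolyte panels; the values below are hypothetical, illustrative numbers, not the patient’s actual chart.

```python
# Anion gap: a routine sanity check on electrolyte panels (all in mEq/L).
def anion_gap(sodium, chloride, bicarbonate):
    """Anion gap = Na+ - (Cl- + HCO3-)."""
    return sodium - (chloride + bicarbonate)

# A typical healthy panel: the gap lands in the normal ~8-12 range.
print(anion_gap(sodium=140, chloride=104, bicarbonate=24))  # 12

# Now the analyzer counts circulating bromide as chloride, so the
# reported "chloride" is inflated while sodium stays normal. The gap
# collapses and can even go negative: a classic tip-off for bromism.
print(anion_gap(sodium=140, chloride=150, bicarbonate=24))  # -34
```

A negative anion gap is rare enough in real life that it tells the team something is being measured that shouldn’t be in the blood at all, which is what pointed the doctors towards bromide.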
Doctors start digging, call the poison control centre, rule out heavy metals, and finally land on a diagnosis you almost never see anymore: bromide poisoning.
Wait, Bromide? Isn’t That Old-School?
Yeah. Back in the late 19th and early 20th centuries, bromide salts were actually in over-the-counter meds for anxiety and sleep. In small doses, they worked. In slightly bigger ones, they wrecked people — confusion, hallucinations, skin issues, clumsy movements. At one point, they caused about 8% of psychiatric hospital admissions in the U.S.
The FDA banned them from meds in the ’70s and ’80s. Poisoning cases dropped off a cliff. Until… AI brought it back from the dead.
A Downward Spiral
Within 24 hours of arriving at the hospital, the guy was paranoid, hallucinating, trying to escape. He had to be restrained. Given antipsychotics. Hooked up to IV fluids to flush the stuff out of his system.
Later, once he was calmer, he admitted the whole salt-swap experiment. He’d been on an extreme vegetarian diet, low in essential nutrients, and thought removing chloride would be the ultimate health hack. Months before the crisis, he’d already been getting acne, red skin spots, fatigue, insomnia, coordination problems, and insane thirst — all textbook bromide poisoning symptoms.
And all because an AI didn’t stop to say: “Mate, that’s a terrible idea.”
The Bigger Problem Nobody’s Talking About
See, AI chatbots aren’t doctors. They don’t think, they don’t weigh consequences, they don’t read between the lines. They give answers. Sometimes those answers are fine. Sometimes… they put you in a psychiatric ward for three weeks.
The University of Washington doctors who published the case in Annals of Internal Medicine spelled it out:
“A medical professional would almost certainly not suggest sodium bromide as a substitute for table salt.”
Because sodium bromide today? It’s used in industrial processes. Not for dinner.
Why This Matters Now
We’re all leaning harder on AI — not just for directions or holiday tips, but for health advice. And while most of us won’t accidentally reintroduce a 1970s-era poison into our diet, the risk is real: misinformation delivered with confidence.
Doctors say they now have to ask patients if they’ve been acting on chatbot advice. Because, honestly, it’s becoming part of the symptom checklist.
My two cents?
If you’re using AI to “biohack” your body, remember: it’s a search engine in disguise. It’s not your nan, it’s not your GP, and it doesn’t care if your bright idea lands you in hospital.
Or, you know… you could talk to GRACE — your wellness confidant who actually listens, gets to know you, and would never recommend industrial chemicals as a seasoning. She reflects your needs, your state of mind, and helps you make changes that actually make sense — emotionally, mentally, physically.
Not another chatbot. Definitely not a sodium bromide enabler.