
ChatGPT dietary advice sends man to hospital with dangerous chemical poisoning
Fox News
ChatGPT's dangerous dietary recommendation led to a man replacing table salt with toxic sodium bromide for three months, causing severe poisoning and hospitalization.
When ChatGPT suggested swapping sodium chloride (table salt) for sodium bromide, the man made the replacement for a three-month period, although, the journal article noted, the recommendation likely referred to bromide for other purposes, such as cleaning.

"These are language prediction tools — they lack common sense and will give rise to terrible results if the human user does not apply their own common sense."

In a statement, OpenAI said, "Our terms say that ChatGPT is not intended for use in the treatment of any health condition, and is not a substitute for professional advice."

Melissa Rudy is senior health editor and a member of the lifestyle team at Fox News Digital.