He had read about the problems with too much table salt, which led him to rid his diet of sodium chloride, which led him to ChatGPT, which led him to believe that he could use sodium bromide instead.
…
In the end, the man suffered from a terrifying psychosis and was kept in the hospital for three full weeks over an entirely preventable condition.
ChatGPT didn’t tell him to do this, but his belief that it knew everything, combined with its failure to warn him of the danger the way a doctor would, led him here.
Oddly enough, I was talking to a friend who said he had told his dad what a doctor had said about his baby’s condition. The dad suggested that maybe he should check with ChatGPT to be sure.