News
Man Almost Poisons Himself Following ChatGPT's Advice on Removing Salt from Diet; What is Bromism?
So, when he used ChatGPT, the bot told him chloride could be swapped for bromide – and he replaced the salt in his diet with ...
Bromism was once so common that it was blamed for "up to 8% of psychiatric admissions," according to a recently published paper on ...
A 60-year-old man was hospitalized with toxicity and severe psychiatric symptoms after asking ChatGPT for tips on how to improve his diet.
Many people have turned to platforms like ChatGPT in search of advice to help them navigate life. However, there are plenty of anecdotes that have shown ...
After the escape attempt, the man was placed on an involuntary psychiatric hold and given an antipsychotic drug. He was administered ...
A new case report warns that relying on AI for diet advice can be dangerous, as a man replaced salt with sodium bromide and ...
Futurism on MSN
Man Follows ChatGPT's Advice and Poisons Himself
A man trying to cut out salt from his diet learned the hard way that ChatGPT isn't to be trusted with medical advice after ...
A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice, according to a case published ...
A 60-year-old man who turned to ChatGPT for advice replaced the salt in his diet and consumed a substance that gave him a neuropsychiatric illness called bromism.
Recently, an elderly man from New York relied on ChatGPT for a healthy diet plan, but ended up in the hospital with a rare ...
A 60-year-old man was hospitalized after following ChatGPT’s advice to remove salt from his diet and replace it with toxic ...
In an age where AI solutions are just a click away, a man's harrowing experience underscores the urgent need for discernment ...