News
Bromism was once so common it was blamed for "up to 8% of psychiatric admissions" according to a recently published paper on ...
A new case warns that relying on AI for diet advice can be dangerous, as a man replaced salt with sodium bromide and ...
Many people have turned to platforms like ChatGPT in search of advice to help them navigate life. However, there are plenty of anecdotes that have shown ...
A 60-year-old man was hospitalized after following ChatGPT’s advice to remove salt from his diet and replace it with toxic ...
The patient, hospitalised for three weeks, sought AI advice on salt alternatives; the chatbot suggested bromide, which he ...
A 60-year-old man was hospitalized with toxicity and severe psychiatric symptoms after asking ChatGPT for tips on how to improve his diet.
Recently, an elderly man from New York relied on ChatGPT for a healthy diet plan, but ended up in the hospital with a rare poisoning. These cases raise serious concerns about relying on AI for medical ...
A case report has described an incident in which a 60-year-old man seeking to make a dietary change consulted ChatGPT and ...
A man nearly poisoned himself after following ChatGPT’s advice to cut salt, using sodium bromide for 3 months from an online ...
A 60-year-old man who turned to ChatGPT for advice removed salt from his diet and consumed a substance that gave him the neuropsychiatric illness called bromism.
The case involved a 60-year-old man who, after reading reports on the negative impact excessive amounts of sodium chloride (common table salt) can have on the ...
In an age where AI solutions are just a click away, a man's harrowing experience underscores the urgent need for discernment ...