
Man in US suffers life-threatening poisoning after following ChatGPT diet advice


New Delhi: A man in the United States developed life-threatening bromide poisoning after following diet advice from ChatGPT, in what doctors believe could be the first known case of AI-linked bromism, Gizmodo reported.

The case, documented by University of Washington physicians in the journal Annals of Internal Medicine: Clinical Cases, revealed that the man had consumed sodium bromide for three months, believing it was a safe substitute for the chloride in table salt. The suggestion reportedly came from ChatGPT, which offered no warning about the dangers.

Bromide compounds, once used in medicines for anxiety and insomnia, were withdrawn decades ago because of severe health risks. Today they are found mostly in veterinary drugs and industrial products, and human cases of bromism are extremely rare.

The man first sought medical help believing his neighbor was poisoning him. Although some of his vital signs appeared normal, he displayed paranoia, refused water despite being thirsty, and experienced hallucinations. His condition escalated into a psychotic episode, leading to an involuntary psychiatric admission.

He improved after receiving intravenous fluids and antipsychotic medication. Once stable, he told doctors that ChatGPT had suggested bromide as an alternative to table salt.

Although the original chat records were unavailable, doctors later asked ChatGPT the same question and found it again suggested bromide without warning about its toxicity.

Experts say the case highlights the risks of relying on AI for health advice, as it can provide information without adequate context or safety warnings. The man fully recovered after three weeks in hospital and was reported to be in good health during a follow-up visit.
