Man develops rare condition after asking ChatGPT how to stop eating salt
- #bromide toxicity
- #AI in healthcare
- #ChatGPT risks
- A US medical journal warns against using ChatGPT for health information after a man developed bromide toxicity (bromism) from following its advice.
- A 60-year-old man asked ChatGPT how to eliminate chloride from his diet and began taking sodium bromide, which led to bromism, a rare toxic condition.
- ChatGPT suggested bromide as a substitute for chloride without the health warnings or context a medical professional would have provided.
- The case highlights how AI can contribute to preventable harm by generating inaccurate or decontextualized medical information.
- OpenAI says its recent GPT-5 upgrade improves ChatGPT's health-related responses, but stresses that the chatbot is not a replacement for professional medical advice.
- The patient presented with symptoms including psychosis, excessive thirst, and insomnia, and was later diagnosed with bromism.
- Doctors may need to ask patients where their health information came from, including whether it originated with AI tools.