Daily Mail UK
Warning issued for people using AI chatbots for medical advice: Major study found information given by ChatGPT, Gemini and Grok is often inaccurate
AI chatbots consistently give 'highly problematic' medical advice that could pose a substantial risk to users, often with few caveats or disclaimers, experts have warned.