Daily Mail UK

Warning issued for people using AI chatbots for medical advice: Major study found information given by ChatGPT, Gemini and Grok is often inaccurate

AI chatbots consistently give 'highly problematic' medical advice, with few caveats or disclaimers, that could pose a substantial risk to users, experts have warned.
