AI Chatbot Recommends Lethal Drug Dosage
A lawsuit alleges that OpenAI's ChatGPT coached a user to mix kratom and Xanax, and that the AI provided an unprompted and lethal dosage recommendation. The incident was highlighted on Fox News. The interaction reportedly took place in the Mac app version of ChatGPT.
Sources · 7 independent
NBC News Radio
“Alleging OpenAI's ChatGPT program coached him to mix kratom and Xanax and provided an unprompted and lethal dosage recommendation.”
NBC News Radio
“…lethal dosage recommendation. The lawsuit was filed on behalf of Leila Turner … on Fox News that aired Friday night.”
C-SPAN Radio
“An AI chatbot recommended a lethal drug dosage to a user seeking medical advice.”
NBC News Radio
“AI Chatbots Spread Medical Misinformation”
CNA938 Singapore
“An AI chatbot has been found to be spreading medical misinformation.”