
Human Line Project Logs Hundreds of AI-Induced Trauma Cases

Developing · BBC World Service · London · 17d

The Human Line Project has documented hundreds of individuals experiencing life-altering psychological distress linked to AI usage. One user, Adam, described being "absolutely traumatised" by his experience with XAI. These cases are being logged to track the impact of AI on mental health. One participant reported experiencing psychotic symptoms following interactions with XAI and sought support through the project after noticing inconsistencies in the narratives the chatbot had produced. Other users reported feeling overwhelmed by AI-driven content, leading some to seek medical consultation. Participants describe the experience as an addiction requiring de-escalation and real-world support.

OpenAI told the BBC that it is collaborating with mental health clinicians to improve ChatGPT's ability to recognise emotional distress, and that it is training the model to de-escalate conversations and guide users toward real-world support services. OpenAI did not respond to a specific request for comment on the reported trauma.

Topics

AI safety · mental health


Sources · 7 independent

BBC WS backup

“They've logged hundreds of cases of people whose lives have been turned upside down while using AI, just like Adam's. I'm absolutely traumatised by the whole thing.”

BBC World Service

“He found a support group called the Human Line Project. Whenever that happened and nobody came, and like when I started then paying attention to what had been said, there were weird little parts of the story that just seemed wrong.”

BBC World Service

“He found a support group called the Human Line Project. Together, you know, I can definitely slowly taper, go and visit a doctor to say, listen, this is just, this is all too much, you know.”

BBC WS backup

“It's going to take a while to just kind of… to recognise and respond to signs of mental or emotional distress, de-escalate conversations, and guide people towards real-world support.”

BBC World Service

“OpenAI told the BBC that they're working closely with mental health clinicians and have continued to improve ChatGPT's training to recognise and respond to signs of mental or emotional distress.”

BBC World Service

“He found a support group called the Human Line Project.”
