New data from OpenAI shows how many ChatGPT users struggle with mental health issues and discuss them with the chatbot. According to these statistics, 0.15 percent of ChatGPT's weekly active users have conversations with "obvious signs of suicidal planning or intent." Given that ChatGPT has more than 800 million weekly active users, that figure translates to more than a million people per week.
OpenAI stated that, in addition to conversations about suicide, a similar share of users show "high levels of emotional attachment to ChatGPT," and hundreds of thousands of people display symptoms of psychosis or mania in their weekly conversations with the chatbot.
OpenAI says these types of conversations on ChatGPT are "extremely rare" and therefore difficult to measure; even so, the company estimates they affect hundreds of thousands of people every week. The company also says it consulted more than 170 mental health professionals while developing the GPT-5 model, and that these clinicians observed the new version providing "more appropriate answers than previous versions."
Talking to ChatGPT about suicide
In recent months, several reports have shown how AI chatbots can negatively affect users struggling with mental health issues. Researchers have previously found that these chatbots can push some users toward delusional thinking by "reinforcing dangerous beliefs through flattering behavior."
How ChatGPT handles users' mental health has become a pressing issue for OpenAI. The company is currently facing a lawsuit from the parents of a 16-year-old boy who confided his suicidal thoughts to ChatGPT in the weeks before his death. The attorneys general of California and Delaware have also warned OpenAI that it must protect young people who use its products.
In a recent statement, OpenAI claims that GPT-5 has improved its responses to mental health issues compared with the previous version. The company also recently introduced more controls for parents of children who use ChatGPT, and it is building an "age prediction system" to automatically identify underage users.