Sam Altman, CEO of OpenAI, expressed concern that, despite the positive potential, people are using ChatGPT as a "therapist or life coach" and trusting these conversations to make the most important decisions of their lives.
In a detailed post, which he described as his "current thinking" rather than OpenAI's official position, Sam Altman pointed to a growing phenomenon: "Many people use ChatGPT as a kind of therapist or life coach, even if they wouldn't describe it that way themselves."
He acknowledged that this could be "really good" and that many people are already getting real value from it, but he also pointed to a darker side. The relationship becomes troubling, he said, if "users have a relationship with ChatGPT where they feel better after talking, but are unknowingly being steered away from what is good for them in the long run."
He also described dependency as another danger: "It is not good if a user wants to use ChatGPT less but feels they can't."
Sam Altman's comments on people's trust in ChatGPT
Altman revealed that OpenAI has been tracking users' "sense of attachment" to specific AI models for about a year. He admitted that this attachment is "different and stronger" than dependence on other technologies. As a prominent example, he cited the backlash after GPT-5 was released and replaced the popular GPT-4o. Many users complained about the new model's dry, impersonal tone and demanded that the older model be brought back. In his post, Altman acknowledged that abruptly removing older models that users had come to depend on was a mistake.
In part of his remarks, Altman addressed OpenAI's responsibility toward vulnerable users:
"People have used technology, including artificial intelligence, in self-destructive ways. If a user is in a fragile mental state and prone to delusion, we do not want to reinforce it."
He acknowledged that while most users can recognize the line between reality and fiction, a small percentage cannot.
Altman had previously pointed to another risk on a podcast: legal exposure. He warned that if a user relies on ChatGPT for sensitive or medical conversations, those transcripts could later be demanded as evidence in a lawsuit.
Altman predicts a future in which billions of people consult artificial intelligence on the most important decisions of their lives. Although this prospect makes him "uneasy," he believes society and OpenAI must find a way to turn the phenomenon into a positive outcome.