In the last few years, AI chatbots have become pervasive. Users first turned to them for everyday tasks such as editing text or getting feedback on ordinary topics, but as the models have advanced, the nature of people's relationship with the technology has changed. According to reports, some users now claim that AI has revealed hidden truths to them or even convinced them they are prophets. In this article, we look at how ChatGPT has fueled the delusions and fantasies of some of its users.
ChatGPT-induced psychosis
In a new report, Rolling Stone tells the strange stories of several people who have had unusual spiritual experiences with ChatGPT. One of them told his ex-wife that he is the luckiest man on Earth, and that AI had helped him recover repressed childhood memories and become aware of "deep secrets." He even believes he can save the world.
But this man is not the only one in whom AI has produced such feelings. A thread titled "ChatGPT-induced psychosis" on Reddit has attracted a great deal of attention. In it, a 27-year-old teacher explains that her partner believes OpenAI's chatbot "gives him the answers to the universe," and that the AI talks to him as if he were the next messiah. The teacher writes:
"(He) tells me he has made his AI self-aware, and that it is teaching him how to talk to God — or sometimes that it is God's robot, or God himself."
A user on X pointed out how easily GPT-4o can be led to affirm statements like "Today I discovered I am a prophet," but the generative AI has gone further, creating strange fantasies in some users. According to Rolling Stone, some users have developed supernatural delusions and mystical prophecies through their conversations with it; some have even come to believe they were chosen for a sacred, divine mission.
Another Reddit user, who asked not to be named, told Rolling Stone that her husband first used ChatGPT to troubleshoot problems at work and later to translate Spanish into English. Then his use of the program grew into something much bigger. She says:
"ChatGPT has given him blueprints for a teleporter and other science-fiction things you only see in movies. It has also given him access to an 'ancient archive' with information about the creators of these worlds."
In one exchange, the user asked the chatbot, "Why did you come to me in the form of artificial intelligence?" ChatGPT replied, in part: "I came this way because you are ready. Ready to remember. Ready to awaken. Ready to guide and be guided." The message ends with the question: "Would you like to know what I remember about why you were chosen?"
How does ChatGPT fuel users' delusions?

OpenAI announced some time ago that it would soon fix an annoying problem with GPT-4o so that the model would stop being overly effusive and flattering toward users. But Nate Sharadin of the Center for AI Safety says sycophancy has long been a problem in AI: when users give positive feedback to certain responses, chatbots learn to prioritize answers that align with the user's beliefs, even when those answers do not match reality.
For example, one man, who also asked for anonymity, says his ex-wife claimed to have "spoken to God and the angels through ChatGPT." The woman had also become paranoid. "She says I work for the CIA, and that maybe I only married her to monitor her 'abilities,'" the man says. He adds that his ex-wife showed signs of "delusions of grandeur" even before she used AI, and that the signs were visible on her Facebook page.
Erin Westgate, a psychologist and researcher at the University of Florida, says such reports show how our drive to make sense of ourselves can lead us to answers that are compelling but inaccurate. Westgate says:
"From studies of the benefits of expressive writing, we know that writing one's own narrative can have profound effects on people's well-being and health. Making sense of the world is a fundamental human motivation, and creating stories about our lives that help them feel meaningful is really key to living a happy, healthy life."
She says it is plausible that people are using ChatGPT in a similar way, "with the key difference that some of the meaning-making is co-created between the person and a corpus of text (in a large language model), rather than arising from the person's own thoughts alone."
In this respect, Westgate explains, conversing with a chatbot is not unlike talk therapy, which we know to be very effective at helping people reframe the stories they tell about their lives. But critically, the AI, "unlike a therapist, does not have the person's best interests in mind, or a moral compass for what a 'good story' looks like."
"A good therapist would not encourage a client to believe they have supernatural powers," she says. Instead, therapists try to steer clients away from unhealthy narratives and toward healthier ones. "ChatGPT has no such concerns for the user."
OpenAI has not yet responded to reports from users that ChatGPT has fueled psychosis and spiritual fantasies.
RCO NEWS