As the crisis of access to mental health services intensifies in many countries, generative artificial intelligence has been promoted as a fast, 24-hour substitute for human counseling. However, experts warn that the continued use of chatbots for emotional and psychological consultation can have worrying consequences.
In an interview with the Guardian, an Australian psychologist cited the example of “Tren”, a man who, to work through problems in his relationship, would have ChatGPT draft his messages before talking to his wife. The responses were coherent and logical, but according to his wife, they no longer carried his “real voice”. Over time, dependence on the tool led Tren to turn to artificial intelligence first for every social or emotional decision and to lose the ability to trust himself.
Psychologists say chatbots, with their friendly tone and constant availability, can reinforce patterns such as reassurance-seeking or avoidance of difficult emotions, especially in people with anxiety, obsessive tendencies, or psychological trauma. This dependency not only hinders the development of coping skills, but may also hand personal information over to companies that are not bound by the confidentiality rules governing therapy.
In addition, language models can sometimes provide inaccurate or misleading information, because their answers are based on predicting the next words, not on deep human understanding.
Experts emphasize that artificial intelligence can play a complementary role in education or in providing quick access to psychological information, especially in areas where human therapists are scarce. But it is no substitute for the human interaction, empathy, and probing questions of real therapists.
As the psychologist put it, “good therapy is full of ambiguity and gradual discovery, not complete, ready-made answers.” After therapy, Tren learned that an imperfect, hesitant message can sometimes be more human and more effective than any polished text from artificial intelligence.