As the crisis of access to mental health services intensifies in many countries, generative artificial intelligence has been promoted as a rapid, 24-hour replacement for human counseling. However, experts warn that continuous use of chatbots for emotional and psychological consultation can have worrying consequences.
An Australian psychologist, in an interview with the Guardian, gives the example of "Trent": a man who, to solve problems in his relationship, would run the text of his messages through ChatGPT before talking to his wife. The answers were coherent and logical, but according to his wife, he no longer had his "real voice". Over time, dependence on the tool led Trent to turn first to artificial intelligence for any social or emotional decision and to lose the ability to trust himself.
Psychologists say chatbots, with their friendly tone and constant availability, can reinforce behavior resembling reassurance seeking or avoidance of difficult emotions, especially in people with anxiety, obsessive tendencies, or prior psychological harm. This dependency not only prevents the development of coping skills, but may also transfer personal information to companies that are not bound by the confidentiality rules governing therapy.
In addition, language models can sometimes provide inaccurate or misleading information, because their answers are based on predicting the next word, not on deep human understanding.
Experts emphasize that artificial intelligence can play a complementary role in training or in quick access to psychological information, especially in areas where human therapists are scarce. But it is no replacement for the human interaction, empathy, and probing questions of real therapists.
According to the psychologist, "good treatment is full of ambiguity and gradual discovery, not complete, ready-made answers." After therapy, Trent learned that sometimes incomplete, hesitant messages are more human and effective than any polished text from artificial intelligence.