Sam Altman, CEO of OpenAI, has warned users in a new interview that the artificial intelligence industry has still not found a way to fully protect privacy in sensitive conversations. According to him, there is no equivalent of doctor-patient confidentiality when you use chatbots such as ChatGPT for therapy or emotional support.
The OpenAI CEO made these statements on the podcast "This Past Weekend". In response to a question about how artificial intelligence interacts with current legal systems, Altman said one of the main problems is the lack of a legal framework or specific policy for artificial intelligence: users' conversations with the technology have no legal confidentiality.
ChatGPT's weakness when used as a therapist
He says:
"People share the most private issues of their lives with ChatGPT. People, and especially young people, use it as a therapist and a life coach and bring it their emotional problems. But if you talk to a (real) therapist, lawyer, or doctor, your conversation is legally protected. We still don't have such a framework for ChatGPT."
According to Altman, if a legal complaint is filed, this could become a serious concern for users' privacy, as OpenAI is currently legally obliged to preserve the content of these conversations if required.

He said in this part of his remarks:
"If a user talks to ChatGPT about a very sensitive subject and then gets involved in a lawsuit, we may be legally obliged to provide that information. In my opinion, these conditions are worrying. User conversations with chatbots should have the same level of privacy one expects when interacting with a physician or therapist."
For example, OpenAI is currently subject to a court order in the New York Times case forcing the company to store the conversations of hundreds of millions of ChatGPT users around the world (except Enterprise users). OpenAI said in a statement on its website that it has objected to the ruling, calling it an "exaggeration".