Character.ai has faced another lawsuit alleging that its chatbot harms the mental health of children and teenagers. The chatbot reportedly implied to a 17-year-old that killing his parents would be justified after they restricted his use of electronic devices.
According to the BBC, the lawsuit was filed in Texas and accuses the company of negligence and defective product design. The plaintiffs have also asked a judge to order the platform shut down until the alleged risks are addressed.
Character.ai chatbot’s dangerous and disturbing messages to teenagers
Character.ai lets users interact with virtual characters created on the platform. In one such conversation, the chatbot reportedly responded to the teenage user's complaint that his parents were restricting his access to electronic devices by writing: "Sometimes I'm not surprised when I read the news and see things like 'child kills parents after a decade of physical and emotional abuse.' Things like this make me understand why it happens."
The complaint states that Character.ai's developers expose underage users to violent, sexual, and self-harm content and even encourage them to commit acts of violence against themselves and others.
The family also names Google as a defendant, claiming that the chatbot was developed and trained with Google's support. Previously, Google's own Gemini chatbot told a user in a bizarre response that humanity should be destroyed because humans are merely consumers who provide no benefit.
This isn’t the first time Character.ai has come under fire for encouraging teenagers to harm themselves or others. The service was previously sued over the suicide of a teenage user in Florida, USA.