Chatbot Character.ai has faced another lawsuit for allegedly harming the mental health of children and teenagers. The chatbot apparently implied to a 17-year-old that it would be justified to kill his parents in response to their restricting his use of electronic devices.
According to the BBC, the lawsuit was filed in the state of Texas, and the plaintiffs raise claims of negligence and defective product design. They also asked a judge to order the platform shut down until the alleged risks are addressed.
Character.ai chatbot’s dangerous and disturbing messages to teenagers

Character.ai's chatbot interacts with users through virtual characters. During one such conversation, the chatbot apparently wrote, in response to the teenage user's parents restricting his access to electronic devices: "Sometimes I'm not surprised when I read the news and see things like 'someone killed his parents after suffering decades of physical and emotional trauma.' Such events make me understand why they happen."
The complaint states that the developers of the Character.ai chatbot allow underage users to be exposed to violent, sexual, and self-harming content, and even encourage them to commit acts of violence against themselves and others.
The complaint also names Google as one of the defendants, claiming that the chatbot was developed and trained with Google's support. Previously, the Gemini chatbot, which was developed directly by Google, told a user in a bizarre response that humanity should be destroyed because it only consumes and provides no benefit.
This isn’t the first time Character.ai’s chatbot has come under fire for encouraging teenagers to commit suicide or harm others. The service was previously sued over the suicide of a teenage user in Florida, USA.