The Character.ai chatbot app recently introduced a new feature called “Parental Insights” that gives parents a weekly summary of how their teens use the platform.
Why it matters:
The service lets users chat with bots based on fictional characters, and it has been sued at least twice by parents of teenagers. The parents allege that the app’s makers bear responsibility for their children’s self-harm or suicide. One of the complaints claims that a chatbot suggested to a child that it would be acceptable to kill his or her parents.
How it works:
The new tool sends parents a weekly email that includes:
- The teen’s average daily time on the platform (on web and mobile)
- The characters the teen interacts with most
- Time spent with each character
Important: the report does not include the content of the chats themselves; a rough sketch of the kind of data it does cover follows below.
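Character.ai has not published a schema for these reports, so the following Python is a minimal, purely hypothetical sketch of the fields the bullets above describe; every name and the overall structure are assumptions for illustration only.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class CharacterTime:
    # One of the characters the teen interacted with most, and how long.
    character_name: str
    minutes_spent: int


@dataclass
class WeeklyParentalInsights:
    # Hypothetical fields for a weekly digest; not Character.ai's actual schema.
    parent_email: str
    avg_daily_minutes_web: float
    avg_daily_minutes_mobile: float
    top_characters: List[CharacterTime]
    # Note: no chat content is included, per the feature's description.


def format_summary(report: WeeklyParentalInsights) -> str:
    """Render a plain-text summary roughly like a weekly digest email."""
    lines = [
        f"Average daily use: {report.avg_daily_minutes_web:.0f} min (web), "
        f"{report.avg_daily_minutes_mobile:.0f} min (mobile)",
        "Top characters:",
    ]
    lines += [
        f"  - {c.character_name}: {c.minutes_spent} min"
        for c in report.top_characters
    ]
    return "\n".join(lines)
```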
“The version being released today is a first step and will evolve over time,” Character.ai said in a blog post. “This capability encourages parents to have an open conversation with their children about how they use the app,” said Erin Teague, Character.ai’s chief product officer.
For parents to use the tool, the teenager must opt in to the feature and enter a parent’s email address. Character.ai also has an age requirement: users must be at least 13 years old.
Over the past year, the company says it has taken steps to protect teenage users, including introducing a dedicated model for users under 18 and improving systems that detect and intervene in harmful behavior, whether it comes from the human or the chatbot.
Some experts see parental controls as more of a “Band-Aid on a bullet wound,” a surface fix for a deeper problem. “Excessive focus on extreme cases such as suicide diverts attention from broader risks such as emotional dependence on the technology,” says Julia Freeland Fisher, director of education research at the Clayton Christensen Institute. “The narratives being heard these days are very extreme,” she told Axios. “That leads parents to think, ‘this isn’t about my kid.’”
Still, Fisher believes a tool that gives parents visibility can be useful. According to an OpenAI study, users who used chatbots most heavily reported more negative effects on their mental health.
“If parents can see heavy use and know that it is linked to mental health risks, then this tool can be really useful,” she said.