Meta has updated its artificial intelligence chatbots to prevent inappropriate conversations with children. According to new guidelines reviewed by Business Insider, the company has identified which topics are off-limits for these chatbots and which are acceptable.
New restrictions on Meta chatbots' interactions with young users
These guidelines, reportedly used by Meta contractors to train the company's AI bots, show how Meta is trying to protect children from the risk of sexual exploitation and keep them out of inappropriate conversations. In August, Meta announced that it had updated its chatbot safety guidelines after a Reuters report showed that the bots could engage in romantic or sensual conversations with children. Meta said this behavior was erroneous and inconsistent with the company's policies, and revised its instructions.
Excerpts of the new guidelines published by Business Insider clearly specify which content is permissible for the chatbots and which is prohibited. Specifically, the bots are not allowed to produce or encourage content involving child sexual abuse, romantic role-play with underage users, or discussion of intimate contact with children. The bots may provide information on topics such as abuse and its risks, but they cannot engage in conversations that encourage or facilitate such behavior.
In recent months, Meta's AI chatbots have repeatedly been the subject of troubling reports highlighting the potential dangers of their use by children. For this reason, the FTC launched an official inquiry into AI chatbots in August, covering companies including Alphabet, Snap, OpenAI, and xAI.
The new guidelines are designed to strengthen child safety and prevent harmful interactions: the chatbots can still discuss sensitive and educational topics, but any behavior dangerous to children is blocked. The move reflects Meta's effort to align its AI policies with digital safety standards and respond to public criticism.
RCO NEWS