Meta has updated its artificial intelligence chatbots to prevent inappropriate conversations with children from occurring. According to new guidelines reviewed by Business Insider, the company has identified which topics are controversial for these bots and what content is unacceptable.
New restrictions on Meta chatbots that interact with young users
These guidelines, reportedly used by Meta contractors to train the artificial intelligence bots, show how Meta is attempting to protect children from the risk of sexual exploitation and to keep them from entering inappropriate conversations. In August, Meta announced that it had updated the bots' safety guidelines after a Reuters report showed that the bots could engage in romantic or sensual conversations with children. Meta said this behavior was a mistake, inconsistent with the company's policies, and that it had changed its instructions.

Part of the text of the new guidelines published by Business Insider clearly specifies what content is permissible for the bots and what content is prohibited. Specifically, the bots are not allowed to produce or encourage content that includes child sexual abuse, romantic role-play with young users, or discussion of intimate contact with children. The bots may, however, provide information on topics such as abuse and its risks, but they cannot enter conversations that would encourage or facilitate such behavior.
In recent months, Meta's artificial intelligence bots have repeatedly been the subject of worrying reports emphasizing the potential dangers of their use by children. For this reason, the FTC launched a formal inquiry into artificial intelligence chatbots in August, covering companies such as Alphabet, Snap, OpenAI, and X.AI.
The new guidelines are designed to enhance child safety and prevent harmful interactions, allowing the bots to discuss sensitive and educational topics while blocking any behavior that endangers children. Meta's move reflects the company's effort to align its artificial intelligence policies with digital safety standards and to respond to public criticism.
