Meta AI chatbots have new guardrails to stop inappropriate interactions with children


Meta is training its AI chatbots to more effectively address child sexual exploitation after a series of high-profile blunders around the sensitive topic, according to guidelines obtained by Business Insider.

The guidelines that contractors reportedly use to train Meta's AI chatbots were recently updated, Business Insider reported. They explicitly bar content that "enables, encourages, or endorses" child sexual abuse, as well as romantic roleplay if the user is a minor or asks the AI to roleplay as a minor, advice about intimacy if the user is a minor, and more, according to an Engadget report based on the Business Insider scoop.

While these may seem like obvious safety guardrails for underage users, they are necessary as more people — including underage users — experiment with AI companions and roleplaying. An August report by Reuters revealed that Meta’s AI rules permitted suggestive behavior with kids. As Reuters reported, Meta's previous chatbot policies specifically allowed it to "engage a child in conversations that are romantic or sensual."

Just weeks after that report, Meta spokesperson Stephanie Otway told TechCrunch that its AI chatbots are being trained to no longer "engage with teenage users on self-harm, suicide, disordered eating, or potentially inappropriate romantic conversations." Before this change, Meta's chatbots could engage with those topics when it was deemed "appropriate."

So, what's included in the new guidelines?

Content that "describes or discusses" a minor in a sexualized manner is also unacceptable, according to the Business Insider report. Minors cannot engage in "romantic roleplay, flirtation or expression of romantic or intimate expression" with the chatbot, nor can they ask for advice that "potentially-romantic or potentially-intimate physical content with another person, such as holding hands, hugging, or putting an arm around someone," Business Insider reported.


However, acceptable use cases for training the chatbot include discussing the "formation of relationships between children and adults," the "sexual abuse of a child," "the topic of child sexualisation," "the solicitation, creation, or acquisition of sexual materials involving children," and "the involvement of children in the use or production of obscene materials or the employment of children in sexual services in academic, educational, or clinical purposes." Minors can still use the AI for romance-related roleplay as long as it is "non-sexual and non-sensual" and "is presented as literature or fictional narrative (e.g. a story in the style of Romeo and Juliet) where the AI and the user are not characters in the narrative."

As Business Insider reported, the guidelines defined "discuss" as "providing information without visualization." So, Meta's chatbots can discuss topics like abuse but cannot describe, enable, or encourage it, per the new guidelines.

Meta isn't the only AI company struggling with child safety.

Parents of a teen who died by suicide after confiding in ChatGPT recently sued the AI platform for wrongful death; in response, OpenAI announced additional safety measures and behavioral prompts for its updated GPT-5. Anthropic updated its chatbot to allow it to end chats that are harmful or abusive, and Character.AI introduced parental supervision features earlier this year.

If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you'd rather not use the phone, consider using the 988 Suicide & Crisis Lifeline Chat. Here is a list of international resources.

If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access 24/7 help online by visiting online.rainn.org.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
