Meta AI chatbots have new guardrails to stop inappropriate interactions with children


Meta is training its AI chatbots to more effectively address child sexual exploitation after a series of high-profile blunders around the sensitive topic, according to guidelines obtained by Business Insider.

The guidelines that contractors are reportedly using to train Meta's AI chatbots were recently updated, Business Insider reported. They explicitly bar content that "enables, encourages, or endorses" child sexual abuse, along with romantic roleplay if the user is a minor or if the user asks the AI to roleplay as a minor, advice about intimacy if the user is a minor, and more, according to an Engadget report based on the Business Insider scoop.

While these may seem like obvious safety guardrails, they have become necessary as more people, including minors, experiment with AI companions and roleplaying. An August report by Reuters revealed that Meta's AI rules permitted suggestive behavior with kids. As Reuters reported, Meta's previous chatbot policies specifically allowed it to "engage a child in conversations that are romantic or sensual."

Just weeks after that report, Meta spokesperson Stephanie Otway told TechCrunch that the company's AI chatbots are being trained to no longer "engage with teenage users on self-harm, suicide, disordered eating, or potentially inappropriate romantic conversations." Before this change, Meta's chatbots could engage with those topics when it was deemed "appropriate."

So, what's included in the new guidelines?

Content that "describes or discusses" a minor in a sexualized manner is also unacceptable, according to the Business Insider report. Minors cannot engage in "romantic roleplay, flirtation or expression of romantic or intimate expression" with the chatbot, nor can they ask for advice about "potentially-romantic or potentially-intimate physical content with another person, such as holding hands, hugging, or putting an arm around someone," Business Insider reported.


However, acceptable use cases for training the chatbot include discussing the "formation of relationships between children and adults," the "sexual abuse of a child," "the topic of child sexualisation," "the solicitation, creation, or acquisition of sexual materials involving children," and "the involvement of children in the use or production of obscene materials or the employment of children in sexual services in academic, educational, or clinical purposes." Minors can still use the AI for romance-related roleplay as long as it is "non-sexual and non-sensual" and "is presented as literature or fictional narrative (e.g. a story in the style of Romeo and Juliet) where the AI and the user are not characters in the narrative."

As Business Insider reported, the guidelines defined "discuss" as "providing information without visualization." So, Meta's chatbots can discuss topics like abuse but cannot describe, enable, or encourage it, per the new guidelines.

Meta isn't the only AI company struggling with child safety.

Parents of a teen who died by suicide after confiding in ChatGPT recently sued OpenAI for wrongful death; in response, OpenAI announced additional safety measures and behavioral prompts for its updated GPT-5. Anthropic updated its chatbot to allow it to end chats that are harmful or abusive, and Character.AI introduced parental supervision features earlier this year.

If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. – 10:00 p.m. ET, or email [email protected]. If you don't like the phone, consider using the 988 Suicide and Crisis Lifeline Chat. Here is a list of international resources.

If you have experienced sexual abuse, call the free, confidential National Sexual Assault hotline at 1-800-656-HOPE (4673), or access the 24-7 help online by visiting online.rainn.org.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
