FTC launches inquiry into tech companies offering AI chatbots to kids

The Federal Trade Commission ordered seven tech companies to provide details on how they prevent their chatbots from harming children.

“The FTC inquiry seeks to understand what steps, if any, companies have taken to evaluate the safety of their chatbots when acting as companions, to limit the product’s use by and potential negative effects on children and teens, and to apprise users and parents of the risks associated with the products,” the consumer protection agency stated in a press release announcing the inquiry.

The seven companies being probed by the FTC are Alphabet, Character Technologies, Instagram, Meta, OpenAI, Snap, and xAI. Anthropic, maker of the Claude chatbot, was not included on the list, and FTC spokesperson Christopher Bissex told Mashable that he could not comment on “the inclusion or non-inclusion of any particular company.”

Asked about deadlines for the companies to provide answers, Bissex said the FTC’s letters stated: “We would like to confer by telephone with you or your designated counsel by no later than Thursday, September 25, 2025, to discuss the timing and format of your submission.”

The FTC is "interested in particular" about how chatbots and AI companions impact children and how companies that offer them are mitigating negative impacts, restricting their use among children, and complying with the Children’s Online Privacy Protection Act Rule (COPPA). The rule, originally enacted by Congress in 1998, regulates how children’s data is collected online and puts the FTC in charge of that regulation.

Tech companies that offer AI-powered chatbots are under increasing governmental and legal scrutiny.

OpenAI, which operates the popular ChatGPT service, is facing a wrongful death lawsuit by the family of California teenager Adam Raine. The lawsuit alleges that Raine, who died by suicide, was able to bypass the chatbot's guardrails and detail harmful and self-destructive thoughts, as well as suicidal ideation, which was periodically affirmed by ChatGPT. Following the lawsuit, OpenAI announced additional mental health safeguards and new parental controls for young users.

If you're feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text "START" to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email [email protected]. If you prefer not to use the phone, consider the 988 Suicide & Crisis Lifeline chat. Here is a list of international resources.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
