FTC launches inquiry into tech companies offering AI chatbots to kids

The Federal Trade Commission ordered seven tech companies to provide details on how they prevent their chatbots from harming children.

“The FTC inquiry seeks to understand what steps, if any, companies have taken to evaluate the safety of their chatbots when acting as companions, to limit the product’s use by and potential negative effects on children and teens, and to apprise users and parents of the risks associated with the products,” the consumer protection agency said in a press release announcing the inquiry.

The seven companies being probed by the FTC are Alphabet, Character Technologies, Instagram, Meta, OpenAI, Snap, and xAI. Anthropic, owner of the Claude chatbot, was not included on the list, and FTC spokesperson Christopher Bissex tells Mashable that he could not comment on “the inclusion or non-inclusion of any particular company.”

Asked about deadlines for the companies to respond, Bissex said the FTC’s letters state: “We would like to confer by telephone with you or your designated counsel by no later than Thursday, September 25, 2025, to discuss the timing and format of your submission.”

The FTC is “interested in particular” in how chatbots and AI companions affect children, and in how the companies offering them are mitigating negative impacts, restricting use by children, and complying with the Children’s Online Privacy Protection Act (COPPA) Rule. COPPA, enacted by Congress in 1998, regulates how children’s data is collected online and puts the FTC in charge of enforcing it.

Tech companies that offer AI-powered chatbots are under increasing governmental and legal scrutiny.

OpenAI, which operates the popular ChatGPT service, is facing a wrongful death lawsuit brought by the family of California teenager Adam Raine. The lawsuit alleges that Raine, who died by suicide, bypassed the chatbot’s guardrails and detailed harmful and self-destructive thoughts, including suicidal ideation, which ChatGPT periodically affirmed. Following the lawsuit, OpenAI announced additional mental health safeguards and new parental controls for young users.

If you’re feeling suicidal or experiencing a mental health crisis, please talk to somebody. You can call or text the 988 Suicide & Crisis Lifeline at 988, or chat at 988lifeline.org. You can reach the Trans Lifeline by calling 877-565-8860 or the Trevor Project at 866-488-7386. Text “START” to Crisis Text Line at 741-741. Contact the NAMI HelpLine at 1-800-950-NAMI, Monday through Friday from 10:00 a.m. to 10:00 p.m. ET, or email [email protected]. If you prefer not to use the phone, consider the 988 Suicide & Crisis Lifeline Chat. Here is a list of international resources.


Disclosure: Ziff Davis, Mashable’s parent company, in April filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.
