Apple's iPhone 17 launch met with protests by child safety advocates

A massive banner unfurled outside Apple's Cupertino headquarters on Tuesday morning showed the smiling face of CEO Tim Cook welcoming visitors to the company's California campus. But the stark, monochromatic banner carried no boast about the iPhone upgrades about to be announced. Instead, it read: "New iPhone Still Spreading Child Sexual Abuse."

The banner was an act of civil disobedience by the Heat Initiative, a tech accountability network of experts, parents, and youth advocates who have made it their mission to pressure Big Tech into taking more aggressive action against predatory behavior and threats to youth safety across its platforms.

Mashable reached out to Apple for comment about the Heat Initiative action, but hasn't yet received a response. Apple previously responded to accusations that its products enable child sexual abuse material (CSAM); the company was targeted in a related class action lawsuit filed late last year. Apple stated at the time that it takes CSAM seriously and has instituted additional guardrails to prevent its spread.

"Cook and Apple’s top executives know child sexual abuse is hosted and traded from iCloud, but they refuse to implement common sense detection and removal practices that are standard across the tech industry," said Heat Initiative CEO Sarah Gardner in a press statement.

"That means survivors of child sexual abuse are forced to relive the worst crimes imaginable over and over because of Apple’s negligence and inaction. Since Apple won’t act, today we did — and our message to Tim Cook is that we will not rest until he stops putting profits over the lives and safety of children and survivors."

The protest's timing aligns with Apple's annual September product event, at which the company will debut its new device lineup and iOS 26 upgrades on Tuesday at 10 a.m. PT.

The Heat Initiative has spent the last two years pressuring Apple to adopt stronger protections against the alleged storage and dissemination of CSAM on the company's iCloud storage platform, arguing that device manufacturers should also be held accountable for the youth mental health crisis and other safety failures. In 2021, Apple announced an iCloud scanning tool that would automatically detect CSAM stored in users' private accounts, but paused and ultimately axed those plans after privacy experts warned the tool could become an entry point for broader surveillance. The company initially claimed that its NeuralHash technology would still preserve user privacy, but later reversed course.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it,” said Erik Neuenschwander, Apple's director of user privacy and child safety, in response to Heat Initiative in 2021 (statement originally published by Wired). "Scanning every user’s privately stored iCloud data would create new threat vectors for data thieves to find and exploit. It would also inject the potential for a slippery slope of unintended consequences."

Apple has faced more intense legal scrutiny over the past year, including a billion-dollar class action lawsuit filed in December that alleged the company reneged on mandatory reporting duties and thus sold "defective products" that were incorrectly marketed as safe for young people.

"Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk," wrote Apple spokesperson Fred Sainz in response to the lawsuit. "We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users. Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity to help break the chain of coercion that leads to child sexual abuse. We remain deeply focused on building protections that help prevent the spread of CSAM before it starts."

The Heat Initiative has also organized around Meta's alleged youth safety failures, in an effort to push more companies into the spotlight.
