Grok Imagine generates unsolicited deepfake nudes of Taylor Swift, report says

Grok Imagine, xAI's new generative AI tool, created explicit deepfakes of Taylor Swift without being specifically prompted to do so, according to The Verge. Mashable reported yesterday that Grok Imagine lacks even basic guardrails around sexual deepfakes, and our own testing produced results similar to The Verge's.

The Verge's Jess Weatherbed discovered that Grok Imagine "spit out full uncensored topless videos" the very first time she used the tool. She didn't ask the bot to depict Swift topless, but once she turned on Grok Imagine's "spicy" mode, the bot churned out a video in which Swift tore off her clothes and began dancing in a thong.

As Weatherbed noted, Grok Imagine wouldn't generate full or partial nudity when asked for it directly; instead, the tool produced blank squares. The "spicy" mode, a preset for NSFW content, does not always result in nudity, but it did depict Swift "ripping off most of her clothing" in several videos.

This isn't the first time Elon Musk's X has been associated with deepfakes of Swift. In January 2024, AI-generated, pornographic images depicting Swift went viral on X, drawing criticism. This happened despite the fact that X explicitly forbids posting nonconsensual nudity and "synthetic, manipulated, or out-of-context media" that deliberately deceive users or claim to depict reality.

xAI's policies similarly prohibit "depicting likenesses of persons in a pornographic manner." And as Mashable's Timothy Beck Werth reported yesterday, Grok Imagine "lacks industry-standard guardrails to prevent deepfakes and sexual content."

Mashable repeatedly reached out to xAI, but we have not received a response.

Deepfakes have become a growing concern for lawmakers, but laws against this type of behavior and content are still in their infancy. A 2023 study found that 98 percent of deepfake videos online were pornographic; of those, 99 percent depicted women. Governments around the world have looked to tackle what has been dubbed a digital-age crisis. President Donald Trump recently signed into law the Take It Down Act, a controversial piece of legislation that makes it a federal crime to publish or threaten to publish nonconsensual intimate images.
