Deepfake pornography of Taylor Swift is flooding social media platforms from X to Facebook to Reddit, and it’s so bad that SF-based X, formerly Twitter, has blocked searches for the pop star’s name.
One graphic AI-generated image on X was up for 17 hours over the weekend, garnering approximately 45 million views, 24,000 reposts, and hundreds of thousands of likes and bookmarks before it was removed, as The Verge reported.
The images, some of them football-themed because of Swift's relationship with Travis Kelce, were reportedly both pornographic and violent, containing elements like blood, as in the censored image below.
Ever since Elon Musk took over X, the platform has become notoriously under-moderated. Still, these kinds of sexually explicit, manipulated images are technically banned under X's policies, which prohibit “synthetic and manipulated media and nonconsensual nudity,” per The Verge. However, the level of virality and the sheer number of images in this case make it hard to remove each one and block every interaction.
In response, X has imposed restrictions on searches of “Taylor Swift,” but it's proven impossible to block every slight alteration in search terms. “This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” Joe Benarroch, head of business ops at X, said in a statement, as the Chronicle reported.
Meta has also restricted searches on Instagram, where “Taylor Swift” still returns popular images of her, but “Taylor Swift AI” prompts a warning message.
It’s unclear where these deepfakes originated, but an investigation from tech watchdog 404 Media found that many of the viral images may have come from a Telegram group whose participants share explicit AI-generated images of women, usually made with Microsoft Designer, an AI-based image tool.
Legislation banning pornographic deepfakes has lagged behind their rapid proliferation. One 2023 study found that the creation of doctored images has risen 550% since 2019, and several AI image generator platforms lack rules against generating pornography of real people, per the BBC.
This incident has prompted new calls from U.S. politicians to create laws to criminalize the creation of such deepfake images. Even the White House has weighed in — President Joe Biden’s press secretary, Karine Jean-Pierre, called the incident “alarming” during a news conference Friday.
And they’ve got the Swifties to back them up now, too. Upset Taylor Swift fans quickly hit back at the deepfake proliferation by creating the hashtag #ProtectTaylorSwift to post positive images, hoping they go viral, as the Chronicle reported.
Image of Taylor Swift at the Baltimore Ravens and the Kansas City Chiefs game on January 28, 2024. Photo by Perry Knotts/Getty Images.