Quick Read
- X has temporarily blocked searches for Taylor Swift due to the circulation of pornographic deepfake images of the singer on the platform.
- Users who search for Swift’s name, or variations of it, see error messages, indicating the search function has been temporarily restricted.
- The decision to block searches was described as a precautionary measure to ensure safety in response to the spread of explicit and abusive fake images of Swift.
- Swift’s fans, known as “Swifties,” initiated a campaign using the #ProtectTaylorSwift hashtag to combat the spread of deepfakes by sharing positive content about the singer.
- Reality Defender, a deepfake detection group, observed a significant increase in nonconsensual pornographic material featuring Swift, particularly on X, with some content also appearing on Facebook and other platforms.
- The offensive content involved AI-generated images depicting Swift in degrading and sometimes violent scenarios.
- The prevalence of explicit deepfakes, especially those targeting women, has escalated as advances in technology make such images easier to create.
The Associated Press has the story:
X pauses Taylor Swift searches as deepfake explicit images spread
Newslooks- (AP)
Elon Musk’s social media platform X has blocked searches for Taylor Swift as pornographic deepfake images of the singer have circulated online.
Attempts to search for her name on the site Monday resulted in an error message and a prompt for users to retry their search, which added, “Don’t fret — it’s not your fault.”
Searches for variations of her name such as “taylorswift” and “Taylor Swift AI” turned up the same error messages.
Sexually explicit and abusive fake images of Swift began circulating widely last week on X, making her the most famous victim of a scourge that tech platforms and anti-abuse groups have struggled to fix.
“This is a temporary action and done with an abundance of caution as we prioritize safety on this issue,” Joe Benarroch, head of business operations at X, said in a statement.
After the images began spreading online, the singer’s devoted fanbase of “Swifties” quickly mobilized, launching a counteroffensive on X under the #ProtectTaylorSwift hashtag to flood the platform with more positive images of the pop star. Some said they were reporting accounts that were sharing the deepfakes.
The deepfake-detecting group Reality Defender said it tracked a deluge of nonconsensual pornographic material depicting Swift, particularly on X. Some images also made their way to Meta-owned Facebook and other social media platforms.
The researchers found at least a couple dozen unique AI-generated images. The most widely shared were football-related, depicting a painted or bloodied Swift in images that objectified her and in some cases inflicted violent harm on her deepfake persona.
Researchers have said the number of explicit deepfakes has grown in the past few years, as the technology used to produce such images has become more accessible and easier to use.
In 2019, a report released by the AI firm DeepTrace Labs showed these images were overwhelmingly weaponized against women. Most of the victims, it said, were Hollywood actors and South Korean K-pop singers.