Twitter Blocks Taylor Swift Searches Amid Pornographic AI Deepfakes
Photo: Steve Granitz/FilmMagic
Pornographic AI-generated images of Taylor Swift that circulated on Twitter this week have prompted the platform to limit searches for the star for all users. On Saturday, Twitter temporarily paused searches for “Taylor Swift,” seemingly to prevent the spread of the photos. Earlier this week, Twitter’s
@Safety account warned users of its “zero-tolerance policy” toward non-consensual nudity and said it was “actively removing” the images, though it did not mention Swift by name. It wrote, “We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”
The photos have drawn the attention of multiple members of government, as well as the White House. On Friday,
White House press secretary Karine Jean-Pierre said the explicit images circulating are “very alarming.” “We’re going to do what we can to deal with this issue,” she stated. “So while social media companies make their own independent decisions about content management, we believe they have an important role to play in enforcing their own rules to prevent the spread of misinformation and nonconsensual, intimate imagery of real people.”
SAG-AFTRA, the union Swift belongs to, spoke out in support of legislation like the Preventing Deepfakes of Intimate Images Act, which targets images like these. “The sexually explicit, A.I.-generated images depicting Taylor Swift are upsetting, harmful, and deeply concerning. The development and dissemination of fake images — especially those of a lewd nature — without someone’s consent must be made illegal,” the union’s statement began. “We support Taylor and women everywhere who are the victims of this kind of theft of their privacy and right to autonomy.”