Those explicit Taylor Swift deepfakes are ‘sexual exploitation,’ lawmakers say

Taylor Swift’s fans and lawmakers came to the singer’s support after NSFW deepfakes spread on social media.
(Rick Scuteri / Invision / Associated Press)
Multiple lawmakers have bad blood with the creators behind those sexually explicit AI images of Taylor Swift that circulated on social media Thursday.

New York Reps. Joe Morelle and Yvette Clarke, both Democrats, turned their sights on the latest high-profile incident of deepfake porn, condemning the increased creation and proliferation of artificial intelligence-generated images that overwhelmingly affect women and children.

“The spread of AI-generated explicit images of Taylor Swift is appalling — and sadly, it’s happening to women everywhere, every day,” Morelle said in a tweet.

“It’s sexual exploitation,” Morelle added, before touting his proposed Preventing Deepfakes of Intimate Images Act, a bill that would make it illegal to share deepfake pornography without the consent of individuals being portrayed.

In her tweet, Clarke noted that women have been subjected to nonconsensual deepfakes for years and that advancements in AI technology are making it easier and cheaper for people to create deepfakes.

“This is an issue both sides of the aisle & even Swifties should be able to come together to solve,” Clarke wrote.

The politicians posted their statements hours after Swift’s fans expressed outrage on X (formerly Twitter). “Protect Taylor Swift” became a rallying cry for Swifties on Thursday as they supported the Grammy winner. The doctored pictures were pornographic in nature and referenced the “Midnights” singer’s high-profile romance with Kansas City Chiefs tight end Travis Kelce.

A number of Swifties took it upon themselves to report the X accounts sharing the Swift deepfakes. “Actually terrifying that they exist. Please report + don’t give more attention to those tweets,” said user @naboocoffee, who shared a screenshot of their reporting activity.

“We gotta protect taylor swift from all this AI b—. I don’t care if she doesn’t know me, it’s basic human decency to not have your likeness be exploited by making AI porn!!!,” posted @everhero13.

“Protect her, don’t missuse tech,” wrote @WhyParker_. “Taylor Swift AI is disgusting as hell.”

Hours after the images circulated, X’s safety team reminded users Thursday evening of its “zero-tolerance policy” on sharing “Non-Consensual Nudity (NCN) images.” The statement, which did not explicitly mention Swift, also said that users who posted the images would be held accountable.

“We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed,” added X’s safety account. “We’re committed to maintaining a safe and respectful environment for all users.”

A representative for Swift did not respond to The Times’ multiple requests for comment.

The AI images spurred public outrage a month after a New Jersey mother urged lawmakers to enact more protections against AI technology after deepfake nude images of her 14-year-old daughter and other female classmates were circulated at their high school.

“We’re fighting for our children,” said Westfield resident Dorota Mani. “They are not Republicans, and they are not Democrats. They don’t care. They just want to be loved, and they want to be safe.”

Last year, popular Twitch streamers Imane “Pokimane” Anys, Maya Higa and QTCinderella spoke out against deepfakes after learning that their likenesses were used without their consent for an AI porn website.

“Everybody f— stop. Stop spreading it. Stop advertising it. Stop,” tweeted QTCinderella. “Being seen ‘naked’ against your will should NOT BE A PART OF THIS JOB.”

In recent years, AI has become a headache for Hollywood and other creators. Amid last year’s actors’ strike, Tom Hanks warned his followers about a dental plan advertisement that used an “AI version of me.”

“I have nothing to do with it,” he said in an Instagram post.

Swift may be a high-profile star who reportedly joined the billionaire club last year, but her fans say she is still a “real person” deserving of respect and dignity.

“‘She’s a white billionaire’ is never an excuse to spread AI images of sexualizing women,” said @jdjoshi60. “PROTECT TAYLOR SWIFT.”

Times staff writer Brian Contreras and the Associated Press contributed to this report.