News

Taylor Swift Might Sue The Men Who Made The AI Porn Of Her On X

The men who created the Taylor Swift AI porn on X may face repercussions, as the pop star is said to be considering legal action against the websites that generated fake nude images of her.

By Nicole Dominique · 2 min read
Getty/Amy Sussman

Last week, AI-generated pornographic images of Taylor Swift circulated on X (formerly Twitter). Several people shared the viral post, one of them being @Zvbear, who has since gone private on the platform. The nonconsensual AI picture, depicting a naked Swift at a Kansas City Chiefs game, garnered over 100,000 likes, with commenters lusting over it. The photo stayed on the platform for a shockingly long time and is still spreading on other sites like Reddit, Instagram, and Facebook.

While @Zvbear's account is still active on X, others who shared fake sexual images on the platform have since been banned.

Since then, "PROTECT TAYLOR SWIFT" has trended on X.

According to The Daily Mail, the singer is "furious" about the AI photos and is now considering legal action against the deepfake porn website hosting them. The dozens of graphic images were uploaded to Celeb Jihad, which hopefully gets shut down soon.

A source close to Swift says, "Whether or not legal action will be taken is being decided, but there is one thing that is clear: These fake AI-generated images are abusive, offensive, exploitative, and done without Taylor’s consent and/or knowledge."

"The Twitter account that posted them does not exist anymore. It is shocking that the social media platform even let them be up to begin with," they continue. "These images must be removed from everywhere they exist and should not be promoted by anyone."

Swift's fans, family, and friends are rightfully upset about the disgusting images. The source adds, "Taylor’s circle of family and friends are furious, as are her fans obviously. They have the right to be, and every woman should be. ... The door needs to be shut on this. Legislation needs to be passed to prevent this, and laws must be enacted."

A spokesperson for Meta told the outlet: "This content violates our policies, and we’re removing it from our platforms and taking action against accounts that posted it. We’re continuing to monitor, and if we identify any additional violating content, we’ll remove it and take appropriate action."

Joe Morelle, a Democratic Congressman, is pushing a bill to outlaw deepfake pornography. Congressman Tom Kean, Jr. echoed the concern: "It is clear that AI technology is advancing faster than the necessary guardrails. Whether the victim is Taylor Swift or any young person across our country – we need to establish safeguards to combat this alarming trend."

Men are becoming so desensitized to pornography that an increasing number seem to think generating fake pornographic images of real women is okay. Mia Janin, a 14-year-old girl, recently fell victim to boys who allegedly pasted her face onto porn stars' bodies. In addition to creating the fake nude images, they bullied Janin and other girls by mocking them, dubbing them the "suicide squad." Janin ended up taking her own life.

Swift has yet to speak publicly about the weaponized porn, but her fans are urging her to take action to protect herself and the millions of women online.

Evie deserves to be heard. Support our cause and help women reclaim their femininity by subscribing today.