Taylor Swift AI Pics Prompt Lawmakers To Propose Anti-Nonconsensual AI Porn Bill

A new bill would crack down on the spread of AI-generated pornographic images in response to the Taylor Swift controversy.

By Nicole Dominique
Getty/Amy Sussman

Recently, AI-generated pornographic images of Taylor Swift went viral on X (formerly Twitter), garnering over 100,000 likes and many retweets. While the original poster faced backlash, it was disturbing to see how many people saw no issue with the nonconsensual, fake images.

In response, Senate Majority Whip Dick Durbin (D-IL), joined by Sens. Lindsey Graham (R-SC), Amy Klobuchar (D-MN), and Josh Hawley (R-MO), introduced a bill titled the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act.

The measure targets the spread of fake, sexualized "digital forgeries" created by AI. The bill defines a digital forgery as "a visual depiction created through the use of software, machine learning, artificial intelligence, or any other computer-generated or technological means to falsely appear to be authentic."

The bill would allow victims depicted in sexually explicit forgeries to sue the "individuals who produced or possessed the forgery with intent to distribute it," according to The Guardian.

“This month, fake, sexually explicit images of Taylor Swift that were generated by artificial intelligence swept across social media platforms. Although the imagery may be fake, the harm to the victims from the distribution of sexually explicit ‘deepfakes’ is very real,” Durbin said in a recent press release.

Hawley added, “Nobody – neither celebrities nor ordinary Americans – should ever have to find themselves featured in AI pornography. Innocent people have a right to defend their reputations and hold perpetrators accountable in court. This bill will make that a reality."

The Washington Post reported that between December 2018 and December 2020, the number of fake videos detected online doubled every six months. Sensity, an AI-powered deepfake detection company, found that 90% of the videos were nonconsensual porn, much of it featuring fake footage of women. With the rise of AI programs, I can only imagine how much worse it has gotten.

While heavy consumers of porn may see no issue with AI-generated images (as evidenced by the comment section on X), victims of nonconsensual images can face real emotional distress, and their reputations take a hit when their likeness is used for explicit material.

I believe the bill has a high chance of becoming law. With an election year coming up, politicians will want to protect themselves against altered and weaponized deepfakes. By cracking down on AI-generated fake images, the bill could discourage attempts to exploit deepfake technology for political gain and, in the process, protect women online. It's too bad it's taken this long.
