AI Is Being Used To Sexually Exploit Women And Children. Melania Just Did Something About It.
A woman’s image was never meant to be public property, but as AI blurs the line between reality and violation, laws like the TAKE IT DOWN Act offer justice for victims.

On April 7, an Ohio man became the first person convicted under the TAKE IT DOWN Act. This federal law was designed to address something many women may not fully realize is happening to women and girls of all ages: AI-generated sexual exploitation built from nothing more than faces pulled from the internet.
According to prosecutors, James Strahler II of Columbus, OH, created hundreds of explicit images of women who had never posed for them, never consented to them, and in many cases, didn’t even know the perpetrator. The images were then sent to people in their lives—coworkers and even parents—as a way to humiliate and control the victims. Pay attention to that detail.
This law isn’t about leaked nudes in the sense people have come to understand such scandals, like the massive iCloud hack that publicized private photos of actress Jennifer Lawrence. It addresses a reality where a woman doesn’t have to do anything at all to suddenly become the subject of explicit material. Her likeness alone is enough, and in signing the TAKE IT DOWN Act into law, the Trump administration fundamentally said enough is enough.
What the “First Conviction” Actually Means
For years, victims of this type of image-based abuse didn’t have a legal category of their own. If a photo wasn’t stolen, and nothing had technically been shared without permission, there was no clear federal crime to point to, especially when the images themselves were artificially generated. The TAKE IT DOWN Act changed that. Signed into law on May 19, 2025, the act makes it a federal offense to distribute or threaten to distribute nonconsensual explicit images, including those created with AI, and requires platforms to remove such content within 48 hours of a victim’s request.
Strahler’s case is the first proof that the law will actually be used, and that distinction matters. According to the U.S. Attorney’s Office, Southern District of Ohio, “Strahler used AI to create pornographic videos depicting at least one adult victim engaged in sex acts with her father. He then distributed those videos to the victim’s co-workers. He also messaged the mothers of the adult females and demanded nude photos of them, threatening to circulate explicit or obscene images he created of their daughters if they did not comply. He often called the victims and left voicemails of him masturbating or threatening rape. He referred to the victims’ specific home addresses in his threats.”
The Office continued: “Strahler also posted online AI-generated obscenities he created of children. He generated these files using the faces of minor boys from his community. He then morphed the face of the minor boys onto the bodies of other adults or children and created videos that depicted the boys engaged in sex acts. Strahler specifically created AI-generated obscenity of the minor boys having sex with their mothers and/or grandmothers.”
And with the increasing popularity and accessibility of AI tools, it’s frightening to realize that virtually anyone can replicate his devious, disgusting acts. Laws like the TAKE IT DOWN Act will hopefully deter this perverse behavior once people understand it carries real consequences, not just headlines.
How Melania Helped Move the Needle
It would be easy to reduce Melania Trump’s involvement in the TAKE IT DOWN Act to a general interest in online safety, but in this case, her role was much more concrete than that. Her 2018 BE BEST campaign, launched during her husband’s first term, centered on cyberbullying, exploitation, children’s well-being, and social media harms. After leaving the White House, she began working with lawmakers on legislation addressing nonconsensual intimate imagery, including the emerging threat of AI-enabled abuse. In 2025, she joined Sen. Ted Cruz and Rep. Maria Salazar in hosting a Capitol Hill roundtable with victims and families, bringing teenage girls and parents into the room with members of Congress to describe, in explicit detail, how easily these images can be created and how little recourse victims have once they spread.
Melania described it as “heartbreaking” to see what teenagers and girls go through after being targeted, and explicitly called on Congress to pass the TAKE IT DOWN Act. She insisted that “we must prioritize their well-being by equipping them with the support and tools necessary to navigate this hostile digital landscape,” adding that “every young person deserves a safe online space to express themself freely, without the looming threat of exploitation or harm.”
Conversations like that roundtable and other related efforts had a tangible impact, thankfully. Lawmakers involved in the bill have pointed to victim testimony—particularly from teenage girls targeted by classmates using AI—as a turning point in getting broader support for its passage. After all, this isn’t a niche tech concern. Any minor is vulnerable to this abuse.
The bill passed the Senate unanimously and cleared the House 409-2. It was introduced by Republican Sen. Ted Cruz and Democratic Sen. Amy Klobuchar, an unlikely pairing that reflects how broadly understood this new threat has become.
If you know how fiery discourse often is between the American Left and Right, that level of agreement on a bill doesn’t really happen without a shared understanding that something critical is at hand. By the time the bill was signed into law on May 19, 2025, the White House called it a priority of Melania’s BE BEST initiative, and Melania said the law affirmed that “the well-being of our children is central to the future of our families and America.”
The Scale of What’s Already Happening
If this sort of thing sounds like an edge case to you, or if you don’t yet know someone personally affected, the data tells a different story. According to a 2023 report, as many as 98% of all deepfake videos online are nonconsensual pornography, and what do you know? Roughly 99% of the people depicted in them are women. At the same time, the volume of deepfake content has surged dramatically, growing more than 500% in just a few years.
With the proliferation of AI tools in every person’s pocket, you can see that this isn’t a fringe issue. It’s a rapidly scaling problem that demands serious, thoughtful attention.
At a congressional roundtable tied to the bill, a teenage girl described discovering that a classmate had used AI to generate explicit images of her when she was only 14 years old. She later said, “I knew I could never go back and undo what he did.”
In Texas, high school girls were targeted with AI-generated nude images created and shared by some peers, and as a result, had to transfer schools entirely. The Strahler case from Ohio follows the same pattern but on a larger scale. AI technology removes the need for access, and with it, the last remaining barrier that once separated ordinary women from this kind of exploitation.
For years, women have been told to be careful about what they share online—vacation snaps in bikinis, GNO selfies with margs in hands, and more—as though the risk was tied to their own behavior. But in the wild, wild West that is today’s digital landscape, that logic no longer holds.
AI systems can now take publicly available images—photos pulled from social media, school websites, even professional headshots—and generate explicit content convincing enough to pass as real to the average Joe and Jane. In other words, the threat is no longer tied to what a person, adult or child, does. The mere fact that their image exists in a digital environment where it can be manipulated is a threat in itself.
It’s true, men can certainly be targeted by this technology, and if you scroll through random comments on X posts, you’ll see that some are. But the demand driving this degenerate behavior is overwhelmingly sexual, and the targets are overwhelmingly female. Today, AI-generated explicit images don’t merely misrepresent a woman; they break down any boundary between her public persona and something deeply private.
And even if an image is eventually removed, the fact that it existed at all can follow someone into her classroom, her workplace, her relationships, and her reputation in ways that are nearly impossible to fully reverse.
When I approached my husband about a pregnancy announcement post last year, the conversation quickly grew beyond the fact that we were expecting. We started thinking about what it means not only to share that you’re bringing a baby girl into the world, but also to publish images of her at her most innocent age for the internet to keep on file forever. After all, she can’t yet personally consent—babies can’t talk, walk, feed themselves, or even fall asleep alone in many cases.
The internet is forever. I’d never shame someone for posting photos of their baby online; honestly, I think it’s only natural to want to share with the world what brings you joy. But for our family, we decided to wait to establish her digital presence until she could actually weigh in.
I spent years dreaming of motherhood, years working up to it, and I wanted to protect my little blessing’s innocence from total criminal creeps like Strahler if I could help it. That’s just my personal choice, and one that may be more cautious than reality warrants.
Every woman, no matter her political leaning, should understand that the first conviction under the TAKE IT DOWN Act is a meaningful step, one that establishes this kind of devious behavior as more than unethical. It’s criminal, and it deserves repercussions. We need this now more than ever, because the technology itself isn’t slowing down. If anything, it’s becoming more accessible, more realistic, and harder to detect.
What lies ahead is a technological reality most women haven’t fully adjusted to yet, where control over your image no longer depends entirely on what you choose to share. It depends on whether the systems around you are strong enough to protect it from bad actors. Right now, those systems are still catching up.