The War on Deepfake Porn
US Congress to Pervs: Keep It In Your Pants, Not On Our Screens
The problem is the rapid rise of deepfake pornography: artificial intelligence used to create fake sexual images of real people without their consent.

This has affected many individuals, from celebrities to high school students.
In 2023, CNBC reports, there was a 464% increase in deepfake porn production compared to the previous year.
Who is to Blame?
The primary producers of deepfake porn are:
- Individual users: Often young men, including high school students, creating nonconsensual content of classmates or acquaintances.
- Online communities: Dedicated groups on platforms like Discord and X (formerly Twitter) that create and share deepfake content.
- Commercial entities: Websites and creators who monetize deepfake porn production.
The methods for producing deepfake porn have become increasingly sophisticated and accessible. At the root of the problem are advances in AI technology that make these fake images easy to create and distribute:
- Face-swapping technology: Originally used to place celebrity faces onto adult performers' bodies.
- "Nudify" apps: Software that can generate fake nude images from clothed photos.
- Text-to-image AI tools: Advanced AI models like Stable Diffusion can create custom AI models of individuals, which can then be used to generate explicit images based on text descriptions.
How to Tackle the Problem
To address this issue, US lawmakers are proposing two main solutions:
- The Take It Down Act, introduced by Senator Ted Cruz, aims to:
  - Make it illegal to publish or threaten to publish deepfake porn
  - Require social media platforms to remove such images within 48 hours of a victim's request
  - Task the Federal Trade Commission with enforcing these rules
- A competing bill by Senator Dick Durbin would:
  - Allow victims to sue those responsible for creating, possessing, or distributing deepfake porn images
Both bills are part of a broader effort by Congress to regulate AI and protect individuals from its harmful uses, particularly in the realm of nonconsensual intimate images.
Some Thoughts
Remember the "send nudes" meme? Or people regretting explicit photos or videos of themselves that were leaked, stolen, or misused? That's now beside the point. Whether or not anybody has actually been reckless enough to send intimate shots to someone else, if they have a few high-resolution pictures online, say on Facebook or Instagram, any perv can create realistic-looking pornographic imagery with them, and no doubt at some point even generate a video clip with a cloned voice, also sourced from the victim's videos. Tools like ElevenLabs, for instance, offer voice cloning to their subscribers, and they're not the only ones.
On the other hand, all those who did share their nudes will now be able to claim they're deepfakes if the images get published ))
But seriously though, this is not a great aspect of this emerging tech.
Imagine your loved one's face put into the filthiest scenes, looking realistic. Deepfakes have existed for a while, but complexity and cost kept them out of reach of the majority. A tool that enables every stupid broke dude out there to take some girl's, or even a child's, picture, render it into a depraved scene, and then put it out online is just going to make things so much worse.
Teens will likely adopt this as their new cyberbullying tactic.
Eventually this will get regulated worldwide, but I suspect it will remain with us: the dark web, software shared on torrents... Guys will keep making their own movies with celebrities, girls next door, favorite Instagram or OnlyFans models. Where there is demand, people find a way, and the demand existed long before the tech; AI is just fulfilling it now.
Although let's hope the growing 'AI girlfriend' trend will capture some of those horny minds and keep them busy sexting with their virtual intimate companions on apps like Kupid.AI.
Data sources for this post include:
https://www.cnbc.com/2024/06/18/senate-ai-deepfake-porn-bill-big-tech.html
https://igp.sipa.columbia.edu/news/rise-deepfake-pornography
Published: Jun 21, 2024 at 1:10 PM