Mothers across the country are working to understand a newly enacted law aimed at creating a safer online experience for young internet users. Known as the Take It Down Act, the law makes “revenge porn” and deepfake explicit content illegal, criminalizing the posting of “intimate images” online, whether real or AI-generated, without the individual’s consent. Technology companies are required to remove such content within 48 hours. Here’s what you need to know.
What Is the Take It Down Act?
With the advancement of AI technology, and with younger and younger users joining social media, it has become much harder for families to navigate the darker side of internet exposure.
The Take It Down Act, signed into law on May 19, 2025, is a major step toward protecting all people, particularly women and minors, from the harmful spread of nonconsensual intimate images online. This includes AI-generated “deepfake” content, which can be difficult to distinguish from real images. The law makes it a federal crime to knowingly post, or threaten to post, someone’s private images without their consent. Victims can now report this kind of content, and platforms are legally required to take it down within 48 hours.
The goal of the law is to help end nonconsensual image sharing online. According to government officials, anyone who intentionally distributes explicit images without the subject’s consent faces up to three years in prison. First Lady Melania Trump described the new law as a “national victory that will help parents and families protect children from online exploitation.” The law gives the Federal Trade Commission the authority to enforce penalties of up to $50,000 per violation against platforms or individuals who don’t comply. It also includes protections for legitimate uses, such as disclosures by medical professionals or journalists.
What Is Revenge Porn?
“Revenge porn” refers to the nonconsensual sharing of someone’s private, often sexual, images or videos. It is usually, though not always, spread by a former partner or someone looking to shame, harass, or control the victim. Whether it’s done for revenge, attention, money, or plain cruelty, the emotional, social, and professional harm it causes can change a person’s life forever.
In recent years, deepfakes have made the problem even more disturbing. Deepfakes use AI to create fake, hyper-realistic images or videos, often placing someone’s face onto someone else’s body. Even if the victim never posed for the image, it can look incredibly real and be difficult to identify as fake, even to a trained eye. These manipulated images and videos are being weaponized, especially against women and girls, whose likenesses appear in pornographic content without their knowledge or consent.
Minors are especially vulnerable. Deepfake technology has been used to create fake nudes of teens that often spread quickly across platforms before the victims even know what’s happening. The trauma can be devastating, underscoring the need for government intervention. The Take It Down Act aims to give all victims, especially young people, a faster, more reliable way to get such content removed and to hold their abusers accountable.
Criticism of the Take It Down Act
Though the law has received bipartisan support, some digital rights groups argue that it is too broad and could lead to censorship. Some claim the Take It Down Act will serve as a covert way to censor legal pornography and LGBT content. Civil liberties advocates worry it could be misused to chill free speech and suppress criticism of the government. The digital rights advocacy group Electronic Frontier Foundation argues that “as currently drafted, the Act mandates a notice-and-takedown system that threatens free expression, user privacy, and due process, without addressing the problem it claims to solve.”
Whatever the case may be, the intention behind the act will hopefully be carried out: protecting victims, especially children, from the dire consequences of nonconsensual image publishing. The law marks a major federal effort to address revenge porn and the evolving threat of deepfake abuse online.