
Victims of explicit deepfakes will now be able to take legal action against people who create them
CNN
In recent years, people ranging from Taylor Swift and Rep. Alexandria Ocasio-Cortez to high school girls around the country have been victims of nonconsensual, explicit deepfakes: images where a person's face is superimposed on a nude body using artificial intelligence.

Now, after months of outcry, there is finally a federal law criminalizing the sharing of those images. President Donald Trump signed the Take It Down Act in a ceremony at the White House on Monday.

In addition to making it illegal to share nonconsensual, explicit images online, whether real or computer-generated, the law requires tech platforms to remove such images within 48 hours of being notified about them.

The law will boost protections for victims of revenge porn and nonconsensual, AI-generated sexual images, increase accountability for the tech platforms where the content is shared, and give law enforcement clarity about how to prosecute such activity. Previously, federal law prohibited creating or sharing realistic, AI-generated explicit images of children, but protections for adult victims varied by state and did not exist nationwide.

The Take It Down Act also represents one of the first US federal laws aimed at addressing the potential harms of AI-generated content as the technology rapidly advances.

"AI is new to a lot of us and so I think we're still figuring out what is helpful to society, what is harmful to society, but (nonconsensual) intimate deepfakes are such a clear harm with no benefit," said Ilana Beller, organizing manager at progressive advocacy group Public Citizen, which endorsed the legislation.













