In a significant move to combat the alarming rise of AI-generated deepfake pornography, U.S. lawmakers have introduced a groundbreaking bill that would require social media giants to proactively detect and remove nonconsensual sexually explicit images from their platforms. The Take It Down Act, sponsored by Sen. Ted Cruz (R-Texas), aims to hold tech companies accountable for the spread of deepfake porn, which has targeted everyone from celebrities to high school students.
Under the proposed legislation, publishing or threatening to publish deepfake pornography would become a federal crime, with perpetrators facing up to two years in prison for distributing nonconsensual sexual images of an adult and up to three years if the victim is a minor. Social media platforms would be required to develop a process for removing such images within 48 hours of receiving a valid request from a victim and make reasonable efforts to remove any copies, including those shared in private groups.
The Federal Trade Commission (FTC) would be responsible for enforcing these new rules, ensuring that tech companies comply with the 48-hour removal deadline. The bill, set to be formally introduced on Tuesday by a bipartisan group of senators, comes as a response to the staggering 464% increase in deepfake porn production in 2023 alone.
Deepfake technology, which uses artificial intelligence to create realistic images and videos of individuals without their consent, has been increasingly weaponized to create nonconsensual pornography. A 2019 report revealed that 96% of all deepfake videos were nonconsensual pornography, with the vast majority targeting women and girls.
While there are currently no federal laws banning deepfake porn, several bills have been introduced in Congress, including the AI Labeling Act of 2023 and the DEFIANCE Act of 2024. However, these bills have yet to move out of committee, leaving victims vulnerable to the devastating consequences of having their images manipulated and shared online without their consent.
The Take It Down Act seeks to address the gaps in existing legislation by placing the responsibility on social media companies to proactively moderate and remove deepfake porn from their platforms. Sen. Cruz emphasized the importance of the bill, stating, "By creating a level playing field at the federal level and putting the responsibility on websites to have in place procedures to remove these images, our bill will protect and empower all victims of this heinous crime."
The bill has garnered support from anti-sexual-abuse advocates and victims of AI-generated pornography, many of whom have reported that tech companies are indifferent to the issue, leaving the burden on victims to control the spread of unwanted images. Elliston Berry, a 14-year-old victim from Texas, joined Sen. Cruz at the bill's announcement, highlighting the urgent need for federal action to protect minors from the devastating effects of deepfake porn.
As the use of AI to generate nonconsensual pornography continues to grow, lawmakers and experts stress the need for swift, comprehensive legislation. Nina Brown, a Syracuse University professor specializing in the intersection of media law and technology, emphasized that while laws are essential, social sharing platforms must also commit resources to keeping deepfakes off their platforms.
The Take It Down Act marks a crucial step forward in the fight against AI-generated deepfake pornography, offering victims the legal recourse and protection they urgently need. As the bill moves through Congress, it underscores the growing threat of deepfakes and the need to protect individuals from the devastating consequences of nonconsensual pornography in the digital age.