Trump-backed Take It Down Act nears final approval


  • The Take It Down Act, which recently passed the House, requires social media platforms to remove non-consensual intimate imagery (NCII), including deepfakes, within 48 hours of a victim's request, shifting from passive "notice-and-take-down" to proactive enforcement.
  • It imposes federal penalties for publishing real or AI-generated intimate content without consent in interstate commerce, addressing gaps in uneven state laws.
  • Victims can seek justice without costly lawsuits, streamlining enforcement and reducing barriers to removing harmful content.
  • Backed by Republicans (Cruz, Salazar) and Democrats (Klobuchar, Dean), the bill passed the House overwhelmingly (409-2) and is poised to be signed into law.
  • Lawmakers emphasize protections for women, minors and survivors of revenge porn, holding both perpetrators and tech platforms accountable.

A controversial new internet regulation, the Take It Down Act, is set to become law after passing the House in an overwhelming 409-2 vote; it now awaits President Donald Trump's signature.

The Take It Down Act, authored by Senate Commerce Committee Chairman Ted Cruz (R-TX) and co-authored by Sen. Amy Klobuchar (D-MN), requires social media platforms to remove flagged non-consensual intimate imagery (NCII), including deepfakes, within 48 hours of a victim's request. Unlike traditional "notice-and-take-down" systems, the law obligates platforms to proactively find and remove the offending content. (Related: Tom Hanks warns against AI-generated DEEPFAKE ADS of him selling drugs.)

Its key provisions include:

  • Criminalizing the publication of NCII, both real and AI-generated, in interstate commerce, and imposing penalties on offenders;
  • Requiring social media companies and websites to remove deepfake pornography and NCII within 48 hours of a victim's request;
  • Allowing victims to pursue legal recourse without costly lawsuits, addressing a gap in state laws where enforcement varies widely; and
  • Exempting good-faith reporting, such as disclosures to law enforcement, while maintaining First Amendment protections for lawful speech.

While 30 states currently have laws against sexual deepfakes, enforcement remains uneven, and victims often struggle to have harmful content removed from online platforms. The Take It Down Act establishes a federal criminal penalty for publishing NCII, including hyper-realistic AI-generated depictions, and ensures victims no longer bear the burden of costly civil lawsuits to seek justice.

House passage of Take It Down Act empowers victims of revenge porn

Lawmakers from both parties hailed the passage of the bill as a necessary intervention against the explosion of digitally altered sexually explicit content, particularly targeting women and minors.

Cruz emphasized its significance for survivors: "The passage of the Take It Down Act is a historic win in the fight to protect victims of revenge porn and deepfake abuse. This victory belongs first and foremost to the heroic survivors who shared their stories and the advocates who never gave up. By requiring social media companies to take down this abusive content quickly, we are sparing victims from repeated trauma and holding predators accountable."

Klobuchar echoed the sentiment: "We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse. These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."

Reps. Maria Salazar (R-FL) and Madeleine Dean (D-PA), who spearheaded the House companion bill, stressed the legislation's role in protecting vulnerable groups. Salazar called it a "win for women and girls," while Dean highlighted its focus on Big Tech accountability.

"The Take It Down Act's passage is a significant step forward in Congress' responsibility to protect the privacy and dignity of Americans against bad actors and the most harmful developments of AI. It takes only minutes to create a deepfake or share intimate images without consent, yet the lasting consequences devastate its victims – often girls and women," Dean said.

Visit Glitch.news for more stories like this.

Watch this short report from PBS discussing how deepfake videos are becoming increasingly difficult to spot.

This video is from the MyOpinionCounts channel on Brighteon.com.

More related stories:

WARNING: AI-powered DEEPFAKE VOICE SCAMS are now coming for your bank balance.

Microsoft AI releases scary new deepfake technology that could make many newscasters, podcasters obsolete.

Judge grants preliminary injunction blocking California's new election deepfake ban: "The law likely violates First Amendment."

Black athletic director in Baltimore high school allegedly framed White principal as racist using AI DEEPFAKE recording.

Fake job seekers using AI deepfakes infiltrate U.S. companies, fueling national security threat.

Sources include:

Reclaimthenet.org

Commerce.Senate.gov

Brighteon.com
