
Nimitz Tech Hearing 3-11-25 - Senate Judiciary Subcommittee on Crime and Counterterrorism

NIMITZ TECH NEWS FLASH

Ending the Scourge: The Need for the STOP CSAM Act

Senate Judiciary Subcommittee on Crime and Counterterrorism

March 11, 2025 (recording linked here)

HEARING INFORMATION

Witnesses and Written Testimony:

  • Ms. DeLaune
  • Mr. Tanagho
  • Mr. Schiller
  • Ms. Sines
  • Mr. Pizzuro

Source: Stanford Cyber Policy Center

HEARING HIGHLIGHTS

The Role of Big Tech in Facilitating Child Exploitation

The hearing exposed how major technology companies knowingly allow child sexual abuse material (CSAM) and online enticement to thrive on their platforms while refusing to take meaningful action. Testimony revealed that Meta, Google, Apple, and Amazon collectively made over $200 billion in profits last year, yet they continue to claim that robust enforcement against CSAM is too expensive. Investigations showed that platforms like Facebook and Instagram actively expose underage users to predators through their algorithms, even serving advertisements for sex trafficking attorneys to minors flagged as vulnerable. Despite possessing sophisticated AI and detection tools, these companies are not required to proactively prevent exploitation. The discussion emphasized the need to hold tech companies legally accountable and strip them of broad legal protections if they fail to act.

The Rise of AI-Generated Child Exploitation Material

Artificial intelligence is fueling a new era of child sexual abuse material, making it easier than ever for predators to generate, modify, and distribute CSAM at scale. The hearing revealed that AI-generated CSAM reports increased by 1,300% in just one year, with offenders using text prompts to create abusive content instantly. Some predators are even manipulating real children’s images, superimposing their faces onto AI-generated abuse images, effectively producing new victims without direct contact. Despite this alarming trend, there are no clear federal regulations requiring AI companies to detect or prevent the creation of CSAM. The discussion underscored the urgent need for AI safety laws that mandate the detection and removal of AI-generated child exploitation materials before they spread.

The Need for Victim-Centered Legal Reforms

Survivor testimony highlighted the legal and systemic failures that leave victims powerless, even after their abusers are identified. Many victims are unable to sue tech companies that hosted their abuse material, as current federal law shields platforms from liability. Others struggle to receive restitution because their own families were complicit in their abuse, requiring legal guardianship solutions. Additionally, child victims are often re-traumatized during legal proceedings, facing privacy violations and hostile cross-examinations. The hearing emphasized the importance of expanding legal rights for victims, including stronger privacy protections, guaranteed court-appointed advocates, and the ability to sue tech platforms that profit from CSAM.

IN THEIR WORDS

“Meta made $23 billion in profits last year. Google made $60 billion. Apple made $97 billion. Amazon made $27 billion. So don’t tell me they don’t have the money to detect, disrupt, and report child sexual exploitation. The truth is—they don’t want to. Right now, they’re not compelled to, and they’re profiting from the status quo.”

- Chairman Hawley

“In 2014, we received 1.1 million reports of online child sexual exploitation. In 2023, that number was 36.2 million. And now, suddenly, in 2024, it’s dropped to 20.5 million. What happened? Did child predators suddenly stop abusing children? No. The crimes haven’t stopped—just the reporting. Companies are simply choosing not to see it.”

- Ms. DeLaune

“Tech companies want us to believe that they’re powerless—that stopping child exploitation is just too difficult. But their own algorithms can identify child users, recognize sexual exploitation, and even serve up advertisements for sex trafficking attorneys. If they can do that, they can stop this.”

- Sen. Blumenthal

SUMMARY OF OPENING STATEMENTS FROM THE COMMITTEE AND SUBCOMMITTEE

  • Chairman Hawley opened the hearing by emphasizing the urgency of addressing child sexual exploitation and the alarming rise in child sexual abuse material (CSAM) online. He highlighted the exponential growth of reported cases, from 1.1 million in 2014 to 36.2 million in 2023, stressing that each case represents a real child who has suffered abuse. Hawley introduced a legislative proposal to expand victim protections, improve the cyber tip line, and, most critically, allow victims to sue tech companies that host and profit from CSAM. He criticized major tech firms for failing to remove abusive content and argued that courtroom accountability is necessary for real change.

SUMMARY OF WITNESS STATEMENTS

  • Ms. DeLaune emphasized the critical need for the STOP CSAM Act, noting that the U.S. has reached an inflection point in combating online child exploitation. She highlighted emerging threats, including AI-generated child abuse material, online enticement, and violent online groups. While the Cyber Tipline remains a vital tool, she expressed concern over a 7-million-report decline in 2024, partly due to inconsistent reporting from major platforms and the impact of end-to-end encryption on Facebook Messenger. DeLaune called for legislative measures requiring platforms to improve reporting standards, issue transparency reports, and create a report-and-remove process for victims.

  • Mr. Tanagho described the growing threat of live-streamed child sex abuse, often funded by American offenders directing crimes overseas. He cited research showing nearly half a million children in the Philippines are exploited this way, with U.S. offenders involved in at least 34% of cases. He argued that tech companies must prioritize victim safety by detecting, reporting, and removing CSAM more efficiently, as current responses remain inadequate. Tanagho urged lawmakers to pass the STOP CSAM Act, strengthen restitution systems, and ensure platforms are held accountable for their failures.

  • Mr. Schiller, a former prosecutor, described how his organization develops technology that helps law enforcement track, arrest, and prosecute child predators across 106 countries. He warned that despite current legal efforts, online predators continue to exploit new technology to target children, requiring both legislative and educational responses. Schiller stressed the need for the STOP CSAM Act to close gaps in existing laws, enhance victim support, and require tech platforms to cooperate with law enforcement. He also highlighted the importance of education for children and parents about online dangers, noting that many children are unaware of the risks they face on social media and gaming platforms.

  • Ms. Sines shared her personal story of online enticement and sextortion, beginning when a predator contacted her on Facebook at age 14. Despite reporting the abuse to her school resource officer and law enforcement, she initially received no help, leaving her vulnerable for two years before her images were publicly posted. She eventually found a law enforcement officer who took action, leading to the predator’s arrest and a 75-year prison sentence, but she emphasized that most victims do not receive the same level of support. Sines urged lawmakers to pass the STOP CSAM Act to ensure better law enforcement training, stronger victim protections, and greater accountability for tech platforms.

  • Mr. Pizzuro, a former New Jersey State Police ICAC Commander, stated that tech platforms’ failure to report child exploitation is putting children at risk. He criticized companies for submitting incomplete or blank reports to the Cyber Tipline, making it difficult for law enforcement to act. He emphasized the need for mandatory reporting of planned and imminent child exploitation crimes, arguing that better reporting requirements would prevent abuse before it happens. Pizzuro also called for stronger protections for survivors, as their personal information is often exposed in legal proceedings, allowing abusers to re-traumatize them. He urged Congress to pass the STOP CSAM Act, stressing that delaying action means leaving more children vulnerable to exploitation.

SUMMARY OF KEY Q&A

  • Chairman Hawley asked about tech companies’ failure to disrupt child exploitation, highlighting their massive profits and lack of enforcement action. Mr. Tanagho referenced Australia’s eSafety Commissioner’s findings that most companies do nothing to prevent live-streamed child exploitation, often citing cost concerns. Chairman Hawley then presented an experiment by the New Mexico Attorney General, where a fake underage profile was quickly targeted by adult predators on Facebook, showing that Meta’s own algorithm recognized child exploitation but did not prevent it. He asked Ms. DeLaune about tech companies’ role in combating abuse, and she stressed the need for mandatory transparency reports to track companies' responses.

  • Sen. Blackburn asked Ms. DeLaune about the impact of her REPORT Act, which increased the time platforms must retain evidence and mandated reporting of bad actors. Ms. DeLaune revealed a 192% increase in online enticement reports but expressed concerns over poor quality reporting from tech companies, with many reports coming from the public rather than the platforms. Sen. Blackburn then asked if statutory clarity was needed to ensure reporting consistency. Ms. DeLaune pointed to provisions in the STOP CSAM Act that would require platforms to include detailed, uniform data in Cyber Tipline reports to aid law enforcement.

    Sen. Blackburn also asked about the need for a National Human Trafficking Database to track trends across states. Mr. Tanagho supported the initiative, explaining that data collection is essential for measuring crime prevalence and the effectiveness of anti-trafficking programs.

  • Ranking Member Durbin emphasized the privacy protections in the STOP CSAM Act and asked why stronger enforcement provisions were necessary. Mr. Schiller explained that prosecutors often struggle to redact victim information, and defendants exploit legal loopholes to traumatize survivors further, making enforcement mechanisms critical.

    Ranking Member Durbin inquired about the importance of privacy protections continuing into adulthood for child exploitation survivors. Mr. Pizzuro confirmed that law enforcement struggles to obtain adequate information from tech companies, and STOP CSAM would improve investigation quality. Mr. Schiller responded that some companies were improving their privacy protections but only due to pressure from law enforcement rather than proactive responsibility.

  • Sen. Blumenthal discussed the Kids Online Safety Act (KOSA) and the need for a duty of care for tech platforms to prevent child exploitation. Mr. Pizzuro and Ms. DeLaune agreed, emphasizing that tech companies should verify user ages and prevent grooming behavior instead of relying on voluntary compliance. Sen. Blumenthal also asked about AI-generated CSAM, and Ms. DeLaune warned that generative AI is accelerating the production of child abuse content, with reports increasing by 1,300% in one year.

  • Sen. Klobuchar asked whether platforms should be required to immediately remove non-consensual intimate images. Ms. Sines strongly agreed, saying that tech companies currently delay or refuse to act, worsening victim trauma.

    Sen. Klobuchar then asked about the importance of codifying a notice-and-takedown law, such as the Take It Down Act. Mr. Tanagho argued that survivors should not be responsible for constantly requesting the removal of their own abuse materials, and tech companies should proactively prevent re-uploads.

    Sen. Klobuchar asked about the biggest prosecutorial challenges in child exploitation cases. Mr. Schiller said that while current criminal statutes are strong, STOP CSAM fills key gaps, such as funding for guardians ad litem and fiduciaries to protect minor and incapacitated victims. Sen. Klobuchar then asked why Section 230 reforms were necessary. Mr. Pizzuro responded that without financial penalties, companies will continue prioritizing profit over child safety.

  • Chairman Hawley asked about the role of Child Rescue Coalition’s technology in combating CSAM. Mr. Schiller described how their free software helps law enforcement track offenders and collaborate with NCMEC to streamline investigations.

    Chairman Hawley then revisited Ms. Sines' experience with Facebook, asking how long it took for her exploitative images to be removed. Ms. Sines said it initially took over a day but was expedited once she found a direct contact at Facebook. Chairman Hawley highlighted how the STOP CSAM Act would establish an independent board to ensure the quick removal of such content.

  • Sen. Britt focused on age verification, criticizing platforms for failing to verify users' ages and co-sponsoring the Protecting Kids on Social Media Act. Mr. Pizzuro explained that phones already contain age data, meaning platforms could easily implement age restrictions but choose not to. Sen. Britt concluded that Congress should mandate these protections, rather than leaving the decision to tech companies.


© 2024 Nimitz Tech

415 New Jersey Ave SE, Unit 3
Washington, DC 20003, United States of America
