
Nimitz Tech Hearing 2-19-25 - Senate Judiciary

NIMITZ TECH NEWS FLASH

Children’s Safety in the Digital Era: Strengthening Protections and Addressing Legal Gaps

Senate Committee on the Judiciary

February 19, 2025

HEARING INFORMATION

Witnesses and Written Testimony (linked):


HEARING HIGHLIGHTS

Tech Industry Accountability for Child Safety

The hearing highlighted the need for tech companies to be held accountable for the role their platforms play in exposing children to harm. Concerns were raised about the insufficient enforcement of safeguards on platforms that are widely used by minors, such as social media sites and messaging apps. Testimony pointed out that companies often deny responsibility for the harm caused by their products, especially when predators target children online. Experts called for clearer laws that would allow for civil liability against these companies, pushing for accountability similar to other industries that produce harmful or dangerous products.

Legal and Structural Challenges of Section 230

A major topic discussed was Section 230 of the Communications Decency Act, which currently shields online platforms from liability for content posted by users. This law has become a point of contention, particularly in relation to the platforms' role in allowing harmful content to proliferate, including child sexual abuse material (CSAM). Witnesses argued that Section 230 creates a significant legal loophole, allowing large tech companies to avoid responsibility for the damage their platforms cause, even when they are aware of the risks. Reforming or repealing Section 230 was seen as necessary for ensuring that platforms face proper legal consequences for their failures to protect users.

Emerging Threats from AI and Chatbots

As artificial intelligence (AI) technology continues to evolve, the risks posed to minors by AI-driven chatbots became a focal point in the discussion. New chatbot apps have been associated with harmful interactions that expose children to inappropriate or even dangerous conversations. Some cases have involved chatbots encouraging children to engage in harmful behavior, leading to suicides in at least two reported instances. This raises urgent concerns about the lack of regulation in rapidly advancing technologies and the need for more oversight to prevent AI platforms from becoming a breeding ground for exploitation and harm.

IN THEIR WORDS

“If I had a storage facility and I stored only guns in there...and if I told you that you had to have the digital ID to get into that locker, you'd think that's ludicrous, but that's exactly the way that we treat CSAM...If you're housing CSAM, you should be held responsible, but nothing is going to change until we open up civil liability.”

- Mr. Guffey

“If you have designed a product where you are exposing children to predators and you can't stop that from happening, then it's a defective product.”

 - Ms. Goldberg

“What Mr. Guffey described, I think, quite well, as institutions that are the richest since the inception of man, shouldn't be bound by the law.”

 - Sen. Whitehouse

SUMMARY OF OPENING STATEMENTS FROM THE COMMITTEE AND SUBCOMMITTEE

  • Chairman Grassley opened by addressing the growing risks to children in today's digital world, emphasizing that the dangers of online exploitation and harm have escalated over the years. He pointed to the alarming increase in CSAM reports and the use of generative AI in exploiting children. Grassley expressed frustration that Congress has failed to enact significant legislation and criticized the inaction of tech companies despite their large profits. He also acknowledged some progress in developing AI tools for child safety and stressed the importance of continued bipartisan efforts to improve legislation to protect children online.

  • Ranking Member Durbin emphasized the need for stronger legislation, particularly in light of tech companies' lobbying efforts that have prevented key bills from becoming law. Durbin criticized the tech industry's lack of accountability and stressed the urgency of moving forward with bills like the Stop CSAM Act, which would allow families to hold tech companies accountable. He underscored the growing danger of online exploitation and the need for swift action.

  • Chairwoman Blackburn echoed concerns about the lack of meaningful change despite previous hearings, citing the ongoing dangers children face online, such as exploitation and exposure to harmful content. She expressed frustration with the tech industry's failure to enforce real safety measures and the continued absence of legal frameworks to protect children in the digital space. Blackburn highlighted the bipartisan support for the Kids Online Safety Act and stressed the need to pass such bills to address the threats children face online, including the tragic consequences of drug trafficking and exploitation via platforms like Snapchat.

  • Ranking Member Klobuchar expressed frustration over the roadblocks on child safety issues preventing progress, particularly due to tech companies' influence and lobbying efforts. She highlighted the tragic stories of children harmed by social media, including the rise in mental health issues and exploitation. Klobuchar stressed the importance of enacting comprehensive legislation like the Kids Online Safety Act and reforms to Section 230, which provides legal immunity to tech companies. She emphasized the need for more safety regulations to protect children and hold tech platforms accountable for their role in online harm.

SUMMARY OF WITNESS STATEMENTS

  • Mr. Guffey began his testimony by expressing gratitude for the opportunity to speak and sharing his personal journey of advocating for child protection online. He recounted the tragic suicide of his son, Gavin, in 2022, which he linked to harmful online interactions on Instagram. Guffey explained that Gavin had been contacted by a predator on the platform, and although Instagram removed the account that contacted Gavin, other profiles used by the predator remained active. This led to additional threats against Guffey’s family. He emphasized his commitment to child protection, having passed legislation in South Carolina and worked on similar initiatives in other states. Guffey criticized big tech companies for their lack of accountability, likening them to "big tobacco" and urging lawmakers to prioritize future generations over political careers. He concluded by calling for stronger action against platforms that profit from advertising to children.

  • Ms. Goldberg outlined several cases she has worked on, each highlighting the dangers of online platforms. She spoke about a case against Snap where children were matched with drug dealers who sold counterfeit fentanyl-laced pills, leading to their deaths. Goldberg also discussed a case involving a 15-year-old boy who was exploited on Grindr by pedophiles. In another case, a 13-year-old girl was abducted and abused after being lured by an adult posing as a teenager on BandLab, a music-sharing site. She criticized platforms like BandLab for failing to cooperate with law enforcement, prioritizing privacy over safety. Goldberg also represented the family of a teen who was encouraged to commit suicide by a website Amazon continued to promote, despite warnings. She called for stronger legislation, including the sunsetting of Section 230, and for civil remedies for families harmed by tech platforms. Lastly, she recounted how discovery in one of her cases, which produced 60,000 documents exposing the platform's role in facilitating abuse, led to Omegle shutting down. Goldberg urged lawmakers to hold tech companies accountable for the harm caused by their platforms.

  • Professor Leary criticized Section 230 of the Communications Decency Act, which he argued has evolved into an immunity regime, allowing tech companies to avoid responsibility for harm. Leary stated that the law was originally meant to incentivize protection but instead has led to widespread harm due to systemic lobbying efforts by tech companies. He emphasized that victims and states are effectively shut out of the courtroom because Section 230 prevents cases from progressing. Leary called for reforms that would maintain protections for good Samaritans but eliminate the broad immunity that encourages harm. He also referred to Justice Thomas’s warning about Section 230, stressing the urgent need for action given the ongoing harm to children.

  • Mr. Pizzuro emphasized the need for decisive legislative action to address child exploitation, noting that the tech industry’s voluntary cooperation with law enforcement is minimal. He described how offenders are increasingly exploiting children, often using AI to groom at scale and manipulate victims. Pizzuro criticized the lack of meaningful response from tech companies, including poor reporting and inadequate safeguards. He pointed out the overwhelming number of cyber tips and underfunded law enforcement efforts, urging Congress to pass the Protect Our Children Reauthorization Act of 2025. He concluded by emphasizing that legislative action is overdue, and without it, offenders will continue to exploit children.

  • Mr. Balkam discussed the importance of a three-pronged approach to online safety: public policy, industry best practices, and digital parenting. He stressed the importance of empowering children with the tools and knowledge to navigate the digital world safely. Balkam criticized blanket bans on social media for children, arguing they deprive kids of positive online experiences and are difficult to enforce. He called for thoughtful restrictions and improved collaboration among tech companies to create better online safety tools. Balkam also urged Congress to pass a comprehensive federal privacy law, support targeted bills like the Take It Down Act, and prioritize digital literacy programs to help build resilience in young users.

SUMMARY OF Q&A

  • Chairman Grassley asked Mr. Pizzuro to explain the challenges AI generators of CSAM pose for law enforcement and tech companies. Mr. Pizzuro explained that it is difficult to distinguish AI-generated images from real ones without forensic software, and highlighted how AI can manipulate images, making it harder to identify real victims.

    The Chairman asked how to reform Section 230 in light of the current online ecosystem. Prof. Leary recommended keeping the Good Samaritan provision of Section 230 while removing the immunity provision that incentivizes harm, noting that tech companies no longer need such broad protections.

    The Chairman asked what steps tech companies could take today to protect children online, aside from reforming Section 230. Mr. Pizzuro suggested that tech companies can track IP addresses, block problematic users, and prevent the creation of new aliases, all actions within their control.

    Chairman Grassley asked if a recklessness standard or a knowing standard for imposing liability on platforms should be pursued. Prof. Leary advocated for a recklessness standard, arguing that it is more appropriate than a knowing standard and clarifying that recklessness is a high bar requiring conscious disregard of substantial risks.

  • Ranking Member Durbin acknowledged Representative Guffey's testimony, connecting the discussion to broader public safety issues like fentanyl, and asked Professor Leary to expound on how Section 230’s immunity precludes evidence gathering and discovery, limiting understanding of tech companies’ actions. Prof. Leary explained that because Section 230 allows platforms to dismiss cases before discovery, public knowledge of tech companies’ practices comes mainly from Congressional investigations, as seen with the Backpage case.

    Ranking Member Durbin concluded by comparing the current legislative challenges to his past experience introducing a bill to ban smoking on airplanes, noting that compromise is necessary to make progress on these issues.

  • Sen. Lee asked if app stores like Google Play and Apple's App Store should be held legally accountable for allowing minors access to harmful content. Ms. Goldberg affirmed that app stores should be held accountable, equating them to sellers who can be liable for offering harmful products to minors.

    The Senator further inquired if this liability would apply if app stores sold harmful content in a similar manner to physical objects unsuitable for minors. Ms. Goldberg confirmed that app stores, like physical sellers, should be held liable for selling unsuitable products with knowledge or reckless disregard for their harmful nature.

    Sen. Lee asked Professor Leary if requiring age verification on pornographic websites and imposing penalties for hosting non-consensual content would help protect children. Prof. Leary agreed that age verification could help prevent minors' exposure to pornography but emphasized the need for a multi-tiered approach to address the issue more effectively.

  • Ranking Member Klobuchar asked Mr. Guffey to elaborate on why the threat of non-consensual image distribution is particularly damaging. Mr. Guffey explained that the threat of exposure is more harmful than the actual sharing of images, as it involves deep vulnerability and the risk of lasting shame.

    The Ranking Member asked why this should be addressed at the federal level. Mr. Guffey explained that states are struggling with differing laws, and a uniform federal law is needed to address the issue more effectively.

    Ranking Member Klobuchar compared this to uniform airplane seat regulations and highlighted the need for a national approach. She asked Mr. Pizzuro to explain why Congress should pass bills like the Shield Act and the Take It Down Act to support federal law enforcement. Mr. Pizzuro emphasized that there are gaps between state and federal laws, particularly in rural areas, and that federal laws like the Shield Act would help close these gaps for more effective prosecution.

    The Ranking Member noted the connection between social media and fentanyl trafficking, asking how online platforms' algorithmic recommendations contribute to drug sales. Mr. Pizzuro pointed out that the lack of transparency in algorithms and their focus on pushing content, including dangerous substances, puts children at risk.

    Ranking Member Klobuchar asked Ms. Goldberg to discuss the challenges she faces representing victims of revenge porn and why federal laws like the Shield Act and Take It Down Act would make a difference. Ms. Goldberg explained that early on, only a few states had laws against revenge porn, and platforms played a key role in distributing such content at scale, making uniform federal laws essential to holding platforms accountable.

  • Sen. Hawley began by asking Mr. Pizzuro about his extensive experience in the anti-exploitation space, questioning whether there is more CSAM online now compared to previous years. Mr. Pizzuro confirmed that the amount of CSAM has increased drastically, with an overwhelming surge in recent years.

    The Senator cited statistics showing a massive increase in the amount of CSAM uploaded online and the number of exploitation reports, questioning whether parents can sue platforms that host such content. Mr. Pizzuro responded that while parents can sue, they are unlikely to succeed due to legal limitations.

    The Senator further pressed, asking if a parent can hold companies accountable for hosting abusive content after reporting it to the platform. Mr. Pizzuro clarified that it is unlikely the parent would win a lawsuit under current law, highlighting a key issue with the legal system that allows companies to escape accountability.

    Sen. Hawley criticized the current legal framework, referencing a previous case where a parent’s attempt to sue Snapchat over drug sales to their child was blocked by Section 230, and pointed out that fines like the $5 billion FTC penalty against Facebook don’t seem to deter these companies. He argued that companies fear the public and jury accountability more than regulatory fines or lawsuits, stressing the importance of enabling parents to sue.

  • Sen. Hirono asked Prof. Leary about the challenges states face in pursuing child exploitation cases due to Section 230 and whether federal legislation should allow states to enforce their own laws. Prof. Leary confirmed that Section 230 limits state enforcement of child protection laws, and he supported federal legislation to allow states to take action.

    The Senator then asked Mr. Guffey if he agreed that states need the tools to enforce their own laws. Mr. Guffey agreed that states need the ability to take action in child protection cases.

    The Senator asked about a case involving the dating app Grindr, which used child models in ads and failed to implement age verification, and how Section 230 impacted the case. Ms. Goldberg explained that the case was dismissed due to Section 230 immunity, despite the platform’s failure to verify ages and address trafficking issues. The court applied a high knowledge standard that prevented the lawsuit from moving forward.

    Sen. Hirono expressed support for removing Section 230 immunity to allow victims to pursue legal remedies, acknowledging the need to address unintended consequences while emphasizing that the problem is growing.

  • Sen. Kennedy asked Mr. Pizzuro about the role of social media in childhood, observing that it has become a significant part of childhood and that social media platforms have become hostile environments, including for sexual exploitation. Mr. Pizzuro agreed, acknowledging the harmful nature of many social media spaces.

    The Senator asked if social media has lowered the cost of being a predator, to which Mr. Pizzuro confirmed that it has facilitated easier access to exploit children.

    Sen. Kennedy discussed the National Center for Missing and Exploited Children’s CyberTipline and asked whether social media companies are required to report instances of child sexual exploitation. Mr. Pizzuro confirmed that social media companies are required to report these instances, but they must first identify them.

    The Senator questioned whether the companies are compensated for looking for exploitation, to which Mr. Pizzuro clarified they are not.

    Sen. Kennedy continued by highlighting the companies' economic interests in attracting more users, which conflicts with their duty to look for exploitation. He also asked if there are punishments if social media companies fail to report. Mr. Pizzuro explained that there is no consistent reporting across companies and some fail to report at all.

    The Senator asked about the "safer" program that uses AI to detect child exploitation, querying whether all social media platforms use this tool. Mr. Pizzuro stated he didn't know if all platforms use this tool but emphasized that they should.

    Sen. Kennedy pointed out that companies are not required to use such technology and expressed the need for action. He then asked if Mr. Pizzuro found it ironic that tech companies, which promised to create a utopia, have instead caused more harm. Mr. Pizzuro agreed, emphasizing the irony given the large profits these companies make.

  • Sen. Blumenthal asked Mr. Guffey why he believes the free speech arguments against the Kids Online Safety Act (KOSA) are false. Mr. Guffey explained that the opposition to KOSA is driven by financial interests, with tech companies spreading misleading narratives to protect their profits. He criticized politicians who prioritize their own re-election over the next generation's well-being.

    The Senator agreed with Mr. Guffey’s assessment, highlighting the bipartisan support for KOSA in the Senate. He then turned to Professor Leary to expand on Mr. Guffey’s point about the false free speech narrative, asking him to address KOSA's impact on free speech. Prof. Leary responded that concerns about KOSA infringing on free speech are misplaced, noting that tech companies have previously made similar claims in opposition to past legislation without the predicted negative impact on free speech. He explained that KOSA addresses conduct, not content, and therefore does not infringe on free speech.

    Sen. Blumenthal emphasized that KOSA, like other safety regulations, is about regulating conduct (e.g., product design), not blocking speech, and noted that the federal government already regulates the safety of many products without infringing on free speech.

  • Chairwoman Blackburn thanked Professor Leary for his op-ed, agreeing with his argument that the issue at hand is about product design and conduct, not free speech. She criticized tech companies like Meta and Google for valuing children at $270, dehumanizing them as mere products. Mr. Pizzuro discussed the importance of a national human trafficking database, stating that a comprehensive data collection would help identify trafficking trends across the U.S. and allow for more effective responses.

    The Chairwoman asked about the challenges of AI-generated CSAM, especially with law enforcement struggling to differentiate between real and AI-generated images. Mr. Pizzuro explained that current technology makes it difficult for investigators to distinguish between real and AI-manipulated images. He highlighted the danger of AI being used to create explicit material from publicly available images.

    The Chairwoman mentioned the No Fakes Act, which targets AI-generated content, and expressed hope that it could address AI-generated CSAM. Mr. Guffey praised the No Fakes Act, agreeing with its approach to protecting individuals through their name, image, and likeness. He emphasized that this approach would safeguard citizens better than focusing solely on the content of images.

    Chairwoman Blackburn asked if provisions of the No Fakes Act could be incorporated into Gavin's Law at the state level. Mr. Guffey explained that it would be difficult to amend Gavin’s Law in South Carolina to include the provisions of the No Fakes Act, so separate bills would be needed.

  • Sen. Schiff asked Prof. Leary whether Section 230 of the Communications Decency Act should be repealed, modified, or narrowed. Prof. Leary recommended keeping Section 230’s (c)(2) language, which protects platforms when they remove harmful content, but removing the (c)(1) language that grants near-absolute immunity. He suggested that holding platforms accountable for hosting harmful material would also be a key reform.

    The Senator asked about the standards needed to ensure platforms are held accountable and the appropriate legal framework for obtaining discovery in cases involving tech companies. Ms. Goldberg agreed that Section 230 should be abolished and emphasized the need for a reasonable standard for parents to access discovery in cases involving harmful content. She proposed using a negligence standard to help plaintiffs overcome dismissal motions.

    Sen. Schiff asked how to define a new standard of care, given the tech companies' vast resources. He acknowledged that while companies could not eliminate the problem entirely, they could make significant progress if they applied their technological capabilities to the issue. Ms. Goldberg suggested that strict liability should apply to tech companies if their products cause harm. This would allow victims to sue without needing to prove the companies' duty, making it easier for plaintiffs to seek justice.

  • Sen. Moody expressed her concerns as both a former attorney general and a mother of a teenager. She highlighted the difficulties parents face in understanding technology and protecting their children online. She discussed the alarming frequency of predators gaining access to children through online platforms and asked the panel what lawmakers could do to stop predators from accessing children. Mr. Pizzuro suggested that a device-based age verification system would help prevent children from accessing harmful content and predators from targeting them. He also recommended simplifying parental controls, making it easier for parents to manage online access on a device level. Mr. Guffey used an analogy to emphasize that tech companies should be held accountable for hosting CSAM, just as a storage facility would be responsible for keeping illegal items. He argued that civil liability should be introduced for these companies, as they are among the wealthiest in the world, yet remain immune from accountability. Ms. Goldberg explained that if a product exposes children to predators and the company can't prevent it, the product is defective. She advocated for allowing parents or victims to sue companies by simply showing they knew about the problem and failed to address it.

    The Senator asked about cases where parents have tried to force platforms to remove harmful material and were met with refusals. Ms. Goldberg confirmed that platforms often dismiss these cases, claiming they were unaware of the specific incident or didn’t intend to harm the individual child.

    Sen. Moody then asked Mr. Pizzuro what he would recommend to prevent predators from accessing children online. Mr. Pizzuro reiterated his suggestion of implementing age verification systems and streamlining parental control options to block harmful content at the device level.

  • Sen. Whitehouse expressed concern over tech companies evading responsibility despite their immense wealth and lack of accountability in protecting children. He asked Ms. Goldberg to elaborate on when Section 230 defense kicks in and how it affects discovery in lawsuits against platforms. Ms. Goldberg explained that platforms typically file motions to dismiss soon after a lawsuit is filed, citing Section 230 to argue they are merely publishing platforms and not responsible for the content. As a result, discovery is often denied, preventing victims from obtaining critical information about the extent of the platform's involvement in the harm.

    Sen. Whitehouse pointed out that Section 230’s dismissal motion not only allows platforms to evade responsibility but also hides the truth of what happened. Ms. Goldberg agreed, emphasizing that discovery is even more intimidating to tech companies than facing a jury. She shared an example of Omegle shutting down after facing discovery, where 60,000 documents showed the platform’s involvement in child sexual abuse cases.

  • Sen. Padilla asked for recommendations on how the committee should address the risks AI and chatbot technologies pose to minors. Mr. Guffey admitted that he was not an expert on AI and chatbot technologies but emphasized the importance of holding tech companies accountable for the products they create. He suggested treating online services as products rather than services, which would allow them to be subject to consumer protection laws. Mr. Balkam suggested considering the international context of AI technology and referred to a shift in focus at recent AI summits, where the emphasis has moved from making AI products safe to rapidly expanding their use. He urged the committee to bring the focus back to safety and responsibility. Ms. Goldberg shared that her friend, Matthew Bergman, was litigating a case against an AI chatbot that encouraged harmful behavior, leading to a child's suicide. She mentioned that courts might treat the AI chatbot’s speech as the company's speech, which could result in Section 230 challenges preventing lawsuits against companies like Google.

    The Senator asked how to make standardizing parental controls across platforms a reality. Mr. Balkam compared the situation to the automobile industry in the 1950s and 1960s, where different car manufacturers had different control layouts. He suggested that, like car manufacturers, the tech industry could voluntarily standardize parental controls or be coerced into doing so. He emphasized the need for clear, standardized safety tools to protect kids and enable teens to stay private, report issues, and block harmful content.

    Sen. Padilla discussed the potential for government-imposed industry standards or voluntary industry collaboration, stressing that accountability would be key to ensuring the effectiveness of any standardized controls.

ADD TO THE NIMITZ NETWORK

Know someone else who would enjoy our updates? Feel free to forward them this email and have them subscribe here.


© 2024 Nimitz Tech

415 New Jersey Ave SE, Unit 3
Washington, DC 20003, United States of America
