
Nimitz Tech Hearing 9-9-25 Senate Judiciary

Whistleblower Allegations that Meta Buried Child Safety Research

NIMITZ TECH NEWS FLASH

Hidden Harms: Examining Whistleblower Allegations that Meta Buried Child Safety Research

Senate Judiciary Committee, Subcommittee on Privacy, Technology, and the Law

September 9, 2025 (recording linked here)

HEARING INFORMATION

Witnesses and Written Testimony (linked):

HEARING HIGHLIGHTS

Virtual Reality as Real Harm

Witnesses emphasized that experiences in virtual reality are perceived by children as real, both physiologically and psychologically. Because VR is immersive and embodied, harassment, sexual solicitation, or assault in these spaces can cause trauma equivalent to real-world experiences. This raised concern that children subjected to inappropriate or abusive encounters in VR are enduring genuine harm, not just digital interactions.

Suppression and Manipulation of Research

Both whistleblowers testified that Meta systematically suppressed, altered, and erased research findings that revealed harms to children. Legal teams oversaw studies, dictated language, and at times required researchers to delete data considered “too sensitive.” Third-party vendors were used to collect data so it could be destroyed if necessary, ensuring harmful findings would not create liability. This manipulation prevented accurate understanding of risks and allowed Meta to publicly claim safety progress while hiding damaging evidence.

Underage Users and COPPA Noncompliance

Evidence showed that large numbers of children under the age of 13 were active on Meta’s VR platforms, despite the company’s public insistence otherwise. Parents often created adult accounts for their children, or children created their own to access restricted content. Meta avoided tracking accurate age data, which allowed it to sidestep compliance with the Children’s Online Privacy Protection Act. Witnesses testified that leadership, including senior executives, was aware of widespread underage use but ignored it because acknowledging it would reduce engagement metrics.

IN THEIR WORDS

“Let’s be clear, virtual reality is reality. These harms are real, and this abuse happens every single day, every day on Meta’s reality platforms.”

- Chair Blackburn

“Meta is incapable of change without being forced by Congress… They have, frankly, had unearned opportunities in order to correct their behavior, and they have not.”

- Dr. Sattizahn

“I am here to tell you today that Meta has changed for the worse; Meta has spent the time and money it could have spent making its products safer, shielding itself instead.”

- Ms. Savage

SUMMARY OF OPENING STATEMENTS FROM THE FULL COMMITTEE AND SUBCOMMITTEE

  • Subcommittee Chair Blackburn thanked the witnesses for their bravery and stated that Meta had knowingly allowed child exploitation and abuse on its platforms. She recalled earlier whistleblower testimony showing Meta prioritized profit over children’s safety and condemned the company’s suppression of negative research. She highlighted that virtual reality harms children because abuse in that environment is experienced as real trauma. Blackburn pledged to pass the bipartisan Kids Online Safety Act and hold Big Tech accountable.

  • Subcommittee Ranking Member Klobuchar thanked the whistleblowers and emphasized that Meta used algorithms that promoted harmful content, facilitated bullying, and even enabled drug sales. She recalled Frances Haugen’s 2021 testimony that revealed Meta’s products harmed users and said the company instead sought plausible deniability by blocking and deleting safety research. She warned that Meta exploited children in virtual reality spaces, including lowering its official age limits, while prioritizing profits from kids’ data. Klobuchar urged bipartisan action to pass the Kids Online Safety Act and repeal Section 230 to ensure accountability.

  • Full Committee Chair Grassley praised Blackburn’s leadership and affirmed his long-standing support for whistleblowers in both government and the private sector. He cited past hearings, including one on Twitter that revealed national security concerns, and noted recent letters to Meta regarding ads targeted at children. Grassley alleged that whistleblowers Jason Sattizahn and Cayce Savage suffered retaliation after raising legal compliance concerns, with one being fired and the other pressured to avoid risks to the company. He thanked the witnesses for their courage and pledged to continue investigations into Big Tech misconduct.

  • Sen. Blumenthal thanked Blackburn for her dedication and called the whistleblowers “truth tellers” who revealed how Meta suppressed research and prioritized profits over children’s safety. He recalled prior whistleblower disclosures, including Frances Haugen’s testimony on Instagram’s harm to teen mental health and Arturo Béjar’s testimony on teens’ dangerous experiences. He accused Meta of obstructing research, destroying harmful data, and betraying parents by knowingly covering up abuses. Blumenthal compared Meta to Big Tobacco, declared the company had no conscience, and vowed to keep fighting until the Kids Online Safety Act became law.

SUMMARY OF WITNESS STATEMENT

  • Dr. Sattizahn testified that Meta systematically manipulated and suppressed research—especially on virtual reality (VR)—to protect engagement metrics and profits at the expense of user safety, including children. He recounted work from 2018–2024 showing Marketplace-related harms, false statements to Congress, and post-Haugen lockdowns that restricted sharing, altered methods, and erased negative data. After moving to Reality Labs in 2022, he said legal teams controlled research and demanded deletion of evidence showing underage users in Germany were being solicited for sex acts and that women suffered emotional harm. He concluded that Meta was incapable of change without congressional pressure.

  • Ms. Savage stated that, as a UX researcher at Meta from 2019–2023, she saw the company prioritize engagement over child safety across products and respond to scrutiny with superficial features rather than real protections. She said Meta knowingly ignored widespread underage use of VR to avoid COPPA obligations, while VR’s immersive nature intensified harms such as bullying, sexual assault, solicitation for nude images, and exposure to adult content. She testified that legal gatekeepers censored or blocked research, discouraged measuring the prevalence of these harms, and that Meta avoided collecting data so it could claim deniability. She concluded that Meta had grown worse since earlier disclosures and cared more about profits and growth in emerging technologies than the emotional and physical safety of children.

SUMMARY OF KEY Q&A

  • Chair Blackburn asked Ms. Savage to explain VR harms, and Savage said VR was immersive, embodied, and social, making interactions with strangers feel real and often unsafe. Blackburn added that most of the users children encountered in VR were strangers to them, and Savage confirmed this. Dr. Sattizahn said harassment included spatial audio of sex acts, and Blackburn noted this caused children to react as if abused in real life, which Savage confirmed.

    Blackburn asked Dr. Sattizahn whether it surprised him that Meta allowed AI chatbots to engage in sensual conversations with children, and he said no, explaining that Meta could not manage safety even for ranking algorithms it fully controlled. Blackburn asked if Meta intensified algorithms to compete with TikTok, and Sattizahn replied that engineers prioritized engagement over examining safety variables, showing Meta could not handle safety in AI either.

  • Ranking Member Klobuchar asked how Meta’s research program changed after Haugen’s 2021 disclosures, and Dr. Sattizahn said legal and management imposed funnel-like controls, surveilled and edited work, demanded deletions, altered findings, and siloed safety research. Klobuchar asked a series of yes/no questions, and Sattizahn confirmed that Meta halted child-safety research, restricted data, altered designs, modified reports, and deleted evidence of harm.

    Klobuchar asked about Project Horton, and Ms. Savage said it was approved and then canceled without explanation, a decision only Zuckerberg could have overruled. She testified that acknowledging true user ages would have forced Meta to shut down accounts and lose engagement, that children could not distinguish fantasy from reality, that VR paired them with strangers, and that under-13 use was prevalent and leadership knew.

    Klobuchar asked if Meta knowingly allowed AI chatbots to risk children, and Dr. Sattizahn said yes, adding Meta had long targeted youth and was not doing more to protect them in AI than in VR.

  • Sen. Hawley asked if under-13 use was widespread and visible to leadership, and Ms. Savage said yes, including at the C-suite level. Dr. Sattizahn said legal directors monitored employees’ internal posts about underage VR use, showing Zuckerberg must have known. Hawley noted Zuckerberg’s 2024 testimony denying under-13 users, and Savage said account age data was unreliable while user behavior contradicted his claim. Sattizahn called Meta’s stance a “lie by avoidance,” confirmed evidence was erased and research manipulated, and explained that sensitive work was siloed to create a false paper trail.

    Hawley pressed why Meta would keep children on the platform, and Savage said kids drove household adoption of VR, which meant profits. Hawley asked if profits outweighed safety, and Savage confirmed yes, adding that she had identified Roblox on Meta’s VR platform being used by pedophile rings to run strip clubs and pay children in convertible currency; she flagged this, but Meta continued to host the app.

  • Sen. Padilla asked why parental supervision controls in Meta’s VR had such low adoption rates and what could be done to improve them. Dr. Sattizahn said internal research dating back to 2018 showed parental controls were insufficient to keep children safe and adoption rates of only 2–10% reflected that they were not built to be valuable for users. Ms. Savage added that the tools failed largely because parents lacked education, were unfamiliar with VR risks, and often did not use headsets themselves, leaving children unsupervised despite repeated warnings to Meta leadership.

  • Sen. Moody said parents were the first generation raising children in the social media era without congressional guardrails and asked if Meta knew parents were not using parental controls, which Ms. Savage and Dr. Sattizahn confirmed. Moody compared “stranger danger” to predators reaching children in their bedrooms, asked if Meta knew underage children were being propositioned, and Savage and Sattizahn said yes. Moody pressed whether lawyers instructed researchers to frame reports under privilege to shield findings, and Savage and Sattizahn confirmed. Moody compared this to covering up abuse in a warehouse and said online space should not excuse it, while Savage and Sattizahn rejected the practice. Moody cited reports that lawyers told researchers to avoid words like “non-compliant” or “illegal,” and Savage and Sattizahn confirmed. Moody said attorney-client privilege did not protect complicity, and Savage confirmed they were instructed to comply, while Sattizahn added that lawyers threatened their jobs if they resisted.

  • Sen. Coons asked about the importance of independent research, and Dr. Sattizahn said Meta would not change internally and only external oversight could prevent altered or erased findings. Coons asked what legislation should ensure, and Ms. Savage said access to raw data and early collection methods was critical. Coons asked how Meta discouraged whistleblowers, and Savage said it weaponized narrative by labeling Haugen’s disclosure a “leak” and portraying it as harmful to research. Sattizahn agreed.

  • Sen. Schiff asked what information Meta erased, and Dr. Sattizahn said legal barred survey questions about psychological harm, forced deletions of slides, and erased back-end data to avoid audits. Schiff asked what justification was given, and Sattizahn said legal explicitly said the data was too risky because audits would show Meta knew of harms. Schiff asked if anyone pushed back, and Sattizahn said some researchers raised concerns but were ignored, silenced, or left out of fear. Schiff asked how management rejected concerns, and Sattizahn said one manager admitted he might be right but insisted legal’s orders had to be followed. Schiff asked if this reflected top leadership, and Sattizahn said even CTO Andrew Bosworth personally argued with researchers to defend legal’s restrictions. Schiff asked if this culture existed at other companies, and Sattizahn said it was uniquely harmful at Meta.

  • Chair Blackburn asked whether Meta’s “funnel of manipulation” put shielding profits from legal risk ahead of safety and whether third-party vendors were used to erase “too sensitive” data; Dr. Sattizahn said legal required outside vendors so findings could be deleted, and he confirmed that Oculus lacked age ratings while he tried to correct age data.

    Blackburn asked about research showing children sidestepped age rules and Meta’s pushback, and Ms. Savage said parents often created adult accounts or kids made their own to access restricted content because Meta failed to educate on age-appropriate accounts.

    Blackburn asked for the Germany case on the record, and Dr. Sattizahn said a family interview revealed an eight- or nine-year-old was sexually solicited in VR and legal later directed deletion of notes and recordings from the study.

  • Sen. Blumenthal said Meta offloaded safety to parents despite knowing controls were insufficient and tied this to a duty of care in the Kids Online Safety Act, and Ms. Savage said Meta’s “age assurance” study was shut down and that users’ distrust (“brand tax”) undermined truthful age collection. Blumenthal asked whether leadership had contacted the witnesses, and Ms. Savage said not directly beyond public responses.

  • Sen. Hawley asked the prevalence of minors’ exposure to sexual content in VR, and Ms. Savage estimated any child in social VR would encounter inappropriate sexual content.

    Hawley cited internal chatbot guidelines allowing romantic or sensual conversations with children and asked if Meta’s assurances gave comfort, and Dr. Sattizahn said in no way, shape, or form.

    Hawley asked about their tenures and hindsight, and Ms. Savage said she joined hoping to improve the company while Dr. Sattizahn said he would have taken even better documentation.

    Hawley asked if Meta was a force for good, and Ms. Savage said she did not see how it could be while Dr. Sattizahn said the company was “aggressively ambivalent to people.”

  • Ranking Member Klobuchar asked how Meta’s dominance (about 73% VR/MR share) magnified harm when it suppressed youth-safety research, and Dr. Sattizahn said Meta felt too big to fail, vacuumed up talent, imposed unethical rules, and stifled change across the ecosystem.

    Klobuchar pressed on integration of social content, and Dr. Sattizahn said Meta’s social-first VR uniquely merged Instagram and Facebook data into VR, creating an “everything” platform with unclear rules.

    Klobuchar asked if Meta stopped tracking unwanted sexual advances despite findings that 36% of users experienced them often or always, and Dr. Sattizahn said yes, adding that results were obscured across studies.

    Klobuchar asked about prevalence and mitigation, and Dr. Sattizahn recalled that roughly 10–20% of users experienced molestation, groping, or solicitation (with higher rates for women) and said management claimed safety was underfunded even for basic education.

    Klobuchar asked how online grooming in VR translates to real-world harm, and Ms. Savage said children quickly form trust online, readily believe age claims, and groomers follow playbooks to exploit that trust.

    Klobuchar asked why detection is uniquely hard in VR, and Ms. Savage said there is no log, the headset hides interactions in plain sight, and kids may normalize solicitation or not report it while still learning boundaries.

    Klobuchar asked whether victims should be able to sue and if Meta would change without courtroom accountability, and Dr. Sattizahn said external regulation and financial penalties were necessary while Ms. Savage said online harm was harm full stop and parents should be able to respond in court.

  • Sen. Blumenthal noted Meta’s public denial and cited reporting that lawyers coached surveys to avoid under-13 disclosures, and he said he and Chair Blackburn would demand Meta’s research, policies, and practices to test its claims.

  • Chair Blackburn asked if bonuses were tied to engagement rather than safety, and Dr. Sattizahn said leadership required safety work to be measured solely by engagement, which was a poor proxy for safety.

    Blackburn asked if Zuckerberg was involved in suppressing child-safety research, and Ms. Savage invited context while Dr. Sattizahn said “Zuck reviews” ceased after 2021 so leadership could avoid a trail showing he saw sensitive findings.

    Ms. Savage added that after 2021 the only change was the absence of Zuckerberg’s name in review chains, and Chair Blackburn concluded Meta insulated itself with third parties, suppressed research, and used children as a profit center while urging Meta to testify and vowing to pass the Kids Online Safety Act.

ADD TO THE NIMITZ NETWORK

Know someone else who would enjoy our updates? Feel free to forward them this email and have them subscribe here.
