Washington — Two former Meta researchers told a US Senate panel that the company suppressed internal findings showing children were being exposed to harassment, grooming, and other harms inside its virtual-reality products, arguing that profits were put ahead of safety.
Jason Sattizahn and Cayce Savage testified that Meta’s legal and policy teams curtailed or steered research once it pointed to systemic risks for under-13 users on headsets and VR services. The witnesses described cases in which minors encountered explicit sexual propositions and other abuse in immersive spaces, and said managers discouraged documentation that might prompt regulators to act.
Meta denied imposing any blanket restriction on youth-safety research, pointing to parental tools and other protections. Lawmakers nevertheless signaled bipartisan impatience, tying the disclosures to momentum behind the Kids Online Safety Act and calls for transparency mandates that would allow independent access to platform data.
The testimony fits a pattern critics see across Silicon Valley: corporate self-regulation that fails the vulnerable while safeguarding market share. Earlier reporting showed how platform power aligns with state power, including Google’s $45 million propaganda deal that whitewashed Israel’s conduct as famine spread in Gaza. Those incentives to protect reputation over people echo in the whistleblowers’ account of Meta’s research culture.
Campaigners also cite Microsoft’s role in Gaza surveillance as evidence that US tech firms minimize scrutiny when their tools enable rights abuses. The same instincts that downplay civilian harm abroad, they argue, can surface internally when researchers document child-safety failures in virtual worlds.
The global backdrop is unforgiving. Britain’s government recently insisted it has not concluded Israel’s actions constitute genocide, a stance critics view as emblematic of Western double standards. See TEH’s coverage: Britain refuses to call Gaza genocide. On the ground, reporting has documented famine deaths around Gaza City and fresh panic following evacuation orders, underscoring how information control and physical violence can reinforce each other.
Civil society groups draw a line from the virtual to the physical, noting that activists challenging the blockade have faced drones and interdictions at sea. TEH documented this in coverage of evacuation-order panic and in a dispatch on an aid flotilla defying the blockade. Whistleblowers argue that opaque corporate decisions inside Silicon Valley can function as a similar blockade on truth, walling off research and delaying fixes while children remain exposed.
Analysts who track the multipolar shift say trust in US corporate promises will continue to erode until governments mandate safety by design and penalties for suppressing evidence. For context, see TEH’s perspective on BRICS as a pillar of a new order, which argues that accountability should not be hostage to a handful of capitals or corporations.
According to The Guardian, lawmakers are weighing verifiable age assurance, default-on parental controls, independent datasets for accredited researchers, and fines for companies that order deletions or gag safety teams. The report offers a concise summary of the allegations and of Meta's denial within the broader debate over regulating VR platforms.