A group of four current and former Meta employees—represented by Whistleblower Aid—has come forward alleging that Meta’s legal team has been actively curbing internal research into youth safety on its Horizon Worlds VR platform in the wake of earlier disclosures by Frances Haugen. The Washington Post reports that Meta lawyers screened, and in some cases blocked, studies exposing risks such as grooming and sexual propositions toward users under 13, including a documented incident involving a child under 10 during a 2023 research trip to Germany. Meta denies these claims, pointing to nearly 180 approved Reality Labs studies since 2022 and emphasizing product enhancements like parental supervision tools. The Senate Judiciary Committee is set to investigate during an upcoming hearing on “Hidden Harms.”
(Sources: Washington Post, The Verge, New York Post)
Key Takeaways
– Alleged Suppression: Whistleblowers claim Meta’s legal team hindered or deleted internal research revealing serious safety risks to children using Horizon Worlds.
– Incident Included: A particularly troubling 2023 incident in Germany allegedly involved a child under 10 being propositioned in VR, with researchers reportedly instructed to delete the evidence.
– Meta Disputes Claims: Meta maintains it has conducted nearly 180 youth safety studies since 2022 and highlights implemented parental safety tools, while the Senate prepares to probe the matter.
In-Depth
Meta—formerly Facebook—is now under growing scrutiny over whether it suppressed internal research that highlighted the risks its Horizon Worlds VR platform poses to minors.
Whistleblowers, backed by Whistleblower Aid, allege that Meta’s legal team requested the screening—and in some cases outright deletion—of findings that would expose dangerous content, including a 2023 case in which a child under ten was propositioned during a VR research session in Germany. They say researchers were explicitly told to delete recordings and notes.

Meta, however, has pushed back. Its representatives cite nearly 180 approved Reality Labs studies since 2022 focused on youth safety, along with product updates like parental supervision tools and default voice-filter settings in Horizon Worlds. Still, the whistleblowers argue these measures are reactive—adopted under public and regulatory scrutiny—and that systemic legal oversight prevented early, meaningful research.
The dispute sets up a political and legal showdown: the Senate Judiciary Committee is planning a hearing titled “Hidden Harms” to confront these allegations and demand clarity. Meta defends itself as a corporate leader striving to safeguard users and advance emerging tech responsibly. But critics counter that if enforcement mechanisms and research are being stifled, no amount of after-the-fact tools will suffice—or prevent future harm.
This is not just about VR technology or internal memos; it’s about corporate duty, youth protection, and how transparent Silicon Valley giants truly are. At this pivotal moment, lawmakers must ensure the balance between innovation and safeguarding the most vulnerable isn’t tilted toward unchecked expansion at the expense of ethics.

