Meta's internal research, revealed during testimony in a Los Angeles County Superior Court trial, found that parental supervision tools such as time limits and usage restrictions did not significantly reduce teens' compulsive use of social media, and that teens who had experienced adverse life events were even more likely to overuse the platforms. The findings raise major questions about the effectiveness of parental controls and of social media design in regulating youth behavior. The study, conducted in partnership with the University of Chicago and based on surveys of around 1,000 teens and their parents, concluded that neither parents' nor teens' reports of parental supervision were associated with teens' attentiveness to their own social media use. Plaintiffs' attorneys argued the results show social media companies should be held accountable for addictive product features rather than leaving the burden on parents. Meta's legal team, meanwhile, maintained that the research measured self-reported use rather than clinical addiction, and it emphasized broader life circumstances as key drivers of problematic behavior. Instagram head Adam Mosseri testified that he was only vaguely familiar with the internal "Project MYST" study, despite appearing to have green-lit the work. The findings come amid multiple lawsuits alleging harm to children from social platforms and could influence future regulatory and legal efforts.
Sources
https://techcrunch.com/2026/02/17/metas-own-research-found-parental-supervision-doesnt-really-help-curb-teens-compulsive-social-media-use/
https://www.techbuzz.ai/articles/meta-s-research-reveals-parental-controls-fail-teens
https://www.linkedin.com/posts/techcrunch_metas-own-research-found-parental-supervision-activity-7429628245450457088-rZRv
Key Takeaways
• Meta’s internal Project MYST research found that established parental controls and supervision had little to no impact on teens’ self-reported compulsive social media use.
• The study found that teens with higher numbers of adverse life experiences were more prone to compulsive use, challenging the notion that parental oversight alone can mitigate social media overuse.
• Meta’s legal defense framed the research as exploratory and focused on self-reported behavior, while plaintiffs argue the findings underscore social media companies’ responsibility for product designs that can foster addictive use.
In-Depth
A major internal study conducted by Meta and revealed in open court during a high-profile lawsuit has exposed uncomfortable realities about the effectiveness of parental controls on social media consumption among teens, raising fresh questions about how these platforms should be governed and what roles parents and tech companies play in protecting young users. The research, identified as “Project MYST,” was presented during testimony in Los Angeles County Superior Court where plaintiffs are accusing major social media companies, including Meta, of creating addictive products that have contributed to anxiety, depression, and other serious mental health issues among young users. According to internal research documents and testimony, the study involved surveying approximately 1,000 teens and their parents about social media habits, usage patterns, and the presence or absence of household supervision tools such as time limits or restricted access. The results were striking: neither parents’ reports of supervising their child’s social media use nor teens’ own reports of supervision correlated with improved self-regulation or lower levels of compulsive use. In other words, the mere presence of parental controls or supervision did not significantly change how attentively or responsibly teens monitored their own engagement with social platforms.
Plaintiffs’ attorneys seized on those findings to challenge the industry narrative that parental supervision tools are a sufficient safeguard for youth, arguing instead that social media companies have an obligation to design products that don’t exploit psychological vulnerabilities for profit. The research further suggested that teens who had experienced stressful life events, such as family dysfunction, harassment, or trauma, were more likely to struggle to regulate their social media behavior, a concern that resonates with broader debates about mental health and digital wellbeing. During testimony, Instagram’s head, Adam Mosseri, acknowledged the existence of Project MYST but said he had limited recall of its specifics, even as trial lawyers pointed to documentation indicating his approval of the research. This exchange underscored how internal findings about user behavior and product impact can differ from public messaging, fueling critics’ claims that social media platforms may downplay or overlook evidence that parental controls are ineffective.
Meta’s legal team, for its part, suggested that Project MYST was meant to explore teens’ own perceptions about their use rather than to define clinical addiction or product harm, framing the research as a step toward understanding user experience rather than a definitive statement on product safety. That defense plays into a broader strategy of shifting some onus back onto parents and individual circumstances, rather than acknowledging systemic product issues that could expose the company to more regulatory scrutiny or liability. Regardless of how the court ultimately rules, the release of this research into public proceedings is likely to influence ongoing conversations among policymakers, parents, and tech industry observers about how best to protect children online. It also highlights the limitations of existing tools and the need for a more comprehensive approach that considers product design, mental health support, and broader societal influences on teen behavior.

