Unsealed court documents reveal that Meta leadership, including CEO Mark Zuckerberg, prioritized teen user engagement despite internal research confirming the platform’s negative impact on youth well-being. These findings, stemming from internal reports and executive communications, highlight a systemic focus on retention over the safety of minors.
Internal Reports Confirm Negative Impact
A 2019 internal study conducted by Meta involved one-on-one interviews with users whose platform usage was flagged as “problematic,” a category covering approximately 12.5% of the user base. The report explicitly stated that the best external research indicates Facebook’s impact on people’s well-being is negative.
Executive Focus on Teen Retention
Multiple documents reference statements from CEO Mark Zuckerberg and Instagram head Adam Mosseri regarding the prioritization of teen engagement. In one instance, Zuckerberg commented that for Facebook Live to succeed with teenagers, the company would need to be “very good at not notifying parents / teachers.”
Internal communications also show employees discussing retention goals in highly specific terms. One email sent to Meta CPO Chris Cox noted that the product should be optimized for students “sneaking a look at your phone in the middle of Chemistry.”
In January 2021, Meta VP of Product Max Eulenstein acknowledged the disconnect between user intent and product goals, writing that while no one wakes up wanting to maximize their Instagram opens, “that’s exactly what our product teams are trying to do.”
Meta’s Defense and Modern Safety Features
A Meta spokesperson stated that many of the unsealed documents are nearly a decade old and emphasized that the company now incorporates feedback from parents, experts, and law enforcement. The company maintains that it no longer “goals” on teen time spent.
Implementation of Instagram Teen Accounts
Meta pointed to the 2024 launch of Instagram Teen Accounts as evidence of its commitment to safety. These accounts include built-in protections such as default private settings and restrictions on who can tag or mention minor users. The app also sends time-limit reminders after 60 minutes of use; for users under 16, changing this setting requires parental permission.
Whistleblower Insights and Metaverse Concerns
Kelly Stonelake, a former Director of Product Marketing at Meta (2009–2024), described the unsealed evidence as consistent with her firsthand experience. Stonelake, who is currently suing the company for alleged gender-based discrimination, led go-to-market strategies for the VR social app Horizon Worlds.
During the rollout to teenagers, Stonelake alleges she raised concerns about the lack of effective content moderation tools in the metaverse. According to her testimony, company leadership did not take these objections seriously.
The Legislative Battle Over Online Safety
U.S. government scrutiny of Meta intensified following the 2021 leaks by whistleblower Frances Haugen, which demonstrated that the company was aware of Instagram’s harmful effects on teenage girls. While Congress has proposed various bills to address these issues, they remain a point of contention.
Censorship and Preemption Risks
Some privacy activists argue that legislative efforts like the Kids Online Safety Act (KOSA) could lead to increased surveillance and censorship. Evan Greer, director of Fight for the Future, warned that age verification laws could result in massive online censorship of content under the guise of safety.
Stonelake, who once lobbied for KOSA, has also become a critic of its current version. She specifically highlighted the bill’s “preemption clauses,” which would override state regulations and potentially block lawsuits from school districts, bereaved families, and states—including the current case brought by New Mexico against Meta.
Stonelake emphasized that any solution must be complex and nuanced, rather than relying on legislative language that simplifies the issue to “rile up” political factions.
