March 3, 2026
Instagram’s Internal Metrics Reveal Deep Tracking of Youth Usage Amid Landmark Mental Health Lawsuit

Internal documents unsealed in February during a high-stakes trial in Los Angeles County Superior Court revealed that Instagram, a subsidiary of Meta Platforms, meticulously tracked the daily usage time of its users, with company executives celebrating "milestones" in app engagement year after year. The revelations came to light during the testimony of Meta CEO Mark Zuckerberg in K.G.M. v. Platforms et al. The documents indicate a steady rise in average daily usage, from 40 minutes per day in 2023 to a projected 46 minutes per day by 2026, intensifying scrutiny of the platform’s design and its alleged impact on youth mental health.

The focus on these granular time-spent metrics forms a central pillar of the plaintiff’s case in a trial that marks one of Zuckerberg’s rare appearances before a jury. The lawsuit seeks to determine whether social media companies bear liability for mental health harms experienced by young users, allegedly caused by the addictive design and inherent features of their platforms. While Snap and TikTok settled out of court before the trial began, executives from the remaining defendants, Meta and Google-owned YouTube, are now testifying as the proceedings advance.

The Plaintiff’s Allegations and Meta’s Defense

The 19-year-old plaintiff, identified by the initials K.G.M. or "Kaley," alleges that her extensive use of social media from a young age severely damaged her mental health. She claims she developed an addiction to the technology that led to depression and, ultimately, suicidal ideation. Her legal team aims to show that Meta, despite knowing of a significant minor presence on its platform, deliberately set internal objectives to increase engagement and time spent on Instagram, a pursuit they argue came at the expense of young users’ well-being.

Meta, however, firmly denies that its app is responsible for Kaley’s struggles. In a statement emailed to the press, Meta spokesperson Stephanie Otway articulated the company’s position: "The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles. The evidence will show she faced many significant, difficult challenges well before she ever used social media." This defense strategy aims to shift focus away from the platform’s design and toward other potential contributing factors in the plaintiff’s life.

Zuckerberg’s Testimony: Policy vs. Internal Reality

A critical juncture in the trial involved the direct questioning of Mark Zuckerberg about previous statements he made to Congress. In 2024, Zuckerberg asserted that children under 13 were not permitted on Instagram. That public declaration was directly challenged by internal company documents presented by the plaintiff’s legal team, which revealed that as far back as 2015, Instagram knew of approximately 4 million children under 13 actively using the app. The document noted that this figure represented 30% of all 10- to 12-year-olds in the United States at the time.

During his testimony, Zuckerberg pushed back against this line of questioning. He maintained that his statement to Congress accurately reflected the company’s stated policy at the time. He further clarified that Instagram took steps to remove underage users when they were identified on the platform. Zuckerberg also attempted to draw a distinction between the "milestones" that the company tracked regarding user engagement and specific "goals" that Instagram’s development teams were explicitly tasked to achieve, implying that tracking engagement did not equate to actively encouraging underage use or addiction.

However, other internal documents referenced by the plaintiff’s legal team during his testimony painted a different picture, suggesting a keen and growing interest within Instagram in the tween and teen demographic. Emails from a former product manager stated explicitly, "Our overall company goal is total teen time spent," and added, "Mark has decided that the top priority for the company in the first half of 2017 is teens." A market landscape analysis from December 2018 identified tweens as the "highest retention age group" in the U.S., underscoring the demographic’s value to the company’s long-term growth strategy. Together, these documents suggest a deliberate, sustained focus on attracting and retaining younger users, irrespective of stated age policies.

Adding further weight to the plaintiff’s case, the legal team presented an email from Nick Clegg, a former Zuckerberg adviser who left the company last year, in which Clegg acknowledged that Instagram’s age requirements were "basically unenforceable." This internal admission of the practical limits of age verification directly supports the claim that Meta knew about underage users and the difficulty of preventing their access, even while publicly stating a different policy.

Chronology of Age Verification and Platform Engagement

The timeline of Instagram’s actions regarding age verification is a critical component of the lawsuit’s narrative. Despite the internal knowledge of a substantial underage user base, the platform’s response was notably delayed. The plaintiff’s lawyers argued that Instagram did not implement significant measures to address its existing underage users until August 2021, when it began requiring all users to enter their birthdays. Meta countered this point, asserting that it had initiated the practice of asking for ages at sign-up for new users as early as 2019. This distinction is crucial, as it highlights the difference between preventing new underage sign-ups and proactively identifying and removing existing ones.

The revelations surrounding Instagram’s tracking of user engagement and its specific focus on young demographics trace back over several years:

  • 2015: Internal documents show Instagram was aware of approximately 4 million children under 13 using the app, representing 30% of 10- to 12-year-olds in the U.S.
  • 2017: An internal email explicitly states, "Our overall company goal is total teen time spent," and indicates that Mark Zuckerberg designated teens as the "top priority for the company in the first half of 2017."
  • 2018: A market landscape analysis identifies tweens as the "highest retention age group" in the U.S., signaling their importance for sustained user engagement.
  • 2019: Meta began asking new users for their age during the sign-up process.
  • August 2021: Instagram implemented a policy requiring all users, including existing ones, to provide their birthday information.
  • 2023: Internal metrics show average daily usage at 40 minutes per user.
  • 2024: Mark Zuckerberg testified before Congress, stating that children under 13 were not allowed on Instagram, a statement now contrasted by earlier internal documents.
  • February 2026: Zuckerberg’s testimony in the K.G.M. v. Platforms et al. trial reveals the internal documents.
  • 2026 (Projected): Internal metrics project average daily usage to increase to 46 minutes per user.

Broader Implications and the Attention Economy

The K.G.M. v. Platforms et al. lawsuit extends beyond the individual case of Kaley; it represents a broader societal reckoning with the influence of social media on mental well-being, particularly among adolescents. The legal proceedings underscore the ongoing debate about corporate responsibility in designing platforms that maximize engagement, often through sophisticated algorithms and psychological triggers, and the potential unintended consequences for vulnerable user groups.

The "attention economy," where platforms compete fiercely for users’ time and attention, forms the backdrop for these allegations. Critics argue that the business models of social media companies inherently incentivize features that can foster compulsive use, blurring the lines between healthy engagement and addiction. The internal documents revealing Instagram’s precise tracking of daily usage and its strategic focus on increasing "teen time spent" provide tangible evidence for these arguments, suggesting that maximizing engagement was a deliberate objective, potentially without fully mitigating the risks for young users.

This lawsuit is not isolated. It is part of a growing wave of legal challenges and regulatory scrutiny faced by tech giants concerning their practices with minors. State attorneys general across the U.S. have initiated investigations and filed lawsuits against Meta and other platforms, alleging that they knowingly designed addictive features harmful to children. Congress has also explored legislative solutions, such as the Kids Online Safety Act (KOSA), aimed at protecting minors online. The outcome of the K.G.M. v. Platforms et al. trial could set a significant precedent, potentially influencing future litigation and regulatory frameworks for social media companies.

Meta’s Evolving Stance and Future Goals

In recent years, likely in response to escalating public pressure and regulatory threats, Instagram has rolled out a series of teen protections and parental controls. These include features like daily time limits, parental supervision tools, and stricter privacy settings for minors. However, the internal documentation referenced in Zuckerberg’s testimony indicates that, despite these measures, Meta’s fundamental strategic interest in the young demographic persists. The company’s current ambition, as revealed in these documents, is for Instagram to become the largest teen destination by monthly active users in the U.S. and globally within the current year. This ongoing strategic focus on attracting and retaining young users highlights the complex tension between platform growth objectives and the imperative to ensure user safety and well-being.

The trial’s resolution will have far-reaching implications, not only for Meta but for the entire social media industry. It will test the legal boundaries of corporate liability for the psychological impacts of digital products and could redefine the ethical obligations of platforms towards their youngest users. As the jury deliberates, the world watches to see if the internal pursuit of engagement metrics will be deemed a substantial factor in the mental health struggles of a generation growing up online.

If you or someone you know is considering suicide or needs to talk, there are people who want to help. Call or text 988 to reach the National Suicide Prevention Lifeline.
