This revelation forms a critical component of the ongoing lawsuit, K.G.M. v. Platforms et al., in which a jury must determine the extent to which social media companies bear responsibility for youth mental health issues allegedly stemming from their platforms’ design and addictive qualities. The focus on "time-spent" metrics underscores the core contention of the plaintiffs: that platforms like Instagram intentionally engineered their services to maximize user engagement, potentially at the expense of vulnerable young users’ well-being. Mark Zuckerberg’s appearance marked a rare instance of his testifying before a jury, highlighting the gravity of the proceedings and the significant legal challenges facing Meta.
The Genesis of the Lawsuit: K.G.M. v. Platforms et al.
The lawsuit, unfolding in L.A. County’s Superior Court, pits individual plaintiffs against major social media entities. While Snap and TikTok opted to settle out of court prior to the commencement of the trial, Meta (the parent company of Instagram) and YouTube remain key defendants, with their executives providing testimony. The plaintiff in this particular case, identified by the initials K.G.M. or "Kaley," is a 19-year-old who alleges that her early and prolonged exposure to social media profoundly damaged her mental health. Her claims include developing an addiction to the technology, experiencing severe depression, and grappling with suicidal ideation, all directly attributed to her use of these platforms.
Kaley’s legal team seeks to establish that Meta not only set internal objectives to escalate the time users spent on Instagram but did so with full awareness that minors, including those under the platform’s stated age minimum, were actively using the app. This strategy aims to demonstrate a direct link between corporate growth strategies and adverse effects on young users. The legal battle represents a broader societal reckoning with the influence of digital platforms on adolescent development and mental health, pushing for a re-evaluation of corporate responsibility in the digital age.
Internal Metrics and the Pursuit of Engagement
The internal documents presented during Zuckerberg’s testimony shed light on Instagram’s meticulous tracking of user engagement. The rise from an average of 40 minutes per day in 2023 to 46 minutes per day in 2026, whether projected or already realized, reflects a consistent upward trend in daily usage. For social media companies, "time spent" is a paramount metric, directly correlating with advertising revenue and overall platform vitality. Higher engagement translates to more opportunities to display advertisements, gather data, and retain market share. The term "milestones" used by company executives to describe these usage figures suggests a celebration of increased user stickiness, which plaintiffs argue reflects an intentional design strategy to maximize engagement, even among younger users.
This emphasis on engagement aligns with the widely recognized business models of free-to-use social media platforms, which monetize user attention. Algorithms are constantly refined to deliver content that keeps users scrolling, clicking, and interacting for longer periods. The legal challenge here is to determine whether the pursuit of these engagement metrics crossed a line into creating intentionally addictive products that are harmful, particularly to developing minds. The plaintiffs contend that the platforms’ very architecture, from infinite scroll to notification systems and personalized content feeds, is engineered for maximum retention, creating a feedback loop that can be difficult for young users to disengage from.
Underage Users and Age Verification Challenges
A particularly contentious point in the trial revolves around Instagram’s knowledge of underage users on its platform. During his 2024 testimony before Congress, Zuckerberg stated that children under 13 were not permitted on Instagram, aligning with the company’s official policy. However, internal documents referenced by the plaintiff’s lawyers painted a different picture, indicating that as early as 2015, Instagram was aware of approximately 4 million children under 13 actively using the app. This figure was further contextualized by the staggering statistic that it represented 30% of all 10- to 12-year-olds in the U.S. at the time.
Zuckerberg countered this line of questioning by asserting that his congressional testimony accurately reflected the company’s policy and that Instagram actively removed underage users it identified. He also attempted to draw a distinction between "milestones" — which he described as tracked usage figures — and "goals," which he defined as specific targets set for the Instagram team. This nuanced defense aims to decouple the observation of user behavior from an explicit corporate mandate to attract and retain underage users.
However, further internal communications presented by the plaintiff’s legal team challenged this distinction. Emails from a former product manager explicitly stated, "Our overall company goal is total teen time spent," and notably, "Mark has decided that the top priority for the company in the first half of 2017 is teens." Another market analysis from December 2018 identified tweens as the "highest retention age group" in the U.S., suggesting a clear corporate interest in this demographic. These documents appear to contradict the notion that the presence and engagement of young users were merely observed "milestones" rather than strategic "goals."
The challenge of age verification on social media platforms is a long-standing issue. An email from Nick Clegg, a former Zuckerberg adviser and Meta policy chief, underscored this difficulty, noting that Instagram’s age requirements were "unenforceable." Despite this acknowledged vulnerability, Instagram did not implement a mandatory birthday entry requirement for existing users until August 2021. While Meta clarified that it began asking for ages at sign-up for new users in 2019, the delay in addressing the existing underage user base is a significant point of contention for the plaintiffs. This timeline highlights the slow pace of implementing protective measures even when internal awareness of the problem was evident.
Meta’s Defense and Broader Context
Meta, through its spokesperson Stephanie Otway, staunchly disputes the direct causality between its app and Kaley’s mental health struggles. Otway stated, "The question for the jury in Los Angeles is whether Instagram was a substantial factor in the plaintiff’s mental health struggles. The evidence will show she faced many significant, difficult challenges well before she ever used social media." This defense strategy seeks to introduce alternative explanations for Kaley’s difficulties, suggesting that pre-existing conditions or other life circumstances were primary contributors to her mental health issues, thereby mitigating Instagram’s responsibility.
This approach is common in product liability cases, where defendants often argue that multiple factors contribute to an outcome, making it difficult to isolate the impact of a single product. However, the plaintiffs are attempting to demonstrate that even if other factors were present, Instagram’s design and deliberate targeting of young users constituted a "substantial factor" in exacerbating or causing harm.
The lawsuit against Meta is part of a growing wave of legal and regulatory actions globally aimed at holding social media companies accountable for the perceived harms of their platforms. Governments, advocacy groups, and parents are increasingly scrutinizing the impact of social media on youth mental health, citing concerns over cyberbullying, body image issues, sleep deprivation, and the potential for addiction. Reports from organizations like the U.S. Surgeon General have highlighted the urgent need for action to protect youth online, pointing to alarming trends in adolescent depression and anxiety that coincide with the rise of pervasive social media use.
The Strategic Pursuit of the Youth Demographic
Despite the ongoing legal battles and increased scrutiny, Instagram’s strategic focus on young demographics remains evident. Internal documentation referenced during the testimony indicated Meta’s current ambition for Instagram to become the largest "teen destination" globally and in the U.S. by monthly active users this year. This aggressive pursuit of the youth market, even amidst accusations of harm, underscores the perceived long-term value of cultivating a young user base for future growth and market dominance.
The rationale behind targeting younger users is multifaceted. Early adoption often leads to greater loyalty and a longer user lifecycle. Furthermore, trends originating within younger demographics frequently propagate to older users, making them key influencers in digital culture. However, this strategy is now under intense ethical and legal examination, particularly concerning the developmental vulnerability of adolescents. Their brains are still maturing, making them potentially more susceptible to the psychological mechanisms designed to maximize engagement, such as intermittent rewards, social validation through likes and comments, and the fear of missing out (FOMO).
In response to public pressure and regulatory threats, Instagram has, in recent years, rolled out various "teen protections" and "parental controls." These include features like daily time limits, notifications to take breaks, and tools for parents to monitor their children’s activity. While these measures are presented as efforts to promote digital well-being, critics argue they are often reactive and insufficient, coming years after the alleged harms have already manifested. The lawsuit aims to prove that these protective measures were implemented belatedly, only after the company had already benefited from the extensive engagement of underage users.
Implications for Corporate Accountability and Digital Regulation
The outcome of K.G.M. v. Platforms et al. holds significant implications not only for Meta but for the entire social media industry. A verdict in favor of the plaintiff could set a powerful legal precedent, potentially opening the floodgates for similar lawsuits and increasing the pressure on tech companies to fundamentally redesign their products with user well-being, rather than just engagement, as a primary consideration. It could force a re-evaluation of how algorithms are designed, how age verification is enforced, and how transparent companies are about their internal metrics and their knowledge of user demographics.
Beyond the courtroom, the trial is likely to intensify calls for stronger legislative action and regulatory oversight. Policymakers worldwide are grappling with how to effectively govern the digital sphere, balance free speech with user safety, and protect vulnerable populations, especially children and adolescents. Discussions around age-appropriate design codes, limitations on algorithmic amplification for minors, and mandatory digital literacy programs are gaining traction.
Ultimately, this lawsuit represents a crucial moment in the ongoing debate about corporate accountability in the digital age. It challenges the long-held notion that tech companies are merely neutral platforms, pushing for a recognition of their active role in shaping user experiences and, by extension, societal well-being. The jury’s decision will not only impact Kaley and Meta but could also help redefine the ethical responsibilities of technology giants and the future landscape of digital interaction for generations to come.
If you or someone you know is considering suicide or needs to talk, there are people who want to help. Call or text 988 to reach the National Suicide Prevention Lifeline.
This article was corrected after publication to note that it is not Zuckerberg’s first appearance before a jury, as he previously appeared in a trial focused on Meta’s VR technology.
