A Santa Fe jury on Tuesday delivered a historic verdict, ordering Meta Platforms, Inc. to pay $375 million in civil penalties. The jury found the tech giant liable for misleading consumers about the safety of its social media platforms, Facebook and Instagram, and for endangering children. The decision is a pivotal moment: the first jury verdict of its kind to hold Meta accountable for harm to young people.
New Mexico Attorney General Raúl Torrez’s office swiftly hailed the outcome as a "watershed moment for every parent concerned about what could happen to their kids when they go online," according to a press release issued immediately after the verdict. The six-week trial concluded with the jury finding Meta liable on both claims brought by the state under its Unfair Practices Act. At $5,000 per violation, the maximum allowed under state law, the penalty is modest for a company public markets value at $1.5 trillion, but its unprecedented nature gives it immense symbolic weight.
“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew,” Torrez stated, emphasizing the deliberate nature of the company’s actions. “Today the jury joined families, educators, and child safety experts in saying enough is enough.” The verdict underscores a growing societal demand for accountability from social media companies regarding the well-being of their youngest users.
The Genesis of the Lawsuit: An Undercover Operation
The New Mexico case against Meta originated from a meticulously planned undercover investigation launched by the Attorney General’s office in 2023. State investigators created decoy accounts on Facebook and Instagram, carefully crafting profiles to appear as users younger than 14 years old. The operation quickly revealed alarming vulnerabilities: these underage-presenting accounts were targeted with sexually explicit material and solicited for sex by several individuals.
The investigation culminated in May 2024 with the arrest of several New Mexico men. Two of these individuals were apprehended at a motel, where they believed they were meeting a 12-year-old girl based on conversations initiated and sustained through the social media platforms. This shocking evidence formed the bedrock of the state’s case, illustrating a direct link between the platforms’ functionalities and the facilitation of child exploitation.
Beyond the specific incidents, the trial presented a broader indictment of Meta’s corporate practices. Evidence introduced by the state included internal Meta documents and compelling testimony from former employees, which collectively demonstrated a troubling pattern. Company staff and external child safety experts had repeatedly raised alarms about inherent dangers on the platforms, particularly concerning minors. These warnings, the prosecution argued, were largely ignored or downplayed by Meta’s leadership, prioritizing engagement and growth over user safety.
Damning Internal Testimonies and Corporate Knowledge
Some of the most impactful evidence presented during the trial came from individuals who had worked within Meta and witnessed its internal operations firsthand. Their testimonies painted a picture of a company aware of potential harms but allegedly slow to act.
Arturo Bejar, a former engineering and product leader who spent six years at Meta starting in 2009, provided particularly damaging testimony. He had previously testified before the U.S. Senate, warning lawmakers about the dangers embedded in social media platforms. In the Santa Fe courtroom, Bejar recounted his personal and professional efforts to alert Meta executives after his own 14-year-old daughter received unwanted sexual advances on Instagram. He explained that the personalized algorithms that make Meta’s platforms so effective at targeting advertisements can be exploited just as effectively by predators. “The product is very good at connecting people with interests,” Bejar asserted, “and if your interest is little girls, it will be really good at connecting you with little girls.” The statement captured a core design problem: recommendation algorithms are indifferent to intent, amplifying malicious connections as readily as benign ones.
Another former Meta executive, Brian Boland, who served as a vice president of partnerships product marketing for nearly a dozen years, reinforced this narrative. Boland testified that upon his departure from the company in 2020, he “absolutely did not believe that safety was a priority” for CEO Mark Zuckerberg and then-COO Sheryl Sandberg. This insider perspective suggested that safety concerns were not given the necessary prominence at the highest levels of the organization, despite internal and external warnings.
Mark Zuckerberg’s Deposition: A Glimpse into Leadership’s Stance
A significant moment in the trial involved the presentation of a recorded deposition from Meta CEO Mark Zuckerberg. Taken a year prior to the trial, the recording was shown to jurors earlier this month, offering a direct insight into the company’s leadership perspective on platform safety and addiction.
During the deposition, Zuckerberg characterized research on whether Meta’s platforms are addictive as “inconclusive.” The state’s legal team pushed back strongly, countering that Meta’s own internal researchers had found that several product features were specifically designed to elicit dopamine responses and increase the time users spent on the applications, hallmarks of addictive design.
When pressed on whether he, as a parent, had a right to know if a product his own child was using was addictive, Zuckerberg responded that there was “a lot to unpack in that.” He then elaborated on his personal approach, stating that he and his wife meticulously investigate whether products are “good to use” before allowing their children to engage with them, and that they “also oversee how they’re used.” He noted, tellingly, that his children are “younger,” implicitly suggesting that concerns might evolve as they age. This testimony, perceived by many as defensive, highlighted a potential disconnect between the company’s public safety assurances and its internal operational priorities.
Meta’s Response and the Broader Legal Landscape
Unsurprisingly, Meta has indicated it will appeal the Santa Fe verdict. A company spokesperson, in a statement to media outlets, disagreed with the outcome, asserting that Meta “works hard to keep people safe” on its platforms. The response tracks the company’s standard posture in legal challenges: a robust defense of its practices followed by appeals of adverse rulings.
However, the New Mexico case is far from an isolated incident; it is one front in a widening legal battle for Meta. The company, alongside YouTube, is currently embroiled in another high-profile trial in Los Angeles. That separate lawsuit centers on claims that their platforms are inherently addictive and have caused significant harm to young users. The plaintiff, identified only as K.G.M., is a 20-year-old California woman who alleges she developed a social media addiction as a child, leading to anxiety, depression, and severe body-image issues. Notably, TikTok and Snap, initially co-defendants in the case, settled before the trial began.
The Los Angeles jury is currently deliberating, with a verdict anticipated soon. On Monday, the presiding judge instructed the jurors to continue their deliberations after they indicated difficulties reaching a consensus on one of the defendants, raising the specter of a partial retrial if a unanimous decision cannot be achieved.
Adding to Meta’s legal woes, a second phase of the New Mexico case is already scheduled: a bench trial, heard by a judge without a jury, set to begin on May 4. This phase will address public nuisance claims against Meta, potentially leading to further penalties. More significantly, it could result in court-mandated changes to Meta’s platforms, including stringent age verification requirements and new protections designed specifically for minors. Rather than arguing that Meta violated a specific consumer protection law, the state will contend that the company’s platforms have broadly harmed the health and safety of New Mexico residents, a legal theory that could have far-reaching implications for how social media companies operate.
Legal Precedent and Broader Industry Implications
The New Mexico verdict sets a significant legal precedent. As the first jury decision finding Meta liable for harms to young people, it opens the door for similar lawsuits across other states and jurisdictions. State attorneys general, often at the forefront of consumer protection, will likely view this success as a blueprint for challenging tech companies on similar grounds. The application of the Unfair Practices Act to platform design and its impact on minors could become a powerful tool in future litigation, shifting the focus from individual user responsibility to corporate accountability.
For Meta and other social media giants, this verdict amplifies the pressure to fundamentally reassess their product designs and safety protocols. The financial penalty, while not crippling for a company of Meta’s size, is less significant than the reputational damage and the legal precedent established. Investors, already wary of increasing regulatory scrutiny and public backlash, may demand more robust commitments to user safety, particularly for vulnerable populations like children and adolescents. The long-term impact could include a re-evaluation of business models that prioritize engagement metrics over user well-being.
Social Media and Youth Mental Health: A Growing Concern
The legal battles against Meta unfold against a backdrop of escalating public and scientific concern over the effects of social media on youth mental health. Numerous studies and reports have highlighted a correlation between heavy social media use and increased rates of anxiety, depression, body image issues, sleep deprivation, and cyberbullying among adolescents. For instance, data from the Centers for Disease Control and Prevention (CDC) and various academic institutions consistently point to a mental health crisis among youth, with social media often cited as a contributing factor. While the exact causal links are still debated and researched, the evidence suggests a significant and growing problem that parents, educators, and policymakers are struggling to address.
The role of algorithms, as highlighted by Arturo Bejar’s testimony, is central to this discussion. These sophisticated systems are designed to maximize user engagement by serving personalized content, often without adequately distinguishing between beneficial and harmful material. For young users, whose brains are still developing and who are particularly susceptible to peer pressure and external validation, this algorithmic amplification can be especially detrimental. The New Mexico case’s focus on misleading consumers about safety directly addresses this design philosophy, arguing that companies have a responsibility to design products that do not inherently put vulnerable users at risk.
Regulatory Scrutiny and Calls for Action
The New Mexico verdict is also a reflection of a broader, intensified regulatory environment. Governments worldwide, and particularly in the United States, are increasingly scrutinizing the tech industry. Legislative efforts like the proposed Kids Online Safety Act (KOSA) in the U.S. Senate aim to mandate safeguards for minors online, including age-appropriate design and restrictions on harmful content. State-level legislation, often more agile than federal efforts, is also emerging to address issues ranging from data privacy to platform addiction.
Child safety advocates and non-profit organizations have been vocal proponents of stronger regulations and greater corporate responsibility. They argue that self-regulation by tech companies has proven insufficient, necessitating external mandates to protect children. The ethical responsibilities of tech companies are being redefined, moving beyond mere content moderation to encompass fundamental product design and algorithmic accountability. This verdict serves as a powerful validation of their long-standing advocacy.
In conclusion, the $375 million verdict against Meta in New Mexico is more than a financial penalty; it is a landmark legal victory that redefines corporate accountability in the digital age. By holding Meta liable for misleading consumers and endangering children, the jury has sent an unequivocal message to the social media industry: the well-being of young users can no longer be sidelined in the pursuit of engagement and profit. As Meta prepares its appeal and faces ongoing litigation, the verdict promises to catalyze further lawsuits, regulatory action, and a significant re-evaluation of how platforms are designed and governed to protect their most vulnerable users.
