In a groundbreaking decision that could redefine the accountability of social media companies, a New Mexico jury has ruled that Meta Platforms Inc. violated state consumer protection laws by misleading the public about the safety of its platforms for children.
The verdict, delivered after nearly seven weeks of testimony, marks one of the most significant legal challenges yet to the tech giant’s practices.
The Verdict
- Jurors determined that Meta’s platforms — including Facebook, Instagram, and WhatsApp — pose serious risks to children’s mental health and wellbeing.
- The company was found liable for engaging in “unconscionable trade practices” by exploiting the vulnerabilities of young users while publicly downplaying or concealing those risks.
- The ruling identified thousands of violations, resulting in a penalty of $375 million, a figure that underscores the scale of the alleged misconduct.
Key Allegations
- Concealed Risks: Evidence showed Meta executives were aware of dangers related to child sexual exploitation and mental health harms but failed to disclose them.
- Misleading Safety Claims: CEO Mark Zuckerberg and Instagram head Adam Mosseri were accused of making false assurances about platform safety.
- Algorithmic Harm: Meta’s recommendation systems were alleged to amplify harmful content, including material linked to eating disorders, bullying, and teen suicide.

Trial Highlights
The trial began on February 9, 2026, and featured testimony from investigators, whistleblowers, experts, and educators.
- Undercover investigators posed as children online, documenting instances of sexual solicitation and predatory behavior.
- Whistleblowers testified about internal research showing the negative effects of Instagram on teen girls’ self-esteem.
- Psychiatric experts described rising rates of anxiety, depression, and classroom disruptions tied directly to social media use.
- Educators recounted how platforms have eroded attention spans and contributed to behavioral issues among students.
Wider Context
This case is part of a broader wave of litigation against social media companies, with parallel lawsuits unfolding in California and more than 40 other states.
Attorneys general nationwide argue that Meta has fueled a youth mental health crisis by prioritizing engagement and profit over safety.
The ruling challenges long-standing protections under Section 230 of the Communications Decency Act, which has historically shielded tech companies from liability for user-generated content.
Legal experts suggest this verdict could open the door to stricter regulations and new standards for corporate responsibility in the digital age.
What’s Next
A second phase of the trial, scheduled for May 2026, will determine whether Meta’s conduct constitutes a public nuisance.
If found liable, Meta could be forced to implement sweeping safety reforms, fund mental health initiatives, or face further financial penalties.
The outcome may set precedent for how courts across the U.S. handle similar cases, potentially reshaping the legal landscape for Big Tech.
Significance
This verdict represents a pivotal moment in the ongoing debate over the role of social media in society.
For parents, educators, and policymakers, it signals a growing recognition of the harms digital platforms can inflict on children.
For Meta and other tech giants, it raises urgent questions about transparency and accountability.
It also foregrounds the tension between innovation and public safety. The New Mexico ruling may well become a touchstone in the broader movement to hold technology companies responsible for the social consequences of their products, a battle that is only just beginning.