Meta ordered to pay $375 million after New Mexico jury finds its platforms endangered minors
A New Mexico jury has returned one of the largest verdicts ever against a social media company, ordering Meta to pay $375 million after finding the company liable for misleading users about safety on Facebook and Instagram and for enabling the sexual exploitation of minors on its platforms. The verdict, reached in late March 2026, sends a clear signal that courts are willing to hold tech giants financially accountable for what happens on their platforms, not just for what they say about it.
The lawsuit, brought by the state of New Mexico, was built on evidence that Meta knew its platforms were being used to contact and exploit children. Internal documents presented at trial showed that company employees had raised concerns about predatory behavior years before any significant safety measures were put in place. The jury found that Meta not only failed to act on those warnings but actively misled the public about how safe its products were for young users.
What the trial actually showed
At the center of the case was a familiar tension: growth targets versus user safety. Meta's own research, which the company had largely kept internal, showed that Instagram in particular was a frequent contact point between adult predators and minors. The recommendation algorithm, designed to maximize engagement, sometimes pushed minors toward accounts that had been flagged for suspicious activity. That was not a risk the state's attorneys had to theorize about; it was documented in the company's own data.
New Mexico's attorney general had filed the suit after a state investigation found that undercover accounts posing as minors were quickly connected with adults sending explicit messages. Facebook's own systems, rather than blocking those interactions, often accelerated them through friend and follow suggestions. The trial laid out a pattern: a platform aware of its risks, choosing not to fix them because doing so would reduce the engagement numbers that advertisers pay for.
Why $375 million matters beyond the number
For a company that generated over $160 billion in revenue in 2024, $375 million, roughly a quarter of one percent of a single year's revenue, is not catastrophic on its own. What matters more is the precedent. Dozens of similar lawsuits are pending in other states, many of them using the same legal strategy: arguing that Meta violated consumer protection laws by making public safety claims that its own internal data contradicted. If those cases go to trial and juries respond the way New Mexico's did, the cumulative liability could reach into the billions.
Legal analysts have noted that this verdict is particularly significant because it was reached by a jury, not a judge. Jury decisions in cases like this are harder to reverse on appeal and often carry more weight in settlement negotiations for related suits. Meta has said it will appeal, but the company is now negotiating from a weaker position than it was even a year ago.
Meta's response and what it actually changed
Meta issued a statement saying it has invested heavily in child safety tools, including default private accounts for users under 16, restrictions on who can message minors, and a program to remove sextortion content more quickly. The company pointed to a 2023 report showing it removed over 27 million pieces of child exploitation content from its platforms. Those numbers sound large. But critics, including several child safety organizations that testified during the trial, argued that the volume of removals itself reflects how widespread the problem had become, not how effectively it was being controlled.
Instagram's Teen Accounts feature, rolled out in late 2024, places automatic content restrictions on teens' accounts and limits messaging to approved contacts, with users under 16 unable to loosen those settings without a parent's permission. Meta presented it as a meaningful step. Attorneys for New Mexico countered that the feature arrived years after the company first had internal evidence of widespread predatory contact, and that it still does not address the recommendation algorithm's role in connecting minors with bad actors.
The broader legal push against social media platforms
New Mexico is not alone. More than 40 state attorneys general have filed or joined lawsuits against Meta over child safety. Lawmakers in Congress have also pushed legislation that would set stricter age verification requirements for social media platforms and impose data use restrictions for minors. The Kids Online Safety Act passed the Senate but remained stalled in the House as of early 2026.
TikTok, Snapchat, and YouTube have faced similar suits, though Meta has been the most frequent target due in part to the scale of its platforms and the volume of internal documents that became public during various legal proceedings. The New Mexico verdict adds pressure on other platforms to settle pending cases rather than risk jury trials of their own.
What comes next
Meta's appeal will likely focus on whether the state's consumer protection claims are preempted by federal law, including the Section 230 immunity that shields platforms from liability for user-posted content, an argument that has had mixed results in other courts. The appeal process could take two or more years. In the meantime, the company faces ongoing discovery obligations in the other pending state cases, and any documents that emerge from those proceedings could further complicate its legal position.
For parents and advocates, the verdict is a concrete outcome after years of pushing for accountability that largely went nowhere in Congress. For state governments, it confirms that the legal path, while slower, is producing results. The next significant hearing is expected in a consolidated federal case later this year, where multiple states are pursuing coordinated claims using much of the same evidence that was effective in New Mexico.