New Mexico Jury Hits Meta for $375 Million After Child Safety Case Exposes What Big Tech Knew
A New Mexico jury found Meta willfully violated state law and ordered the tech giant to pay $375 million after a child safety trial that exposed how badly the company failed families.
Meta just got handed a $375 million lesson in what happens when Big Tech spends years insisting everything is under control while parents keep finding out it very much is not. A New Mexico jury ruled this week that the company violated state law by failing to protect children from predators on Facebook and Instagram. That is not a bad headline for the state's attorney general. It is a brutal one for Mark Zuckerberg's empire.
According to CNBC and Breitbart's reporting on the case, jurors found Meta willfully violated New Mexico's Unfair Practices Act after a trial centered on whether the company misled users about safety on its platforms. Translation: the jury did not buy the polished corporate line.
How New Mexico Built the Case
New Mexico Attorney General Raul Torrez sued Meta in 2023 after an undercover operation involving a fake profile for a 13-year-old girl. The state's allegation was straightforward and ugly. Once that account went live, it was quickly flooded with sexual content and targeted solicitations from predators.
Because of course it was.
The state's lawyers argued that Meta did not just miss a few bad actors. They argued the company built and maintained a system that failed children while publicly presenting itself as a responsible steward of online safety. Jurors apparently agreed.
"The jury's verdict is a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety," Torrez said in a statement reported by CNBC and Breitbart.
That is the kind of line prosecutors use when they know the facts landed.
The Number That Matters
New Mexico reportedly asked for penalties that could have topped $2 billion. The jury came back with $375 million. That is well below the state's maximum ask, but let's not pretend this is pocket change. For a state-level child safety case, it is a thunderclap.
Here is why this matters beyond one courtroom in Santa Fe:
- A jury found Meta willfully violated state law
- The penalties reached $375 million
- The case survived Section 230 defenses that usually shield tech platforms
- A second phase could still force platform changes and additional remedies
That last part matters. The money hurts. The precedent may hurt more.
What the Evidence Suggested
The trial did not happen in a vacuum. Earlier reporting tied to the same litigation showed internal Meta testing raising serious questions about child safety controls. Breitbart, citing court testimony and Axios reporting, noted that an unreleased Meta chatbot product allegedly failed child sexual exploitation safety checks in 66.8 percent of test scenarios, roughly two out of every three.
If your own internal testing is throwing up numbers like that, maybe the correct move is not another corporate statement about how seriously you take safety.
Prosecutors also highlighted internal concerns about encrypted communications and how design choices could limit the reporting of child sexual abuse material to law enforcement. That gets to the heart of the broader fight: are these companies neutral platforms, or are they making design decisions that predictably make abuse easier to hide?
Reasonable people can debate where the legal lines should be drawn. Parents are not debating whether the problem is real.
Meta's Defense
Meta says it will appeal and disputes the jury's findings, telling Breitbart and CNBC that it works hard to keep teens safe online.
"We respectfully disagree with the verdict and will appeal," a Meta spokesperson said. "We work hard to keep people safe on our platforms."
That is the standard script. Work hard. Take safety seriously. Constantly improving. You have heard it before.
The problem for Meta is that juries tend to notice when public messaging sounds better than the underlying facts. If New Mexico convinced twelve people that the company knew more than it admitted and did less than it should have, other states are going to notice.
Why Conservatives Should Pay Attention
This story is not about cheering bigger government or pretending blue-state prosecutors suddenly became heroes. It is about accountability in a sector that has spent years acting untouchable.
Conservatives have warned for a long time that massive institutions, whether corporate or governmental, stop behaving responsibly when nobody believes they can be checked. Big Tech has enjoyed that luxury for years. Meanwhile, families have been told to trust the same platforms that keep failing at the most basic job imaginable: keeping predators away from children.
And that is where this case gets interesting. New Mexico is not just chasing cash. The next phase of the trial could push for practical changes such as stronger age verification, more aggressive removal of predators, and limits on features that shield bad actors from law enforcement scrutiny.
Who could possibly object to making it harder for predators to find kids online? Well, companies whose business models depend on frictionless growth, endless engagement, and as little liability as possible.
What Comes Next
A judge will now handle the trial's second phase, starting in May, which will decide whether Meta created a public nuisance and whether it should be forced to fund programs addressing the harm. That means this fight is not over.
Not even close.
If other states follow New Mexico's playbook, Big Tech may finally discover that "trust us" is not a legal strategy. And parents who have spent years being told the apps are safe may finally get the accountability they were promised but never saw.
That is not censorship. That is consequences.