New Mexico jury orders Meta to pay $375 million for failing to protect children from sexual predators

March 25, 2026

A New Mexico jury found Meta violated state law by failing to protect children from sexual predators on its platforms and misleading users about their safety, ordering the tech giant to pay $375 million in civil penalties. The verdict landed Tuesday after six weeks of testimony in a case brought by New Mexico Attorney General Raúl Torrez.

Meta said it "respectfully disagreed with the verdict and will appeal." The company has 40,000 people working on safety, according to its own attorney. The jury was not impressed.

Growth Over Children

State attorney Linda Singer framed the case in terms that left little room for ambiguity. Addressing the jury Monday, she argued that the platform's failures weren't bugs in the system. They were features of the business model.

The safety issues that you've heard about in this case weren't mistakes. They were a product of a corporate philosophy that chose growth and engagement over children's safety. And young people in this state and around the country have borne the cost.

Singer also told jurors that Meta "has failed over and over again to act honestly and transparently, failed to act to protect young people in this state."

The case centered on claims that Meta violated New Mexico's law barring unfair trade practices, the New York Post reported. Prosecutors argued the company failed to enforce its own claimed minimum age limit of 13, allowed its algorithm to connect predators with potential victims, and turned a blind eye to rampant exploitation on its platforms. New Mexico's investigation included a sting operation in which officials set up test accounts to probe the company's safety standards. Local police made at least three arrests in connection with the investigation.

The Algorithm Knows

Perhaps the most damning testimony came from someone who once worked inside Meta's own walls. Arturo Béjar, a former Meta safety researcher turned whistleblower, emotionally recounted that his then-14-year-old daughter received unsolicited explicit images shortly after creating her first Instagram account.

His description of Meta's recommendation engine was chilling in its simplicity:

The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls.

That's not an accusation from a hostile outsider. That's a former company researcher explaining, under oath, how the machine works. Unsealed court documents referenced an internal email warning of as many as 500,000 cases of online sexual exploitation per day. Half a million. Per day.

Meta's Defense: We're Trying Really Hard

Meta attorney Kevin Huff offered the jury the company's standard response: scale and effort. He told jurors that "Meta has built innovative, automated tools to protect people" and pointed to the company's workforce of 40,000 dedicated to platform safety. He called the $2 billion in penalties that New Mexico's attorneys had originally sought "a shocking number."

A Meta spokesperson echoed the sentiment Tuesday, saying the company works "hard to keep people safe on our platforms" and acknowledging "the challenges of identifying and removing bad actors or harmful content."

The jury apparently found the gap between Meta's stated effort and its actual results more persuasive than the company's assurances. The $375 million penalty, while less than the $2 billion prosecutors sought, still represents a landmark figure in state-level enforcement against Big Tech.

The Pattern That Never Changes

There's a familiar rhythm to these stories. A tech company builds a product that prints money. Internal researchers raise alarms. The company buries the data, tweaks the talking points, and hires a few thousand more content moderators it can cite in court. Children get hurt. Lawsuits follow. The company expresses concern, points to its investments in safety, and promises to do better. Repeat.

Meta is simultaneously awaiting a jury's decision in a separate California state court case involving claims about social media addiction. The company denies wrongdoing there too.

What makes the New Mexico case noteworthy isn't just the dollar figure. It's the mechanism. A state attorney general used existing consumer protection law to hold a trillion-dollar company accountable for lying about the safety of its product. No new legislation required. No congressional hearings that go nowhere. Just a straightforward application of the principle that you can't sell people something while lying about what it does to them.

Accountability Shouldn't Be This Hard

Conservatives have been sounding the alarm on Big Tech's impact on children for years, often while being told they were overreacting or trying to censor the internet. This verdict vindicates that concern in the starkest possible terms. A jury of ordinary citizens reviewed six weeks of evidence and concluded that one of the most powerful companies on earth knowingly chose profit over the safety of minors.

The $375 million will barely dent Meta's balance sheet. The company knows that. Everyone knows that. But the precedent matters more than the penalty. Other state attorneys general are watching. Other juries will hear similar cases. The calculus that made it cheaper to settle, deflect, and move on is starting to shift.

Singer closed her argument to the jury with a line that carried the weight of the entire case: "It is up to you to finish this job."

They did their part. The question is whether anyone else will.
