Landmark verdicts put Meta’s “addiction machine” platforms on trial

March 26, 2026

Meta faced two major legal setbacks this week as courts in New Mexico and California both found the company liable for harm to children.

A New Mexico jury just ordered Meta to pay $375 million for misleading parents about child safety on Instagram and Facebook. Jurors found the company violated consumer protection laws by claiming its platforms were safe while knowing they exposed children to danger.

A day later, a Los Angeles jury found Meta and Google liable in a landmark case over platform design. The case, brought by a young woman known as Kaley, accused both companies of getting her addicted to their products as a child, calling their platforms “addiction machines.”

New Mexico wins three-year lawsuit

New Mexico sued Meta in 2023 for violating its Unfair Practices Act, claiming the company’s algorithms were pushing sexual content to kids.

Prosecutors argued that this exposure wasn’t random: Meta’s algorithms actively steered children toward explicit content. The complaint said that Meta had:

“Proactively served and directed them to a stream of egregious, sexually explicit images through recommended users and posts—even where the child has expressed no interest in this content.”

The lawsuit also alleged that the platform made it easier for adults to contact and exploit minors, including grooming and solicitation.

During the seven-week trial, jurors saw internal memos and heard from several witnesses including Arturo Béjar, a software engineer who quit the company in 2021. He said a stranger propositioned his young daughter on Instagram.

Meta’s internal research, presented in court, showed that 16% of Instagram users saw unwanted nudity or sexual content in a single week. Other internal documents showed the company was aware of the harm.

When announcing the legal win, New Mexico Attorney General Raúl Torrez said:

“Meta executives knew their products harmed children, disregarded warnings from their own employees, and lied to the public about what they knew. Today the jury joined families, educators, and child safety experts in saying enough is enough.”

New Mexico prosecutors also found employee messages discussing how Mark Zuckerberg’s 2019 announcement of end-to-end encryption on Facebook Messenger would hamper Meta’s ability to catch predators. Meta has said it plans to remove end-to-end encryption from Instagram private messages, a move linked to ongoing concerns about detecting abuse on the platform.

Meta’s lawyers countered that the company works to protect kids and remove harmful content, pointing to its Teen Account protections and parental alerts, though they acknowledged that harmful material can still slip through.

The $375 million figure in the New Mexico case was calculated from thousands of individual violations, each counting separately toward the penalty. Meta is set to appeal.

In the LA case, jurors awarded Kaley $3 million in compensatory damages and a further $3 million in punitive damages, finding that both Meta and Google “acted with malice, oppression, or fraud” in how they operated their platforms.

On its own, even the $375 million penalty is not especially damaging to Meta financially: the company made just over $60 billion in net income last year. But these two cases are only the first in a wave of legal challenges the company will face.

The Kaley case was the first of several “bellwether” cases, which are trials that could set the pace for hundreds or thousands of similar cases. Over 2,400 cases making similar claims against Meta have been consolidated in California. The next bellwether case, RKC vs Meta, will begin in the summer.

Dozens of state attorneys general have also sued Meta, accusing it of deliberately designing its platform with addictive properties that harm young people.

The scrutiny of the algorithm in both cases might also make it harder for big tech companies to rely on Section 230 in future. That 30-year-old legislation has long shielded tech platforms from liability for the actions of their users, but it did not protect Meta from claims over how it engineered its own platforms.

Beyond the potential for much larger penalties, these cases matter because plaintiffs have now shown in court that Meta knew its platforms were harming children while telling parents everything was fine. That could be especially problematic for a company focused on growing (or at least maintaining) its engagement numbers.



About the author

Danny Bradbury has been a journalist specialising in technology since 1989 and a freelance writer since 1994. He covers a broad variety of technology issues for audiences ranging from consumers through to software developers and CIOs. He also ghostwrites articles for many C-suite business executives in the technology sector. He hails from the UK but now lives in Western Canada.