Meta goes to trial in child safety case in New Mexico. Here’s what’s at stake



Meta goes on trial today in the state of New Mexico for allegedly failing to protect minors from sexual exploitation on its apps, including Facebook and Instagram. The state claims Meta violated New Mexico’s Unfair Practices Act by implementing design features and algorithms that created unsafe conditions for users. Now, more than two years after the case was filed, oral arguments have begun in Santa Fe.

It’s a big week for Meta in court: A historic social media trial also kicks off today in California, the nation’s first legal test of social media addiction. This case is part of a “JCCP,” or Judicial Council Coordinated Proceeding, which brings together numerous civil lawsuits focused on similar issues.

The plaintiffs in this case allege that social media companies designed their products carelessly and caused various harms to minors using their applications. Snap, TikTok and Google were named as defendants alongside Meta; Snap and TikTok have already settled. Because Meta has not, some of the company’s top executives could be called to the witness stand in the coming weeks.

Meta executives, including Mark Zuckerberg, are unlikely to testify live at the trial in New Mexico. But the proceedings could still be notable for several reasons. This is the first independent state-led case against Meta to actually go to trial in the United States. It is also a highly charged case involving allegations of child sexual exploitation that will ultimately turn on some very technical arguments, including what it means to “mislead” the public, how algorithmic amplification works on social media, and what protections Meta and other social media platforms have under Section 230.

And while Meta’s top executives may not be required to appear in person, the executives’ depositions and testimony from other witnesses could still offer an interesting glimpse into the company’s inner workings as it established policies regarding underage users and responded to complaints that it wasn’t doing enough to protect them.

Meta has so far given no indication that it intends to settle. The company has denied the allegations, and Meta spokesperson Aaron Simpson previously told WIRED: “While New Mexico makes sensationalist, irrelevant and distracting arguments, we strive to demonstrate our long-standing commitment to supporting young people… We are proud of the progress we have made and we are always working to do better.”

Sacha Haworth, executive director of the Tech Oversight Project, a tech industry watchdog, said in an emailed statement that these two trials represent “the split-screen of Mark Zuckerberg’s nightmares: a landmark trial in Los Angeles over children’s addiction to Facebook and Instagram, and a trial in New Mexico revealing how Meta enabled predators to use social media to exploit and abuse children.”

“These are the trials of a generation,” Haworth added. “Just as the world has watched the courts hold tobacco and pharmaceutical giants to account, we will see, for the first time, CEOs of major technology companies like Zuckerberg take the stand.”

The cost of doing business

New Mexico Attorney General Raúl Torrez filed his complaint against Meta in December 2023. In that complaint, he alleged that Meta proactively served explicit content to minor users, allowed adults to exploit children on the platform, made it easy for Facebook and Instagram users to find child sexual abuse material, and allowed an investigator on the case, posing as a mother, to offer her minor daughter to sex traffickers.

The trial is expected to take place over seven weeks. Last week, the jurors were selected, a panel of 10 women and eight men (12 jurors and six alternates). New Mexico First Judicial District Judge Bryan Biedscheid is presiding over the case.

In the months leading up to the trial, Meta submitted more than 40 motions in limine, or requests that the judge exclude or limit certain information that could unfairly influence a jury. These are standard procedures in cases like this, and Meta has the right to argue that some content brought to court may not be relevant to the case at hand.

But WIRED reported that Meta’s preliminary requests were numerous. Among those requests, Meta asked the judge to prohibit the court from mentioning Mark Zuckerberg’s time as an undergraduate at Harvard University; referring to the financial position or wealth of the company; citing articles by the former US surgeon general on the harms of social media on mental health; citing third-party investigations or Meta’s own internal investigations that claim to show a large amount of inappropriate content on Meta’s applications; referring to Meta’s AI chatbots; and more.

Some of Meta’s requests were granted. For example, at Meta’s urging, the judge ruled that the word “whistleblower” was not allowed in the courtroom, and during opening arguments this morning the state instead used phrases like “former employees who have expertise.” But other requests, such as Meta’s motions to bar references to mental health issues, AI chatbots, or third-party investigations, were denied. Meta also sought to block a live video broadcast of the trial; that request was denied as well.

Meta retained outside counsel from the Washington, D.C.-based litigation firm Kellogg, Hansen, Todd, Figel & Frederick to assist with this matter. The firm previously defended Meta in a case brought by the FTC, which argued that Meta held a monopoly over personal social networking and should unwind its acquisitions of Instagram and WhatsApp. The antitrust case was seen as the most significant threat to Meta’s dominant position since the company’s founding, but Meta’s lawyers successfully argued that the company faced stiff competition from other social media platforms.

Meta is hoping for another victory in New Mexico and will try to prove that it has been proactive in removing harmful content from its platforms and transparent with users about potential risks. The company is also likely to reference the protections of Section 230, a well-known provision of the Communications Decency Act of 1996, which protects online platforms from liability for third-party content that people post on their sites.

“Meta is likely to make a variety of arguments, some of which may have merit, some of which may have none, because when we look at their previous filings or positions in litigation, that has been their pattern,” says Mary Graw Leary, a leading scholar of criminal law and criminal procedure at the Catholic University of America in Washington, D.C., who is not involved in the Meta case. She points out that one of the reasons Meta and other tech platforms have relied heavily on Section 230 in response to lawsuits is that if the argument is successful, courts will throw out the case before discovery.

“They have the right to vigorously defend themselves, but I suspect that because this lawsuit may be considered a ‘cost of doing business,’ Meta will do its best to exclude information they don’t want the public to see,” Graw Leary said.

In a series of posts, Chelsea Pitvorec, deputy director of communications at the New Mexico Department of Justice, said the state’s lawsuit alleges that Meta has misled the public about the dangers on its platforms for years, and that instead of making its products safer, “Meta is spending its time and resources falsely defaming law enforcement officials who put child predators behind bars. The company is distracting from New Mexico’s undercover investigation because even Meta’s highest-paid PR agents can’t explain why Meta’s platforms expose children to criminals.”

“We look forward to presenting the jury with the evidence we have obtained over more than two years of litigation,” Pitvorec said.

In terms of recourse, the state is seeking civil penalties of up to $5,000 per violation of the Unfair Practices Act, which, depending on how the violations are counted, could mean millions or even hundreds of millions of dollars for Meta. And Torrez has already signaled that he wants Meta to make significant changes to its platforms. In a letter sent to Zuckerberg and Instagram chief Adam Mosseri in December 2025, he asked the executives to stop marketing Teen Account content as “PG-13” and to implement meaningful safety protections for children.

“This also includes, among other steps, quickly and broadly implementing effective age verification, removing bad actors from the platform, combating harmful algorithms that proactively spread dangerous content, and addressing security risks created by end-to-end encryption,” Torrez wrote.
