Meta Threatens to Exit New Mexico Amid Child Safety Trial

Meta Threatens to Pull Platforms From New Mexico as Child Safety Trial Enters Final Phase

On April 30, 2026, Meta filed a court document warning that it may remove Facebook, Instagram, and WhatsApp from New Mexico entirely rather than comply with the state's sweeping child safety demands — a threat that New Mexico Attorney General Raúl Torrez immediately dismissed as proof that the company prioritizes market control over child protection. The escalating standoff sets the stage for a high-stakes bench trial scheduled to begin May 4, 2026, in which New Mexico is seeking what it describes as the most far-reaching injunctive relief ever proposed against a major social media company.

The warning came just days before the second phase of The State of New Mexico v. Meta Platforms, Inc. — a case that already produced a landmark $375 million jury verdict against Meta on March 24, 2026, the first time any U.S. state had prevailed at trial against a major social media platform on child safety claims brought by a state attorney general.

Meta's Threat: 'No Choice But to Remove Access'

In its April 30 court filing, Meta argued that New Mexico's proposed injunctive relief is so broad and technically demanding that compliance may be functionally impossible. The company warned that if an agreement cannot be reached, it may be forced to exit the state's market entirely.

"While it is not in Meta's interests to do so, if a workable solution to Attorney General Torrez's demands is not reached, we may have no choice but to remove access to its platforms for users in New Mexico entirely," Meta stated in the court filing, as reported by multiple outlets including NBC News and The Hill.

In a separate passage from the same filing, Meta elaborated: "The State's requests for relief are so broad and so burdensome, that if implemented it might force Meta to withdraw its apps entirely from the State of New Mexico as an alternative way of complying with the injunction."

Meta specifically contested New Mexico's demand that the company detect at least 99 percent of all new child sexual abuse material (CSAM), calling the target technically unachievable. The company similarly characterized a 99 percent accurate age-verification system as impossible. Meta also pushed back on the state's demand to eliminate its infinite scroll feature, which the company described as a "foundational aspect" of its platforms. A Meta spokesperson described New Mexico's demands broadly as "technically impractical, impossible for any company to meet and disregard the realities of the internet."

Meta said the demands were "in many cases technologically impractical or completely impossible" and would "essentially require" the company to build apps specific to New Mexico — a state with a population of roughly 2.1 million people.

AG Torrez Fires Back: 'This Is Not About Technological Capability'

Attorney General Torrez did not accept Meta's framing. Responding directly to the court filing on April 30, he said, "Meta is showing the world how little it cares about child safety," as reported by Bloomberg Law and The Hill — a statement that drew immediate national attention.

Torrez also directly challenged Meta's claim that the state's demands are technically out of reach. "For years the company has rewritten its own rules, redesigned its products and even bent to the demands of dictators to preserve market access," he said. "This is not about technological capability."

With the bench trial approaching, Torrez made clear that the state intends to push for structural, enforceable change. "On May 4, we will seek the strongest child safety protections ever proposed against a social media company — and we will ask this court to order Meta to comply," he said in a statement from the New Mexico Department of Justice.

What New Mexico Is Asking For

New Mexico's proposed injunctive relief is sweeping in scope. According to the New Mexico Department of Justice, the state is seeking mandatory age verification, private-by-default accounts for minors, bans on addictive features including infinite scroll and algorithmic amplification, a 90-hour monthly usage cap for minors, a 99 percent CSAM detection requirement, and the appointment of a court-supervised independent "child safety monitor" funded entirely by Meta.

The proposed injunction, if granted, would remain in force for a minimum of five years across Facebook, Instagram, and WhatsApp, regardless of any future changes in ownership or corporate structure. The bench trial will be heard by 1st Judicial District Court Chief Judge Bryan Biedscheid, who earlier rejected Meta's motion to postpone the proceedings.

Torrez outlined the shape of the relief in an earlier statement: "We're going to be asking for injunctive relief. That means changes to the design features of the platform itself, real age verification, changes to the algorithm, an independent monitor to oversee those changes and fundamentally a demand that they do business differently in New Mexico."

How the Case Got Here: A Landmark Verdict and an Undercover Investigation

The case originated in December 2023, when AG Torrez filed suit against Meta following an undercover investigation by the New Mexico Department of Justice. NMDOJ agents created accounts on Facebook and Instagram posing as users younger than 14. Those accounts received sexually explicit material and were contacted by adults seeking similar content, leading to criminal charges against multiple individuals.

The lawsuit alleged that Meta violated New Mexico's Unfair Practices Act and created a public nuisance by designing platforms that facilitated child sexual exploitation, addiction, and mental health harm, while publicly misrepresenting the safety of its products. The court denied Meta's motion to dismiss the case and also rejected its Section 230 immunity defense in May 2024, allowing the matter to proceed to trial.

The jury trial — which included testimony from former Meta employees, law enforcement officials, and New Mexico educators, as well as internal company documents — concluded on March 24, 2026, with the jury finding Meta liable on all counts. The jury found Meta responsible for 75,000 violations of state law and ordered the company to pay $375 million in civil penalties at the statutory maximum of $5,000 per violation.
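The penalty figures line up exactly. A quick check, using only the numbers reported above, confirms that 75,000 violations at the $5,000 statutory maximum yields the $375 million total:

```python
# Sanity check of the reported penalty arithmetic:
# 75,000 violations at the statutory maximum of $5,000 per violation.
violations = 75_000
penalty_per_violation = 5_000  # statutory maximum per violation

total_penalty = violations * penalty_per_violation
print(f"${total_penalty:,}")  # → $375,000,000
```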

According to CNBC, the verdict marked the first such trial win by any U.S. state against a major social media company on child safety claims. New Mexico officials had originally sought approximately $2.1 billion in penalties; the jury awarded the statutory maximum of $375 million.

Torrez called the result a turning point. "The jury's verdict is a historic victory for every child and family who has paid the price for Meta's choice to put profits over kids' safety," he said in a New Mexico Department of Justice press release.

Meta responded by announcing its intention to appeal. "We respectfully disagree with the verdict and will appeal," a Meta spokesperson said.

Why This Case Matters Beyond New Mexico

The legal battle between New Mexico and Meta has implications that extend well beyond state lines. The March verdict was already a legal first — no U.S. state attorney general had previously taken a major social media company to trial and won on child safety grounds. Phase 2 could produce another precedent: a court-ordered structural redesign of platforms used by billions of people worldwide.

The injunctive relief New Mexico is pursuing — including mandatory age verification, algorithmic restrictions, usage caps, and an independent compliance monitor — represents the kind of platform-level intervention that policymakers and child safety advocates have long argued is necessary but that the tech industry has consistently resisted as technically impractical or constitutionally questionable.

Meta's threat to withdraw its platforms from New Mexico rather than comply is itself significant. It signals that the company views the proposed remedies as more disruptive to its business model than an outright market exit — and it raises the question of whether other states pursuing similar litigation could face the same ultimatum.

At the same time, Meta's claim that a 99 percent CSAM detection rate and a 99 percent accurate age-verification system are technically impossible remains contested. AG Torrez's response — pointing to Meta's history of adapting its products to satisfy foreign governments — frames the company's position not as a technical limitation but as a choice about where to apply engineering resources.

The bench trial beginning May 4 before Chief Judge Biedscheid will determine whether New Mexico's proposed injunction is legally and technically sound — and whether a court can compel a platform the size of Meta to fundamentally redesign how it operates in a single U.S. state.


What Comes Next

The Phase 2 bench trial begins May 4, 2026, addressing New Mexico's public nuisance claim and the full scope of the proposed injunctive relief. Chief Judge Bryan Biedscheid has already denied Meta's attempt to delay the proceedings. Meta has stated it intends to appeal the March jury verdict, meaning the $375 million penalty remains subject to further litigation regardless of the Phase 2 outcome.

If the court grants New Mexico's proposed injunction, it would represent an unprecedented judicial intervention into the design of a major social media platform — one that could influence how other state and federal regulators approach similar demands. If Meta follows through on its threat to exit New Mexico rather than comply, that outcome would carry its own legal and reputational consequences, and would likely intensify scrutiny from other jurisdictions watching the case closely.

The intersection of platform design, child safety, and legal accountability is one of the defining tech policy debates of this decade. For individuals — especially parents — this case is a reminder that the digital environments children inhabit are shaped by deliberate product decisions, and that those decisions are now being scrutinized in court.

Moccet covers how technology, health, and productivity intersect in daily life. Understanding how social platforms are designed — and how those designs affect attention, behavior, and wellbeing — is central to making informed choices about screen time, digital habits, and personal health. Join the Moccet waitlist to stay ahead of the curve.
