Anthropic supply chain risk designation suspended by judge

Anthropic won a preliminary injunction prohibiting the US Department of Defense from labeling it a supply chain risk, potentially paving the way for customers to resume working with the company. Thursday’s ruling by San Francisco federal district judge Rita Lin is a symbolic setback for the Pentagon and a significant boost for the generative AI company as it attempts to preserve its business and reputation.

“Defendants’ designation of Anthropic as a ‘supply chain risk’ is likely both contrary to law and arbitrary and capricious,” Lin wrote in justifying the temporary reprieve. “The War Department provides no legitimate basis for inferring from Anthropic’s frank insistence on use restrictions that it could become a saboteur.”

Anthropic and the Pentagon did not immediately respond to requests for comment on the decision.

The Defense Department, which under Trump calls itself the War Department, has relied on Anthropic’s Claude AI tools to draft sensitive documents and analyze classified data for the past two years. But this month, it began phasing out Claude after determining that Anthropic could not be trusted. Pentagon officials cited numerous instances in which Anthropic imposed, or sought to impose, usage restrictions on its technology that the Trump administration deemed unnecessary.

The administration ultimately issued several directives, including designating the company a supply chain risk, which had the effect of gradually halting Claude’s use within the federal government and damaging Anthropic’s sales and public reputation. The company filed two lawsuits challenging the sanctions as unconstitutional. At a hearing Tuesday, Lin said the government appeared to have illegally “crippled” and “punished” Anthropic.

Lin’s decision Thursday “restores the status quo” as of Feb. 27, before the directives were issued. “This does not prohibit any defendant from taking any lawful action available to them” as of that date, she wrote. “For example, this order does not require the Department of War to use Anthropic’s products or services, and it does not prevent the Department of War from switching to other artificial intelligence vendors, provided that such actions comply with applicable regulations, laws, and constitutional provisions.”

The decision suggests that the Pentagon and other federal agencies remain free to cancel agreements with Anthropic and to ask contractors that integrate Claude into their own tools to stop doing so, as long as they do not cite the supply chain risk designation as the basis.

The immediate impact is unclear because Lin’s order won’t take effect for another week. And a federal appeals court in Washington, D.C., has yet to rule on Anthropic’s second lawsuit, which focuses on a different law under which the company was also barred from providing software to the military.

But Anthropic could use Lin’s decision to show customers wary of working with an industry pariah that the law may be on its side in the long run. Lin did not set a timetable for a final ruling.
