The Pentagon said it has officially notified Anthropic that the artificial intelligence company and its products are now considered a risk to the U.S. supply chain, according to a senior defense official.
“DoW has officially informed Anthropic leadership that the company and its products are now considered a supply chain risk, effective immediately,” the official told Bloomberg on Thursday, using the acronym for the Department of War, the name Secretary of Defense Pete Hegseth has come to prefer over Department of Defense.
Spokespeople for Anthropic did not immediately comment. The official did not say when, or through which channel, the Pentagon communicated the classification to Anthropic.
The Pentagon’s decision could disrupt both the company and the military, which had been relying heavily on Anthropic’s tools. Until recently, Anthropic offered the only AI system capable of operating in the Pentagon’s classified cloud. Its Claude Gov tool has become a favorite among defense personnel due to its ease of use.
Anthropic CEO Dario Amodei had been negotiating for weeks with Emil Michael, undersecretary of Defense for research and engineering, to close a contract that would define the Pentagon’s access rules to the company’s technology.
The talks, however, fell apart last week after the startup asked for guarantees that its AI would not be used for mass surveillance of American citizens or in autonomous weapons. Hegseth subsequently published a post on X on Friday saying that Anthropic posed a supply chain risk — a label normally reserved for countries considered adversaries of the US.
“From the beginning, this has always revolved around a basic principle: the Armed Forces being able to use technology for all legal purposes,” the Defense official said on Thursday. “The military will not allow a supplier to insert itself into the chain of command by trying to limit the lawful use of a critical capability, thereby putting our soldiers at risk.”
© 2026 Bloomberg L.P.