AI firm Anthropic rejects unrestricted US military use - DW

Anthropic has refused to allow unrestricted military use of its AI technology by the US Pentagon, citing concerns over mass surveillance and autonomous weapons that could undermine democratic values. The company, which was contracted by the Pentagon in 2025, faced an ultimatum to provide AI for classified military purposes or lose its contract, with threats to invoke legislation such as the Defense Production Act. Anthropic's CEO emphasized the company's stance that AI should not pose risks to civilians or democracy, despite military assertions that the Pentagon does not wish to use AI for mass surveillance or autonomous weapons.


February 27, 2026

The US artificial intelligence company Anthropic said on Thursday that it would not give in to pressure from the Pentagon to allow unrestricted military use of its technology, for fear that it could be used for mass surveillance or in fully autonomous weapons.

What did Anthropic say?

"Using these systems for mass domestic surveillance is incompatible with democratic values," said chief executive Dario Amodei, adding that AI systems are not yet reliable enough to be trusted to power deadly weapons without a human in ultimate control.

"We will not knowingly provide a product that puts America's warfighters and civilians at risk," he said.

Anthropic – alongside OpenAI, Google and Elon Musk's Grok chatbot – was commissioned by the Pentagon in 2025 to supply AI models for a range of military applications in a contract worth $200 million (€170 million).

Earlier this week, after a meeting with Amodei, Defense Secretary Pete Hegseth gave Anthropic an ultimatum: Open up its artificial intelligence technology for use in a "classified setting" by Friday, as the other companies have, or risk losing its government contract.

Military officials warned that they could go even further and designate the company as a supply chain risk, or invoke Cold War-era legislation called the Defense Production Act to give the military more sweeping authority to use its products.

"These threats do not change our position: we cannot in good conscience accede to their request," said Amodei, calling the threats "inherently contradictory" since they label Anthropic's systems simultaneously a "security risk" and "essential to national security."

"Anthropic understands that the Department of War, not private companies, makes military decisions," he said, referring to the Trump administration's title for the Department of Defense. "However, in a narrow set of cases, we believe AI can undermine, rather than defend, democratic values."

Pentagon: 'No interest in using AI for mass surveillance'

Pentagon spokesman Sean Parnell said on social media Thursday that the military "has no interest in using AI to conduct mass surveillance of Americans (which is illegal) nor do we want to use AI to develop autonomous weapons that operate without human involvement."

However, officials also confirmed one exchange between Anthropic and the Pentagon that revolved around intercontinental ballistic missiles, underscoring the sensitivity of the topics at the heart of the dispute.

Parnell said opening up use of the technology would prevent Anthropic from "jeopardizing critical military operations," warning: "We will not let ANY company dictate the terms regarding how we make operational decisions."

Meanwhile, the ranking Democrat on the Senate Intelligence Committee, Senator Mark Warner of Virginia, said he was "deeply disturbed" by reports that the Pentagon is "working to bully a leading US company."

He said in a statement that "this is further indication that the Department of Defense seeks to completely ignore AI governance" and "further underscores the need for Congress to enact strong, binding AI governance mechanisms for national security contexts."

Edited by: Wesley Dockery

