The Pentagon vs. the AI company that refused to become a weapon

You have built the world's safest AI and landed a $200 million contract with the US Department of Defense, and then the Secretary of Defense knocks on your door with an ultimatum: remove all the safety locks, or we pull out. That is the situation Anthropic finds itself in right now, and it is starting to look more like a bad action movie than a business negotiation.

First, a bit of background

Anthropic, the company behind the AI model Claude, has long positioned itself as the serious player in the AI industry. While competitors like OpenAI, Google, and Elon Musk's xAI have readily bowed to the military's demands, Anthropic has stuck to two principles: no weapons systems operating entirely without human control, and no mass surveillance of American citizens. Quite reasonable, really. Not according to the Pentagon.

The meeting that definitely was not a pleasant coffee break

On Tuesday, February 24, Secretary of Defense Pete Hegseth and Anthropic's CEO Dario Amodei entered a room at the Pentagon. An insider had described the meeting beforehand as a "shit or get off the pot" meeting, which is probably the most straightforward diplomatic phrasing in a long time. Hegseth put an ultimatum on the table with a Friday deadline: sign a document giving the military free access to Claude, or pay the price.

Ironically, the meeting was said to be strangely polite. Amodei even thanked Hegseth for his service. No voices were raised. But when they left the room, no one had changed their mind at all.

The Pentagon needs Anthropic more than they want to admit

Here lies the entire irony of the story. Claude is currently the only AI model approved for the military's most classified networks. Not a single competitor is anywhere near that status yet. A Pentagon official expressed it with an honesty that is almost touching: "The only reason we are still talking to these people is we need them and we need them now. The problem for these guys is they are that good."

The Pentagon is thus threatening to fire the only supplier they cannot do without. It is a bit like threatening your heart surgeon with dismissal in the middle of an ongoing operation.

What does "woke AI" actually mean?

Hegseth and the Trump administration have labeled Anthropic's safety rules "woke AI." AI experts argue that the term is essentially meaningless, used to describe everything from technical safety mechanisms to alleged political bias in a model's responses. What the Pentagon actually wants is an AI the military can use for "all lawful purposes," a term broad enough to cover most things one can think of.

Three unpleasant options from the Pentagon

The Pentagon has presented three options:

  • The first is to cancel the $200 million contract, which sounds dramatic but would be barely noticeable for Anthropic, whose annual revenue is around $14 billion.
  • The second is to label Anthropic a supply chain risk, a designation otherwise reserved for foreign adversaries like China and Russia. This would force all of the Pentagon's other suppliers to certify that they do not use Claude in their own systems, an administrative nightmare considering that eight of the ten largest American companies use Claude daily.
  • The third option is for the Pentagon to invoke the Defense Production Act to force an agreement on the military’s terms, a move so unusually aggressive that legal experts are already questioning whether it would hold up in court.

The Pentagon's deadline is Friday

Friday is the deadline. Amodei has made it clear that Anthropic can adjust its policy but will never accept mass surveillance or weapons systems without human control. What the Pentagon calls unreasonable restrictions are exactly the reasons that led Amodei to leave OpenAI and start Anthropic in the first place.

AI already plays a central role in military decisions; that discussion is over. What remains is the question of who sets the boundaries for how the technology is used, and whether elected politicians or tech companies should have the final say on autonomous weapons systems and citizen surveillance.
