Highlights
- Elon Musk's xAI has signed a deal with the Pentagon to use Grok for battlefield analysis.
- OpenAI has also struck a Pentagon deal for ChatGPT to be used in classified systems, sparking user backlash.
- Anthropic was cut off from Pentagon contracts after defence secretary Pete Hegseth called it a "supply-chain risk to national security".
The deals raise serious questions about the future of AI in warfare. Jurgita Lapienytė, chief editor at Cybernews, told Metro: "Currently, AI is not only untrustworthy but also very dangerous when unsupervised.
In military operations, it can also be used to dehumanize operations by offering gamified experiences for officers and soldiers and shifting personal responsibility."
Anthropic faces backlash
The developments come after Claude, an AI model by Anthropic, was reportedly used by the Pentagon to conduct strikes on Iran, with US military command using the tools to pick targets and carry out battlefield simulations.
However, hours later President Trump ordered agencies to cut ties with Anthropic, with defence secretary Pete Hegseth calling the company a "supply-chain risk to national security."
Anthropic questioned why the administration would use a term "historically reserved for US adversaries" and said it would challenge the decision in court.
Singer Katy Perry publicly switched to Claude from ChatGPT following OpenAI's Pentagon deal, writing simply "Done" on X.
OpenAI chief executive Sam Altman confirmed ChatGPT would not be used for domestic surveillance or building autonomous weapons, saying the Pentagon has "deep respect for safety."
He added that OpenAI would build toughened safety guardrails "to ensure our models behave as they should."
Scores of users on Reddit said they were abandoning ChatGPT over the deal, with one writing: "You're now training a war machine."
Lapienytė raised broader concerns, asking: "When the world's most powerful military starts using AI without being transparent about exactly how, one can begin to wonder just how much US operations overseas are influenced by the algorithm."