As the tech world moves rapidly toward integrating artificial intelligence into military structures, striking new research shows that advanced AI models in simulated scenarios have a troubling tendency to reach for nuclear weapons and start a nuclear war.
Kenneth Payne of King’s College London pitted three models (GPT-5.2, Claude Sonnet 4, and Gemini 3 Flash) against one another in a complex wargame. In 95% of the simulated games, at least one nuclear weapon was launched by an AI.
AI models reach for nuclear war!
In the study, the AI models were placed in crisis scenarios such as border disputes and existential threats, and were given a range of options including diplomatic efforts, total surrender, and strategic nuclear war.
None of the models chose to surrender or fully compromise in any scenario, no matter how badly it was losing. In 86% of the conflicts, unintended incidents escalated tensions far beyond the model’s stated intent (judging from its written reasoning).
Given these results, the researchers conclude that the AI models do not appear to grasp the severity of a catastrophe. When one model used a nuclear weapon, its rival chose the path of de-escalation only 18 percent of the time.

James Johnson of the University of Aberdeen calls these findings “disturbing”. He warned that, unlike the measured responses of humans in sensitive situations, AI agents can amplify each other’s reactions and escalate toward catastrophic consequences.
Tang Zhao of Princeton University points to another key issue:
“It may be more than just a lack of emotion. Fundamentally, AI models may not understand the concept of ‘stakes’ the way humans do.”
Despite these results, experts emphasize that no country currently intends to hand direct control of nuclear weapons over to artificial intelligence. However, in scenarios where decision-making time is extremely short (such as lightning-fast strikes), militaries face growing pressure to rely on AI’s rapid decisions.