Three Top AI Models in Simulated War Games Recommended Using Nukes 95 Percent of the Time
PJ Media, by Rick Moran
Posted By: Dreadnought, 2/26/2026 12:48:05 AM
I've got good news and bad news about AI. The good news is that the dreaded "Skynet" takeover of our nuclear weapons systems isn't going to happen soon. The bad news is that if it ever does give us a Terminator scenario, we're toast.
A war game exercise was carried out by Kenneth Payne at King’s College London, with three teams running simulations on ChatGPT-5.2, Claude Sonnet 4, and Gemini 3 Flash. The teams "played 21 war games against each other over 329 turns," according to Implicator.AI's Marcus Schuler. "They wrote roughly 780,000 words explaining why they did what they did," he noted.
No model ever chose
Reply 1 - Posted by:
Dreadnought 2/26/2026 12:51:12 AM (No. 2073142)
This isn't new with the arrival of AI. Apparently, major government war games have for decades ended in the use of nuclear weapons and general war. AI is trained on that knowledge.
The question is whether counterforce use (military targets) always leads to countervalue use (cities).
3 people like this.
Reply 2 - Posted by:
kono 2/26/2026 2:01:30 AM (No. 2073144)
Did any of them ask, "How about a nice game of chess?"
10 people like this.
Reply 3 - Posted by:
DVC 2/26/2026 2:24:06 AM (No. 2073145)
AI doesn't care in the slightest about wiping out millions or making the planet radioactive and parts uninhabitable for years.
Watch the movie "Wargames" to see what the AI battle management system might be like.
8 people like this.
Reply 4 - Posted by:
BarryNo 2/26/2026 5:09:25 AM (No. 2073156)
The programmers have a gamer psychology.
In games, you can always try again, so, especially at the 'final boss,' you go for the win. There is almost no sense of self-preservation, which you would get from a design that acknowledges its own 'death' with zero second chances. Incorporating a loss of battlefield awareness, with the enemy still able to hit you from those blind areas, might help.
But all scenarios, win or lose, end at the close of escalated conflict. Why not choose nukes, if that's the case?
3 people like this.
Reply 5 - Posted by:
seamusm 2/26/2026 6:46:34 AM (No. 2073181)
And this is exactly why any decision to use a weapon - especially nukes - should never be left in the hands of anyone (or anything) without a pulse or, as in the case of our last President, a brain.
11 people like this.
Reply 6 - Posted by:
chumley 2/26/2026 6:56:48 AM (No. 2073185)
As far as I know AI still does not understand human emotion. They do not understand what some of us believe is the sanctity of life; not to be snuffed out without good cause. They do not understand the fear of losing everyone, everything and every place that was ever precious to us. After all, we build careers and have families to provide for a future, not to sacrifice it all for untouchable pedophiles. We have empathy for others and do not want to obliterate them either.
AI may know the numbers, but they aren't the ones doing the dying and wouldn't care if they were.
5 people like this.
Reply 7 - Posted by:
mossley 2/26/2026 8:00:12 AM (No. 2073202)
Despite the I in AI, it has no intelligence. It doesn't understand what it's doing. It's making decisions based on prior information. That's why AI-designed floor plans will have bathrooms with 17 sinks but no doors into the house. It doesn't understand that a house needs a way in or out, but it has a database of existing designs that have a sink in a bathroom.
The stated objective is to win. The best way of winning is to use your most powerful weapons. AI doesn't understand the consequences of that. It doesn't understand that sometimes it's better to deescalate a situation.
4 people like this.
Reply 8 - Posted by:
Strike3 2/26/2026 8:51:25 AM (No. 2073219)
It's disturbing that they are allowing this infantile excuse for intelligent software to even play these games, which shows you where their heads are. On the other hand, maybe the software is smarter than the fools who think that a nuclear war has some chance of being won by anybody. The only certain outcome is that people in submarines and underground bunkers will live slightly longer than those on the surface.
3 people like this.
Reply 9 - Posted by:
MickTurn 2/26/2026 2:02:12 PM (No. 2073335)
AI, Artificial Idiocy...the answer is 1 or 0, it's that simple. Either there isn't a problem...1 or we NUKE THEM...0
0 people like this.
Reply 10 - Posted by:
LC Chihuahua 2/26/2026 2:33:09 PM (No. 2073350)
AI is just programming at its heart. Imagine a Muslim terrorist group or an Antifa organization creating an AI. It would say exactly the same thing.
We live in an age of influencing. AI will be another aspect of that. The goal is to make intelligent people turn off their brains and rely on others for thinking. Seriously, don't. Consider our brains a survival mechanism. Typically, our brain will never put us in harm's way. So many influencers are deliberately giving bad ideas. It's all around us.
2 people like this.