93 points by rbanffy | 1 comment
olao99 ◴[] No.42188229[source]
I fail to understand how these nuclear bomb simulations require so much compute power.

Are they trying to model every single atom?

Is this a case where the physicists in charge get away with programming the most inefficient models possible, and the administration simply replies "oh, I guess we'll need a bigger supercomputer"?

replies(10): >>42188257 #>>42188268 #>>42188277 #>>42188283 #>>42188293 #>>42188324 #>>42189425 #>>42189704 #>>42189996 #>>42190235 #
p_l ◴[] No.42188283[source]
It literally requires simulating each subatomic particle individually. The increases in compute power have gone toward two goals: reducing simulation time (letting you run more simulations) and increasing the size and resolution of the simulations.
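
To get a rough feel for why resolution eats compute, here's a back-of-the-envelope sketch in Python (purely illustrative assumptions, nothing to do with how the actual weapons codes are structured): in a 3D mesh simulation, halving the cell size gives 8x the cells, and the timestep usually has to shrink with the cell size as well, so each refinement costs on the order of 16x more work.

    # Toy scaling estimate for a 3D time-dependent simulation.
    # Illustrative assumptions: work per step scales with the cell count,
    # and the timestep shrinks linearly with cell size (CFL-style limit).

    def relative_cost(refinements: int) -> int:
        """Cost relative to the base resolution after halving the cell size N times."""
        cells = 8 ** refinements   # 2x finer in each of 3 dimensions
        steps = 2 ** refinements   # smaller cells -> proportionally more timesteps
        return cells * steps

    for r in range(4):
        print(f"{r} refinements: ~{relative_cost(r):,}x the base cost")
    # 0 refinements: ~1x, 1: ~16x, 2: ~256x, 3: ~4,096x

And that's before you add more physics per cell (radiation transport, material models, and so on), which only multiplies the bill.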

The alternative is to literally build and detonate a bomb to get empirical data on a given design, which raises problems of replicability (important when applying the results to the rest of the stockpile) and of how precise the data is.

And remember that every supercomputer deployed at such labs has more than one user: multiple "paying" jobs like research simulations, smaller jobs run to educate, test, and optimize before running full-scale work, and so on.

AFAIK, supercomputers have also been running more than one job at a time for a considerable while now.
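
On that last point, here is a minimal toy sketch of space-sharing (the machine size and job mix are made up for illustration; this is not how any real scheduler such as Slurm is implemented):

    # Toy illustration of space-sharing: independent jobs run side by side
    # on disjoint sets of nodes. All numbers are hypothetical.

    TOTAL_NODES = 4000

    jobs = [
        ("full-scale production simulation", 3000),
        ("parameter-study ensemble", 600),
        ("small test/optimization run", 200),
        ("training/education job", 100),
    ]

    allocated = 0
    running = []
    for name, nodes in jobs:
        if allocated + nodes <= TOTAL_NODES:  # enough free nodes -> start the job
            allocated += nodes
            running.append(name)

    print(f"{len(running)} jobs running concurrently on {allocated}/{TOTAL_NODES} nodes")

Real schedulers do much more (priorities, backfilling small jobs into gaps while a big job waits for its allocation), but the basic point stands: the machine is rarely dedicated to a single run.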

replies(2): >>42188395 #>>42188718 #
pkaye ◴[] No.42188395[source]
Are they always designing new nuclear bombs? Why the ongoing work to simulate?
replies(5): >>42188408 #>>42188476 #>>42188549 #>>42188623 #>>42188738 #
dekhn ◴[] No.42188476[source]
The euphemistic term used in the field is "stockpile stewardship", a catch-all covering a wide range of activities, some of them forward-looking.