
93 points rbanffy | 34 comments
1. olao99 ◴[] No.42188229[source]
I fail to understand how these nuclear bomb simulations require so much compute power.

Are they trying to model every single atom?

Is this a case where the physicists in charge get away with programming the most inefficient models possible, and then the administration simply replies "oh I guess we'll need a bigger supercomputer"?

replies(10): >>42188257 #>>42188268 #>>42188277 #>>42188283 #>>42188293 #>>42188324 #>>42189425 #>>42189704 #>>42189996 #>>42190235 #
2. TeMPOraL ◴[] No.42188257[source]
Pot, meet kettle? It's usually the industry that's leading with "write inefficient code, hardware is cheaper than dev time" approach. If anything, I'd expect a long-running physics research project to have well-optimized code. After all, that's where all the optimized math routines come from.
replies(1): >>42191206 #
3. bongodongobob ◴[] No.42188268[source]
My brother in Christ, it's a supercomputer. What an odd question.
4. CapitalistCartr ◴[] No.42188277[source]
It's because of the way the weapons are designed, which requires a CNWDI clearance to know, so your curiosity is not likely to be sated.
replies(2): >>42188478 #>>42190500 #
5. p_l ◴[] No.42188283[source]
It literally requires simulating each subatomic particle, individually. Increases in compute power have been used for the twin goals of reducing simulation time (letting you run more simulations) and increasing simulation size and resolution.

The alternative is to literally build and detonate a bomb to get empirical data on a given design, which raises problems of replicability (important when applying the results to the rest of the stockpile) and of how exact the data is.

And remember that there is more than one user of every supercomputer deployed at such labs, whether it be multiple "paying" jobs like research simulations, or smaller jobs run to educate, test, and optimize before running full-scale work, etc.

AFAIK, for a considerable amount of time now, supercomputers have run more than one job at a time, too.

replies(2): >>42188395 #>>42188718 #
6. JumpCrisscross ◴[] No.42188293[source]
> Are they trying to model every single atom?

Given all nuclear physics happens inside atoms, I'd hope they're being more precise.

Note that a frontier of fusion physics is characterising plasma flows. So even at the atom-by-atom level, we're nowhere close to a solved problem.

replies(1): >>42188333 #
7. alephnerd ◴[] No.42188324[source]
> I fail to understand how these nuclear bomb simulations require so much compute power

I wrote a previous HN comment explaining this:

Tl;dr - Monte Carlo simulations are hard and the NPT prevents live testing similar to Bikini Atoll or Semipalatinsk-21

https://news.ycombinator.com/item?id=39515697
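
To make "Monte Carlo simulations are hard" concrete, here is a minimal sketch of 1-D Monte Carlo neutron transport. All cross-sections and geometry are invented for illustration; this resembles nothing in a real weapons code:

    import random, math

    # Toy 1-D Monte Carlo neutron transport. Real codes track energy,
    # angle, full 3-D geometry, time dependence, and far more physics.
    SIGMA_T = 1.0    # total macroscopic cross-section (1/cm), assumed
    P_ABSORB = 0.3   # probability a collision is an absorption, assumed
    SLAB = 5.0       # slab thickness in cm, assumed

    def one_history():
        """Follow one neutron; return True if it leaks out of the slab."""
        x, direction = 0.0, 1.0
        while True:
            # Distance to the next collision is exponentially distributed.
            x += direction * (-math.log(1.0 - random.random()) / SIGMA_T)
            if x < 0.0 or x > SLAB:
                return True                       # leaked out
            if random.random() < P_ABSORB:
                return False                      # absorbed
            direction = random.choice((-1.0, 1.0))  # isotropic scatter (1-D)

    N = 100_000
    leaks = sum(one_history() for _ in range(N))
    p = leaks / N
    # Statistical error falls off only as 1/sqrt(N): 100x more compute
    # buys just 10x more precision, which is why the ensembles are huge.
    print(f"leakage ~ {p:.4f} +/- {math.sqrt(p * (1 - p) / N):.4f}")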

8. amelius ◴[] No.42188333[source]
Or maybe it suffices to model the whole thing as a gas. It all depends on what they're trying to compute.
replies(1): >>42188352 #
9. JumpCrisscross ◴[] No.42188352{3}[source]
> maybe it suffices to model the whole thing as a gas

What are you basing this on? Plasmas don't flow like gases even absent a magnetic field. They're self interacting, even in supersonic modes. This is like saying you can just model gases like liquids when trying to describe a plane--they're different states of matter.

10. pkaye ◴[] No.42188395[source]
Are they always designing new nuclear bombs? Why the ongoing work to simulate?
replies(5): >>42188408 #>>42188476 #>>42188549 #>>42188623 #>>42188738 #
11. danhon ◴[] No.42188408{3}[source]
It's also to check that the ones they have will still work, now that there are test bans.
12. dekhn ◴[] No.42188476{3}[source]
The euphemistic term used in the field is "stockpile stewardship", which is a catch-all term involving a wide range of activities, some of them forward-looking.
13. nordsieck ◴[] No.42188478[source]
> It's because of the way the weapons are designed, which requires a CNWDI clearance to know, so your curiosity is not likely to be sated.

While that's true, the information that is online is surprisingly detailed.

For example, this series "Nuclear 101: How Nuclear Bombs Work"

https://www.youtube.com/watch?v=zVhQOhxb1Mc

https://www.youtube.com/watch?v=MnW7DxsJth0

replies(1): >>42188865 #
14. p_l ◴[] No.42188549{3}[source]
Because even normal explosives degrade over time, and the fissile material in nuclear devices is even worse about it - remember that unstable elements undergo constant spontaneous fission events; critical mass is just the point where they trigger each other's fissions fast enough for a runaway process.

So in order to verify that the weapons are still useful and won't fail in random ways, you have to test them.

Which either involves actually exploding them (banned by various treaties that have enough weight that even the USA doesn't break them), or numerical simulations.
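
The decay side is simple exponential arithmetic. For example, the tritium used for boosting has a real half-life of about 12.3 years; the service intervals below are just illustrative:

    # Fraction of tritium remaining after t years (half-life ~12.3 y).
    HALF_LIFE = 12.3  # years, tritium

    def remaining(t_years):
        return 0.5 ** (t_years / HALF_LIFE)

    for t in (5, 10, 20, 30):  # illustrative service intervals
        print(f"after {t:2d} y: {remaining(t):.1%} of original tritium left")
    # After 30 years only ~18% remains, hence periodic replenishment,
    # and hence simulations of how aged material changes behaviour.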

15. AlotOfReading ◴[] No.42188623{3}[source]
Multiple birds with one stone.

* It's a jobs program to avoid the knowledge loss created by the end of the Cold War. The US government poured a lot of money into recreating the institutional knowledge needed to build weapons (e.g. materials like FOGBANK), and it prefers to maintain that knowledge by having people work on nuclear programs that aren't quite so objectionable as weapon design.

* It helps you better understand the existing weapons stockpiles and how they're aging.

* It's an obvious demonstration of your capabilities and funding for deterrence purposes.

* It's political posturing to have a big supercomputer and the DoE is one of the few agencies with both the means and the motivation to do so publicly. This has supposedly been a major motivator for the Chinese supercomputers.

There's all sorts of minor ancillary benefits that come out of these efforts too.

16. Jabbles ◴[] No.42188718[source]
> It literally requires simulating each subatomic particle, individually.

Citation needed.

1 gram of Uranium 235 contains 2e21 atoms, which would take 15 minutes for this supercomputer to count.

"nuclear bomb simulations" do not need to simulate every atom.

I speculate that there will be some simulations at the subatomic scale, and they will be used to inform other simulations of larger quantities at lower resolutions.

https://www.wolframalpha.com/input?i=atoms+in+1+gram+of+uran...
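
The arithmetic behind that estimate, as a quick sketch (machine throughput assumed at a round 2 exaFLOPS, and "counting" one atom per flop is of course a cartoon):

    AVOGADRO = 6.022e23
    MOLAR_MASS_U235 = 235.0              # g/mol

    atoms = AVOGADRO / MOLAR_MASS_U235   # atoms in 1 g of U-235
    flops = 2e18                         # ~exascale machine, assumed
    seconds = atoms / flops              # one "operation" per atom

    print(f"{atoms:.2e} atoms -> {seconds/60:.0f} minutes just to touch each once")
    # ~2.6e21 atoms / 2e18 ops/s ~ 1300 s ~ 21 minutes, and that's for a
    # single pass over one gram; hence coarse-grained hydrodynamic models
    # informed by small high-resolution studies.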

replies(1): >>42188942 #
17. colonCapitalDee ◴[] No.42188738{3}[source]
Basically yes, we are always designing new nuclear bombs. This isn't done to increase yield; we've actually been moving towards lower-yield nuclear bombs ever since the mid-Cold War. In the 60s the US deployed the B41 bomb with a maximum yield of 25 megatons, making it the most powerful bomb ever deployed by the US. When the B41 was retired in the late 70s, the most powerful bomb in the US arsenal was the B53 with a yield of 9 megatons. The B53 was retired in 2011, leaving the B83 as the most powerful bomb in the US arsenal with a yield of only 1.2 megatons.

There are two kinds of targeting that can be employed in a nuclear war: counterforce and countervalue. Counterforce is targeting enemy military installations, especially enemy nuclear installations. Countervalue is targeting civilian targets like cities and infrastructure. In an all-out nuclear war, counterforce targets are saturated with nuclear weapons, with each target receiving multiple strikes to hedge against the risks of weapon failure, weapon interception, and general target survival due to being in a fortified underground position. Any weapons that are not needed for counterforce saturation strike countervalue targets. It turns out that having a yield greater than a megaton is basically just overkill for both counterforce and countervalue. If you're striking an underground military target (like a missile silo) protected by air defenses, your odds of destroying that target are higher if you use three one-megaton weapons than if you use a single 20-megaton weapon. If you're striking a countervalue target, the devastation caused by a single nuclear detonation will be catastrophic enough to make optimizing for maximum damage pointless.

Thus, weapons designers started to optimize for things other than yield. Safety is a big one: an American nuclear weapon going off on US soil would have far-reaching political effects and would likely cause the president to resign. Weapons must fail safely when the bomber carrying them bursts into flames on the tarmac, or when the rail carrying the bomb breaks unexpectedly. They must be resilient against both operator error and malicious sabotage. Oh, and none of these safety considerations are allowed to get in the way of the weapon detonating when it is supposed to. This is really hard to get right!

Another consideration is cost. Nuclear weapons are expensive to make, so a design that can get a high yield out of a small amount of fissile material is preferred. Maintenance, and the cost of maintenance, is also relevant. Will the weapon still work in 30 years, and how much money is required to ensure that?

The final consideration is flexibility and effectiveness. Using a megaton-yield weapon on the battlefield to destroy enemy troop concentrations is not a viable tactic because your own troops would likely get caught in the strike. But lower-yield weapons suitable for battlefield use (often referred to as tactical nuclear weapons) aren't useful for striking counterforce targets like missile silos. Thus, modern weapon designs are variable yield. The B83 mentioned above can be configured to detonate with a yield in the low kilotons, or up to 1.2 megatons. Thus a single B83 weapon in the US arsenal can cover multiple contingencies, making it cheaper and more effective than maintaining a larger arsenal of single-yield weapons. This is in addition to special purpose weapons designed to penetrate underground bunkers or destroy satellites via EMP, which have their own design considerations.
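
The saturation logic above is ordinary independent-trial probability. A sketch with invented per-shot numbers (no real figures of this kind are public):

    # Why three smaller weapons can beat one huge one against a hard target.
    # All probabilities below are invented for illustration.
    p_reliability = 0.8     # weapon works and isn't intercepted, assumed
    p_kill_given_hit = 0.7  # one 1 Mt weapon vs hardened silo, assumed
    p_one = p_reliability * p_kill_given_hit

    # One giant weapon is a single trial, even with a better per-hit kill chance.
    p_single_20mt = p_reliability * 0.9

    # Three independent 1 Mt shots:
    p_three_1mt = 1 - (1 - p_one) ** 3

    print(f"single 20 Mt shot: {p_single_20mt:.0%}")  # ~72%
    print(f"three 1 Mt shots:  {p_three_1mt:.0%}")    # ~91%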

replies(4): >>42189309 #>>42189930 #>>42190207 #>>42190387 #
18. CapitalistCartr ◴[] No.42188865{3}[source]
Having once had said clearance limits my answers.
19. p_l ◴[] No.42188942{3}[source]
Subatomic scale is the perfect option, but we tend to not have time for that, so we sample and average and do other things. At least that's the situation with aerospace's hunger for CFD; I figure nuclear has similar approaches.
replies(1): >>42188971 #
20. Jabbles ◴[] No.42188971{4}[source]
I would like a citation for anyone in aerospace using (or even realistically proposing) subatomic fluid dynamics.
replies(1): >>42189017 #
21. p_l ◴[] No.42189017{5}[source]
Ok, that misreading is on me - in aerospace you generally care down to the level of molecules, and I've met many people who would love to be able to just brute-force it this way. Hypersonics do, however, end up dealing with simulating subatomic particle behaviours (because of things like air turning into plasma)
replies(1): >>42189187 #
22. Jabbles ◴[] No.42189187{6}[source]
> in aerospace generally you care to level of molecules

I would like a citation for this.

> Hypersonics do however end up dealing with simulating subatomic particle behaviours

And this.

---

For example, you could choose to cite "A Study on Plasma Formation on Hypersonic Vehicles using Computational Fluid Dynamics", DOI: 10.13009/EUCASS2023-492, Aerospace Europe Conference 2023 – 10th EUCASS – 9th CEAS:

> At sub-orbital altitudes, air can be modelled as a continuous flow governed by the Navier-Stokes equations for a multicomponent gas mixture. At hypersonic speeds, however, this physical model must account for various non-equilibrium phenomena, including vibrational and electronic energy relaxation, dissociation and ionization.

https://www.eucass.eu/doi/EUCASS2023-492.pdf
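
For contrast with brute-forcing molecules: the continuum approach that paper describes means solving PDEs on a grid. A minimal 1-D viscous Burgers' solver (a standard toy stand-in for the Navier-Stokes momentum equation, not code from the paper, with assumed initial conditions):

    import numpy as np

    # Toy 1-D viscous Burgers' equation, u_t + u*u_x = nu*u_xx:
    # the classic miniature of Navier-Stokes momentum transport.
    nx, nu = 200, 0.05
    dx = 2 * np.pi / nx
    dt = 0.4 * dx**2 / nu            # explicit stability bound
    x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
    u = np.sin(x) + 1.0              # assumed initial condition

    for _ in range(2000):
        ux  = (np.roll(u, -1) - np.roll(u, 1)) / (2 * dx)       # central u_x
        uxx = (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2  # u_xx
        u = u + dt * (-u * ux + nu * uxx)

    # Each grid cell averages over astronomically many molecules; that
    # averaging is exactly what hypersonic/plasma regimes start to break.
    print(f"final mean={u.mean():.3f}, max={u.max():.3f}")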

replies(1): >>42189366 #
23. dekhn ◴[] No.42189309{4}[source]
Great comment - I have only one thing to add. Many people will enjoy reading "Command and Control", which covers the history of nuclear weapons accidents in the US and how they were managed/mitigated. It's always interesting to learn that a missile silo can explode, popping the warhead up and out (but without it detonating via fission/fusion), and that, from the perspective of the nuclear warhead, the safety controls worked.
24. p_l ◴[] No.42189366{7}[source]
"I wish I could give the finger to Navier-Stokes and brute force every molecules kinematics" does not make for a paper that will get to publication if not accompanied with actually doing that at speed and scale that makes it usable, no matter how many tenured professors dream of it. So instead they just ramp up resolution whenever you give them access to more compute

(Younger generations are worse at it, because the problems that forced the elder ones into more complex approaches can now be an overnight job on their laptop in ANSYS CFX.)

So unfortunately my only source on that is bitching of post-docs and professors, with and without tenure (or rather its equivalent here), at premier such institutions in Poland.

25. rcxdude ◴[] No.42189425[source]
>Are they trying to model every single atom?

Modelling a single nucleus, even one much lighter than uranium, is a capital-H Hard Problem involving many subject matter experts and a lot of optimisation work far beyond 'just throw it on a GPU'. Quantum systems become intractable without very clever approximations and a lot of compute very quickly, and quantum chromodynamics is by far the worst at this. Look up lattice QCD for a relevant keyword.
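
A hedged illustration of that scaling, using plain state-vector quantum mechanics rather than lattice QCD: the exact state of N two-level particles needs 2^N complex amplitudes, so memory alone explodes long before you reach a nucleus.

    # Exponential cost of exact quantum state storage: 2^N amplitudes
    # for N two-level systems (a drastic simplification of real QCD,
    # which is why lattice methods and clever approximations exist).
    BYTES_PER_AMP = 16  # one complex128

    for n in (30, 50, 100, 235):
        amps = 2 ** n
        bytes_needed = amps * BYTES_PER_AMP
        print(f"N={n:3d}: 2^{n} amplitudes ~ {bytes_needed:.3e} bytes")
    # N=50 already needs ~1.8e16 bytes (~18 PB); N=235 (one uranium
    # nucleus worth of nucleons, ignoring quarks) is beyond astronomy.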

26. piombisallow ◴[] No.42189704[source]
These usually get split into nodes and scientists can access some nodes at a time. The whole thing isn't working on a single problem.
27. SoftTalker ◴[] No.42189930{4}[source]
> Another consideration is cost. Nuclear weapons are expensive to make, so a design that can get a high yield out of a small amount of fissile material is preferred. Maintenance, and the cost of maintenance, is also relevant. Will the weapon still work in 30 years, and how much money is required to ensure that?

I've seen speculation that Russia's (former Soviet) nuclear weapons are so old and poorly maintained that they probably wouldn't work. Not that anyone wants to find out.

28. sliken ◴[] No.42189996[source]
Well, there's a fair bit of chemistry related to the explosives that bring the sub-critical bits together. Time scales are in the nanosecond range. Then, as the subcritical bits get closer, the nuclear effects obviously start to dominate. Things like beryllium are used to reflect and intensify the chain reaction. All of that is basically just a starter for the fusion reaction, which often involves uranium, lithium deuteride, and more plutonium.

So it involves very small time scales, chemistry, fission, fusion, creating and channeling plasmas, high neutron fluxes, extremely high pressures, and of course the exponential release of amazing amounts of energy as matter is literally converted to energy and temperatures exceeding those in the sun.

Then add to all of that the reality of aging. Explosives can degrade, the structure can weaken (from age and radiation), radioactive materials have half-lives, etc. What should the replacement rate be? What kind of maintenance would lengthen the useful lives of the weapons? What fraction of the arsenal should work at any given time? How will vibration during delivery impact the above?

Seems like plenty to keep a supercomputer busy.
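
The nanosecond timescales fall out of simple chain-reaction arithmetic. A sketch using the textbook ~10 ns fission generation time (a "shake"); the supercriticality and the 1 g target are assumed for illustration:

    import math

    # Toy exponential chain reaction: doublings at ~one generation each.
    TAU = 1e-8   # ~10 ns per fission generation (one "shake"), textbook value

    # Generations needed to fission ~2.6e21 atoms (about 1 g of U-235),
    # assuming the population roughly doubles each generation:
    generations = math.log2(2.6e21)          # ~71 doublings
    t_total = generations * TAU

    print(f"{generations:.0f} generations ~ {t_total*1e6:.2f} microseconds")
    # The whole release happens in under a microsecond, dominated by the
    # last few generations; hence nanosecond-resolution multiphysics steps.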

replies(1): >>42190329 #
29. handfuloflight ◴[] No.42190207{4}[source]
How do you know all this?
30. GemesAS ◴[] No.42190235[source]
Modern weapon codes couple computationally heavy physics like radiation and neutron transport, hydrodynamics, plasma physics, and chemistry. While a 1-D or 2-D simulation might not be too heavy in compute, large ensembles of simulations are often done for UQ (uncertainty quantification) or sensitivity analysis in design work.
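
A miniature of what such an ensemble looks like (the stand-in model and parameter ranges here are invented): sample the uncertain inputs, run the model many times, and report the spread.

    import random, statistics

    # Toy uncertainty quantification: propagate input uncertainty through
    # a placeholder model by brute-force sampling.
    def toy_model(cross_section, density):
        return cross_section * density ** 2   # invented stand-in physics

    samples = []
    for _ in range(10_000):
        xs  = random.gauss(1.00, 0.05)   # uncertain cross-section, assumed
        rho = random.gauss(19.0, 0.30)   # uncertain density (g/cm^3), assumed
        samples.append(toy_model(xs, rho))

    mean = statistics.fmean(samples)
    sd   = statistics.stdev(samples)
    print(f"output: {mean:.1f} +/- {sd:.1f} ({sd/mean:.1%} relative)")
    # Each real ensemble member is a full 2-D/3-D multiphysics run,
    # which is where the node-hours actually go.
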
31. ethbr1 ◴[] No.42190329[source]
I'd never considered this, but do the high temperatures impose additional computational requirements on the chemical portions?

I'd assume computing atomic behavior at 0K is a lot simpler than at 800,000,000K, over the same time step. ;)

32. ethbr1 ◴[] No.42190387{4}[source]
Small addition: weapon precision has drastically increased since the days of the monster bombs.

Less need for 9 megatons against a hardened silo if you have a 1.2-megaton weapon with a 120 m CEP.
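
The open-literature model behind that trade-off: single-shot kill probability is SSKP = 1 - 0.5^((LR/CEP)^2), and lethal radius LR grows only as yield^(1/3), so precision beats yield. A sketch with the numbers above (the lethal-radius constant is assumed):

    # Precision vs yield against a hard target, per the standard
    # open-literature model: SSKP = 1 - 0.5 ** ((LR / CEP) ** 2),
    # with LR scaling as the cube root of yield.
    LR_CONST = 120.0   # metres of lethal radius per Mt^(1/3), assumed

    def sskp(yield_mt, cep_m):
        lr = LR_CONST * yield_mt ** (1 / 3)
        return 1 - 0.5 ** ((lr / cep_m) ** 2)

    print(f"9 Mt,   900 m CEP: {sskp(9.0, 900):.0%}")   # ~5%, old monster bomb
    print(f"1.2 Mt, 120 m CEP: {sskp(1.2, 120):.0%}")   # ~54%, modern precise one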

33. wbl ◴[] No.42190500[source]
And if they really want to know: https://www.energy.gov/nnsa/working-nnsa
34. glial ◴[] No.42191206[source]
I bet the bulk of it is still super-fast Fortran code.