
566 points | PaulHoule | 1 comment
EigenLord No.44497385
Diffusion is just the logically optimal behavior for searching massively parallel spaces without informed priors. We need to think beyond language modeling, however, and start to view this in terms of drug discovery etc. A good diffusion model + the laws of chemistry could be god-tier. I think language modeling has the AI community in its grip right now, and they aren't seeing the applications of the same techniques to real-world problems elsewhere.
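
A minimal sketch of the kind of reverse-diffusion search loop the comment describes, assuming a toy hand-written denoiser in place of a trained network (every name and number here is illustrative); the commented guidance line marks where a domain prior such as a chemistry score could be injected:

    # Toy DDPM-style ancestral sampling: start from pure noise, denoise iteratively.
    import numpy as np

    rng = np.random.default_rng(0)
    T = 1000
    betas = np.linspace(1e-4, 0.02, T)      # noise schedule (assumed)
    alphas = 1.0 - betas
    alpha_bars = np.cumprod(alphas)

    def predicted_noise(x, t):
        # Stand-in for a trained denoiser eps_theta(x, t); here it just
        # pulls samples toward the origin so the loop is runnable.
        return x * np.sqrt(1.0 - alpha_bars[t])

    x = rng.standard_normal(8)              # begin the search from pure Gaussian noise
    for t in reversed(range(T)):
        eps = predicted_noise(x, t)
        # Posterior mean: strip out the predicted noise component.
        x = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps) / np.sqrt(alphas[t])
        # Guidance hook (hypothetical): x += step * grad_log_p_chemistry(x)
        if t > 0:
            x += np.sqrt(betas[t]) * rng.standard_normal(8)

    print(x)  # samples end up concentrated near the toy target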
replies(2): >>44497405 >>44497430
1. dawnofdusk No.44497405
Actually, in most deep learning schemes for science, adding the "laws of nature" as constraints makes things much worse. For example, all the best weather prediction models use essentially zero fluid dynamics. Even though (a) global weather can in principle be predicted with the Navier-Stokes equations, and (b) deep learning models can be used to approximately evaluate the Navier-Stokes equations, we now know that incorporating physics into these models is mostly a mistake.

The intuitive reason might be that unconstrained optimization is easier than constrained optimization, particularly in high dimensions, but no one really knows for sure. It may be that we are not yet at the end of the "bigger is better" regime, and that at the true frontier we will have to add the laws of nature back in to eke out the last remaining bits of performance.
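
For concreteness, a hedged sketch of what "adding the laws of nature as constraints" usually means in practice: a soft PDE-residual penalty added to the data-fit loss (the physics-informed recipe). The toy 1-D advection equation u_t + c*u_x = 0 and all names below are illustrative, not anything from the comments above.

    import numpy as np

    rng = np.random.default_rng(0)
    c = 1.0                                    # advection speed (assumed)
    x = np.linspace(0.0, 1.0, 64)
    t = np.linspace(0.0, 1.0, 64)
    X, T_ = np.meshgrid(x, t, indexing="ij")
    u_true = np.sin(2 * np.pi * (X - c * T_))  # exact solution of the toy PDE
    u_obs = u_true + 0.1 * rng.standard_normal(u_true.shape)

    def losses(u_pred):
        # Unconstrained objective: just fit the observations.
        data_loss = np.mean((u_pred - u_obs) ** 2)
        # Physics residual via finite differences: u_t + c*u_x should be ~0.
        u_t = np.gradient(u_pred, t, axis=1)
        u_x = np.gradient(u_pred, x, axis=0)
        physics_loss = np.mean((u_t + c * u_x) ** 2)
        return data_loss, physics_loss

    data_loss, physics_loss = losses(u_obs)
    lam = 0.1                                  # how hard to enforce the physics
    print("unconstrained objective:", data_loss)
    print("constrained objective:  ", data_loss + lam * physics_loss)

The weight lam is the knob the comment is arguing about: the empirical finding is that turning it up often hurts, while the speculation is that it may pay off again once unconstrained models stop scaling.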