
110 points PaulHoule | 1 comment
karim79 ◴[] No.43552251[source]
I truly hope that the common theme of headlines like "JWST Just Found Something Which Should Not Exist" will not be augmented by stuff like "we used AI(tm) to figure out X, Y, Z".

The last thing we need is hallucinations fucking up the more grounded astrophysics. I'm not saying that is what is happening; I just worry about stuff like this: AI causing us to bark up the wrong tree, and so forth.

replies(6): >>43552334 #>>43552343 #>>43552572 #>>43553400 #>>43554487 #>>43556476 #
1. NitpickLawyer ◴[] No.43554487[source]
> The last thing we need is hallucinations fucking up the more grounded astrophysics.

You're thinking of the wrong ML. Generative models "hallucinate", and that's as much a feature as it is a bug. ML in astrophysics is not generative: it's used for flagging, "binning" data, and, broadly speaking, classification.
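
The "binning and flagging" the comment describes can be sketched in miniature. This is a hypothetical illustration, not any real survey pipeline: the function name, bin size, and sigma threshold are invented, and a robust median/MAD statistic stands in for whatever trained classifier a real pipeline would use. The point is that the output is a deterministic set of flagged indices for human review, with nothing generated.

```python
# Hypothetical illustration of "binning" data and flagging outliers, the
# kind of non-generative classification task the comment describes.
from statistics import median

def bin_and_flag(values, bin_size=5, sigma=3.0):
    """Split `values` into fixed-size bins; within each bin, flag the
    indices whose robust z-score (median/MAD based) exceeds `sigma`."""
    flagged = []
    for start in range(0, len(values), bin_size):
        chunk = values[start:start + bin_size]
        med = median(chunk)
        mad = median(abs(v - med) for v in chunk)
        if mad == 0:
            continue  # no spread to measure against; skip the bin
        scale = 1.4826 * mad  # MAD scaled to a stdev-equivalent for Gaussian data
        for i, v in enumerate(chunk, start=start):
            if abs(v - med) / scale > sigma:
                flagged.append(i)
    return flagged

# A flat signal with one spike: only the spike gets flagged for review.
data = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 9.0, 1.1, 0.95, 1.0]
print(bin_and_flag(data))  # -> [6]
```

Using the median and MAD rather than mean and standard deviation keeps a single large outlier from inflating its own bin's spread estimate and hiding itself.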