The last thing we need is hallucinations fucking up the more grounded astrophysics. I'm not saying that's what is happening; I just worry about stuff like this. AI causing us to bark up the wrong tree, and so forth.
What if we had that view with microscopes, back when?
I fully see the point being made above. If AI takes over, it's because we are, day by day it seems, slowly placing that faith in it.
What's a "wow" to us is something future generations will take for granted.
The "much more in-depth" ways become just "the way".
Like telescopes?
But yes, like telescopes. Or microscopes. Those still bind us to using our built-in sensors that we "trust".
Then we obviously get into radio telescopes, or down to electron microscopes, etc., and we start having to believe in the tech to get our newfound understandings.
My mental hesitation lies in trusting AI to get to that level of belief -- if/when that happens, what do we really know or trust?
I'm really not sure what you're getting at here, but you definitely seem to be confusing this with generative AI. What's being discussed here is not generative AI; it's just a very refined algorithm searching for patterns in images. This is not "artist's conception" type content like the image of the black hole. So until you accept the difference, you're just spinning your wheels.