
169 points mattmarcus | 3 comments
nativeit No.43613007
We’re all just elementary particles being clumped together in energy gradients, therefore my little computer project is sentient—this is getting absurd.
nativeit No.43613040
Sorry, this is more about the discussion of this article than the article itself. The way acolytes keep moving the goal posts for declaring consciousness is becoming increasingly cult-y.
wongarsu No.43613083
We spent 40 years moving the goal posts on what constitutes AI. Now that we seem to have found an AI worthy of that title, we've instead started moving the goal posts on "consciousness", "understanding", and "intelligence".
1. cayley_graph No.43613139
Indeed, science is a process of discovery, of adjusting goals and expectations. It is not a mountain to be summited. It is telling that the LLM boosters do not understand this, while those with a genuine interest in pushing forward our understanding of cognition do.
2. delusional No.43613371
They believe that once they reach this summit, everything else becomes a trivial problem to be posed to the almighty AI. It's not that they don't understand the process; it's that they think AI is going to disrupt that process.

They literally believe that the AI will supersede the scientific process. It's crypto shit all over again.

3. redundantly No.43613440
Well, if that summit were reached and AI were able to improve itself trivially, I'd be willing to concede that they've reached their goal.

Anything less than that, meh.