
169 points mattmarcus | 3 comments
nativeit ◴[] No.43613007[source]
We’re all just elementary particles being clumped together in energy gradients, therefore my little computer project is sentient—this is getting absurd.
replies(4): >>43613040 #>>43613242 #>>43613636 #>>43613675 #
nativeit ◴[] No.43613040[source]
Sorry, this is more about the discussion of this article than the article itself. The moving goal posts that acolytes use to declare consciousness are becoming increasingly cult-y.
replies(2): >>43613083 #>>43613334 #
wongarsu ◴[] No.43613083[source]
We spent 40 years moving the goal posts on what constitutes AI. Now we seem to have found an AI worthy of that title and instead start moving the goal posts on "consciousness", "understanding" and "intelligence".
replies(7): >>43613139 #>>43613150 #>>43613171 #>>43613259 #>>43613347 #>>43613368 #>>43613468 #
bluefirebrand ◴[] No.43613171[source]
> Now we seem to have found an AI worthy of that title and instead start moving the goal posts on "consciousness", "understanding" and "intelligence".

We didn't "find" AI, we invented systems that some people want to call AI, and some people aren't convinced it meets the bar

It is entirely reasonable for people to realize we set the bar too low when it is a bar we invented

replies(1): >>43613243 #
1. darkerside ◴[] No.43613243[source]
What should the bar be? Should it be higher than it is for the average human? Or even the least intelligent human?
replies(2): >>43613462 #>>43613561 #
2. joe8756438 ◴[] No.43613462[source]
There is no such bar.

We don’t even have a good way to quantify human ability. The idea that we could suddenly develop a technique to quantify human ability because we now have a piece of technology that would benefit from that quantification is absurd.

That doesn’t mean we shouldn’t try to measure the ability of an LLM. But it does mean that the techniques used to quantify an LLM's ability are not something that can be applied to humans outside of narrow focus areas.

3. bluefirebrand ◴[] No.43613561[source]
Personally I don't care what the bar is, honestly

Call it AI, call it LLMs, whatever

As long as we continue to recognize that it is a tool that humans can use, and don't start trying to treat it as a human, or as a life, I won't complain

I'm saving my anger for when idiots start to argue that LLMs are alive and deserve human rights