
108 points bertman | 1 comment
ninetyninenine No.43822709
Can he prove what he says? The foundation of his argument rests on hand-waving and vague definitions of what a mind is and what a theory is, so it ultimately goes nowhere. Then he makes a claim and doesn't back it up.

I have a new concept for the author to understand: proof. He doesn’t have any.

Let me tell you something about LLMs. We don't understand what's going on internally. LLMs say things that are true and untrue, just like humans do, and we don't know whether what they say reflects a general lack of theory-building ability, or lying, or flickers of theory building mixed with delusion at other times. We literally do not know. The whole thing is a black box that we can only poke at.

What ticks me off is all these geniuses who write these blog posts with the authority of a know-it-all when clearly we have no fucking clue about what's going on.

Even more genius is when he uses concepts like "mind" and "theory building", the most hand-wavy, disagreed-upon words in existence, and rests his foundations on those words when no two people ever really agree on what these fucking things are.

You can muse philosophically all you want, in any direction, but it's all bs without definitive proof. It's like religion: people made up shit about nature because they didn't truly understand it. That's the idiocy of this article. It's building a religious following and making wild claims without proof.

replies(2): >>43825952 #>>43855952 #
1. stevenhuang No.43855952
Well said. I wouldn't be surprised if, at the root of it, these people are engaging in motivated reasoning from the belief that there is something special about the mind that machines cannot possess.

Akin to the wayward belief that animals can't feel pain, only humans can. We now realize that's wrong: some animals perceive pain and suffer just as much as humans do.

I would not be surprised if we come to a similar realization about LLMs and our understanding of what it means to reason.