
170 points PaulHoule | 4 comments
measurablefunc ◴[] No.45120049[source]
There is a formal extensional equivalence between Markov chains & LLMs, but the only person who seems to be saying anything about this is Gary Marcus. He is constantly making the point that symbolic understanding cannot be reduced to a probabilistic computation: no matter how large the graph gets, it will still be missing basic machinery like backtracking (which programming languages like Prolog provide natively; a minimal sketch of what backtracking buys you follows below). I think Gary is right on basically all counts. Probabilistic generative models are fun, but no amount of probabilistic sequence generation can be a substitute for logical reasoning.
replies(16): >>45120249 #>>45120259 #>>45120415 #>>45120573 #>>45120628 #>>45121159 #>>45121215 #>>45122702 #>>45122805 #>>45123808 #>>45123989 #>>45125478 #>>45125935 #>>45129038 #>>45130942 #>>45131644 #
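
A minimal sketch (not from the thread; Python rather than Prolog, purely for illustration, and the graph and names are made up) of the backtracking the parent is pointing at: a depth-first search that can retract a choice once it turns out to lead to a dead end. The contrast being drawn is that a sampler which emits one token at a time and never revises it has no analogous retraction step.

    # Backtracking spelled out as explicit depth-first search over a small
    # 2-colouring problem. The graph is 2-colourable, but with this vertex
    # order a greedy "pick the first colour that fits and never look back"
    # strategy paints itself into a corner at a3; the search below recovers
    # only because it can undo earlier choices.

    NEIGHBOURS = {            # crown graph: a_i -- b_j for every i != j
        "a1": ["b2", "b3"], "b1": ["a2", "a3"],
        "a2": ["b1", "b3"], "b2": ["a1", "a3"],
        "a3": ["b1", "b2"], "b3": ["a1", "a2"],
    }
    COLOURS = ["red", "green"]

    def consistent(node, colour, assignment):
        """A colour is allowed only if no already-coloured neighbour shares it."""
        return all(assignment.get(n) != colour for n in NEIGHBOURS[node])

    def solve(nodes, assignment):
        if not nodes:
            return assignment              # every node coloured: success
        node, rest = nodes[0], nodes[1:]
        for colour in COLOURS:
            if consistent(node, colour, assignment):
                assignment[node] = colour  # tentative choice
                if solve(rest, assignment) is not None:
                    return assignment
                del assignment[node]       # dead end: retract and try the next colour
        return None                        # nothing fits here: tell the caller to backtrack

    print(solve(list(NEIGHBOURS), {}))
    # {'a1': 'red', 'b1': 'green', 'a2': 'red', 'b2': 'green', 'a3': 'red', 'b3': 'green'}
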
1. boznz ◴[] No.45120249[source]
Logical reasoning is also based on probability weights; most of the time that probability is so close to 100% that the conclusion can be assumed true without consequence (a small worked example follows below).
replies(1): >>45120935 #
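
A back-of-the-envelope sketch (not from the thread; the error rate is made up) of why that assumption is usually harmless: if each premise or inference step is wrong with probability at most eps, the union bound gives the whole chain of n steps a soundness probability of at least 1 - n*eps, so for tiny eps and short chains "almost surely true" and "true" are interchangeable in practice.

    # Union-bound sketch: n near-certain steps, each wrong with probability <= eps.
    # P(all steps hold) >= 1 - n*eps, so "assume true" costs nothing measurable
    # until n*eps becomes non-negligible.

    def chain_soundness_lower_bound(eps: float, n: int) -> float:
        """Worst-case probability that all n near-certain steps hold together."""
        return max(0.0, 1.0 - n * eps)

    for n in (1, 10, 1_000, 100_000):
        print(f"{n:>7} steps: >= {chain_soundness_lower_bound(1e-6, n):.6f}")
    #       1 steps: >= 0.999999
    #      10 steps: >= 0.999990
    #    1000 steps: >= 0.999000
    #  100000 steps: >= 0.900000
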
2. AaronAPU ◴[] No.45120935[source]
Stunningly, though I have been saying this for 20 years, I’ve never come across anyone else mentioning it until now.
replies(1): >>45125963 #
3. nprateem ◴[] No.45125963[source]
Glue sticks.

Pepperoni falls off pizza.

Therefore, to keep it in place, stick it on with glue...

Not stunned by this reductionist take.

replies(1): >>45153196 #
4. AaronAPU ◴[] No.45153196{3}[source]
Those words contain far more relations than just “sticks” — the reduction is in your framing.