
1479 points sandslash | 2 comments
tudorizer ◴[] No.44319472[source]
95% terrible expression of the landscape, 5% neatly dumbed-down analogies.

English is a terrible language for deterministic outcomes in complex/complicated systems. Vibe coders won't understand this until they are 2 years into building the thing.

LLMs have their merits and he sometimes alludes to them, although it almost feels accidental.

Also, you don't spend years studying computer science to learn the language/syntax, but rather the concepts and systems, which don't magically disappear with vibe coding.

This whole direction is a cheeky Trojan horse. A dramatic problem, hidden in a flashy solution, to which a fix will be upsold 3 years from now.

I'm excited to come back to this comment in 3 years.

replies(10): >>44319579 #>>44319777 #>>44320017 #>>44320108 #>>44320322 #>>44320523 #>>44320547 #>>44320613 #>>44320728 #>>44320743 #
diggan ◴[] No.44319579[source]
> English is a terrible language for deterministic outcomes in complex/complicated systems

You seem to be under the impression that Karpathy somehow alluded to or hinted at that in his talk, which suggests you haven't actually watched it, and that makes your first point kind of weird.

I feel like one of the stronger points he made was that you cannot treat LLMs as something they're explicitly not, so why would anyone expect deterministic outcomes from them?

He's making the case for coding with LLMs, not letting the LLMs go by themselves writing code ("vibe coding"), and understanding how they work before attempting to do so.
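(To make the non-determinism point concrete: it isn't an implementation bug, it falls out of how decoding works. A minimal sketch, not from the talk, with toy next-token scores I made up for illustration — greedy decoding is deterministic, but sampling at temperature > 0 generally is not:)

```python
import math
import random

def softmax(logits, temperature):
    # Scale logits by temperature, then normalize into probabilities.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_token(logits, temperature, rng):
    """Pick a next-token index. Temperature 0 means greedy (argmax),
    which is deterministic; temperature > 0 samples from the softmax."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    probs = softmax(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.5, 0.5]  # hypothetical scores for three candidate tokens
rng = random.Random(42)

greedy = {sample_token(logits, 0, rng) for _ in range(100)}
sampled = {sample_token(logits, 1.0, rng) for _ in range(100)}
print(greedy)   # always the single argmax token
print(sampled)  # typically several different tokens
```

Same prompt, same model, different completions — which is exactly why expecting deterministic outcomes from a sampled decoder is expecting the wrong thing.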

replies(1): >>44319869 #
tudorizer ◴[] No.44319869[source]
I watched the entire talk, quite carefully. He explicitly states how excited he was about his tweet mentioning English.

The disclaimer you mention was indeed mentioned, although it's "in one ear, out the other" with most of his audience.

If I give you a glazed donut with a brief asterisk about how sugar can cause diabetes will it stop you from eating the donut?

You also expect deterministic outcomes when making analogies with power plants and fabs.

replies(3): >>44319978 #>>44320055 #>>44320091 #
diggan ◴[] No.44320055{3}[source]
I think this is the moment you're referring to? https://youtu.be/LCEmiRjPEtQ?si=QWkimLapX6oIqAjI&t=236

> Maybe you've seen: a lot of GitHub code is not just code anymore, there's a bunch of English interspersed with code, and so I think there's a growing category of new kind of code. So not only is it a new programming paradigm, it's also remarkable to me that it's in our native language of English. When this blew my mind a few, I guess, years ago now, I tweeted this, and I think it captured the attention of a lot of people, and this is my currently pinned tweet: that remarkably, we're now programming computers in English now.

I agree that it's remarkable that you can tell a computer "What is the biggest city in Maresme?" and it tries to answer that question. I don't think he's saying "English is the best language to make complicated systems uncomplicated with", or anything to that effect. Just like I still think "Wow, this thing is fucking flying" every time I sit onboard an airplane, LLMs are kind of incredible in some ways, yet so "dumb" in others. It sounds to me like he's sharing a similar sentiment, but about LLMs.

> although it's "in one ear, out the other" with most of his audience.

Did you talk with them? Otherwise this is just creating an imaginary argument against some people you simply assume didn't listen.

> If I give you a glazed donut with a brief asterisk about how sugar can cause diabetes will it stop you from eating the donut?

If I wanted to eat a donut at that point, I guess I'd eat it anyways? But my aversion to risk (or rather the lack of it) tends to be atypical.

What does my answer mean in the context of LLMs and non-determinism?

> You also expect deterministic outcomes when making analogies with power plants and fabs.

Are you saying that the analogy should be deterministic, or that power plants and fabs are deterministic? If the former, I don't understand the point; if the latter, power plants and fabs really aren't deterministic by any definition of that word I recognize.

replies(2): >>44320184 #>>44320290 #
1. tudorizer ◴[] No.44320290{4}[source]
> That's a lot of people to talk to in a day more or less, since the talk happened. Were they all there and you too, or you all had a watch party or something?

hehe, I wish.

The topics in the talk are not new. They have been explored and pondered for quite a while now.

As for the outcome of the donut experiment, I don't know. You tell me. Apply it repeatedly at a big scale and see if you should alter the initial offer for best outcomes (as relative as "best" might be).

replies(1): >>44320363 #
2. diggan ◴[] No.44320363[source]
> The topics in the talk are not new.

Sure, but your initial dismissal ("95% X, 5% Y") is literally about this talk, no? And when you say 'it's "in one ear, out the other" with most of his audience', is that based on some previous experience rather than the talk itself? I guess I got confused about what applied to which event.

> As for the outcome of the donut experiment, I don't know. You tell me. Apply it repeatedly at a big scale and see if you should alter the initial offer for best outcomes (as relative as "best" might be).

Maybe I'm extra slow today, but how does this tie into our conversation so far? Does it have anything to do with determinism, or what was the idea behind bringing it up? I'm afraid you're gonna have to spell it out for me, sorry about that :)