
1479 points sandslash | 2 comments
tudorizer No.44319472
95% terrible expression of the landscape, 5% neatly dumbed-down analogies.

English is a terrible language for specifying deterministic outcomes in complex/complicated systems. Vibe coders won't understand this until they are 2 years into building the thing.

LLMs have their merits and he sometimes alludes to them, although it almost feels accidental.

Also, you don't spend years studying computer science to learn the language/syntax, but rather the concepts and systems, which don't magically disappear with vibe coding.

This whole direction is a cheeky Trojan horse: a dramatic problem hidden in a flashy solution, for which a fix will be upsold 3 years from now.

I'm excited to come back to this comment in 3 years.

replies(10): >>44319579 #>>44319777 #>>44320017 #>>44320108 #>>44320322 #>>44320523 #>>44320547 #>>44320613 #>>44320728 #>>44320743 #
diggan No.44319579
> English is a terrible language for deterministic outcomes in complex/complicated systems

You seem to be under the impression that Karpathy somehow alluded to or hinted at that in his talk. That suggests you haven't actually watched it, which makes your first point kind of weird.

I feel like one of the stronger points he made was that you cannot treat LLMs as something they're explicitly not, so why would anyone expect deterministic outcomes from them?

He's making the case for coding with LLMs, not for letting LLMs write code on their own ("vibe coding"), and for understanding how they work before attempting to do so.
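
To make the non-determinism point concrete, here's a toy sketch (my own illustration, nothing from the talk): with a temperature above zero, the next token is sampled from a probability distribution over the vocabulary, so the same prompt can come back with different completions.

    import math
    import random

    def sample_next_token(logits, temperature=0.8):
        # Scale logits by temperature, then softmax into probabilities.
        scaled = [l / temperature for l in logits]
        m = max(scaled)
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw a token index according to those probabilities.
        return random.choices(range(len(logits)), weights=probs, k=1)[0]

    logits = [2.0, 1.5, 0.3, -1.0]  # made-up "model output" for one prompt
    print([sample_next_token(logits) for _ in range(10)])  # varies run to run

Greedy decoding (temperature effectively 0) would be repeatable, but that's not how these assistants are typically run.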

replies(1): >>44319869 #
tudorizer No.44319869
I watched the entire talk, quite carefully. He explicitly states how excited he was about his tweet mentioning English.

The disclaimer you mention was indeed there, although it's "in one ear, out the other" with most of his audience.

If I give you a glazed donut with a brief asterisk about how sugar can cause diabetes, will it stop you from eating the donut?

You also expect deterministic outcomes when making analogies with power plants and fabs.

replies(3): >>44319978 #>>44320055 #>>44320091 #
diggan No.44320055
I think this is the moment you're referring to? https://youtu.be/LCEmiRjPEtQ?si=QWkimLapX6oIqAjI&t=236

> maybe you've seen a lot of GitHub code is not just like code anymore there's a bunch of like English interspersed with code and so I think kind of there's a growing category of new kind of code so not only is it a new programming paradigm it's also remarkable to me that it's in our native language of English and so when this blew my mind a few uh I guess years ago now I tweeted this and um I think it captured the attention of a lot of people and this is my currently pinned tweet uh is that remarkably we're now programming computers in English now

I agree that it's remarkable that you can tell a computer "What is the biggest city in Maresme?" and it tries to answer that question. I don't think he's saying "English is the best language to make complicated systems uncomplicated with", or anything to that effect. Just like I still think "Wow, this thing is fucking flying" every time I sit on board an airplane, LLMs are kind of incredible in some ways, yet so "dumb" in some other ways. It sounds to me like he's sharing a similar sentiment, but about LLMs.

> although it's "in one ear, out the other" with most of his audience.

Did you talk with them? Otherwise this is just creating an imaginary argument against people you assume didn't listen.

> If I give you a glazed donut with a brief asterisk about how sugar can cause diabetes will it stop you from eating the donut?

If I wanted to eat a donut at that point, I guess I'd eat it anyway? But my aversion to risk (or rather the lack of it) tends to be atypical.

What does my answer mean in the context of LLMs and non-determinism?

> You also expect deterministic outcomes when making analogies with power plants and fabs.

Are you saying that the analogy should be deterministic, or that power plants and fabs are deterministic? If the former, I don't understand it; and the latter really isn't deterministic by any definition of the word I recognize.

replies(2): >>44320184 #>>44320290 #
tudorizer No.44320184
> Did you talk with them? Otherwise this is just creating an imaginary argument against some people you just assume they didn't listen.

I have, unfortunately. Start-up founders, managers, investors who scoff at the need for engineers because "AI can fix it".

Don't get me wrong, there are plenty of "stochastic parrot" engineers even without AI, but still, not enough to make blanket statements.

replies(1): >>44320239 #
1. diggan No.44320239
That's a lot of people to talk to in the day or so since the talk happened. Were they all there, and you too, or did you all have a watch party or something?

Still, what's the outcome of our "glazed donut" argument? You've got me curious where it leads. Did I die of diabetes?

replies(1): >>44320666 #
2. jbeninger No.44320666
I think the analogy is that vibe coding is bad for you but feels good. Like a donut.

But I'd say the real situation is more akin to "if you eat this donut quickly, you might get diabetes, but if you eat it slowly, it's fine", which is a bad analogy, but a bit more accurate.