1481 points sandslash | 7 comments
tudorizer ◴[] No.44319472[source]
95% terrible expression of the landscape, 5% neatly dumbed down analogies.

English is a terrible language for deterministic outcomes in complex/complicated systems. Vibe coders won't understand this until they are 2 years into building the thing.

LLMs have their merits and he sometimes alludes to them, although it almost feels accidental.

Also, you don't spend years studying computer science to learn the language/syntax, but rather the concepts and systems, which don't magically disappear with vibe coding.

This whole direction is a cheeky Trojan horse. A dramatic problem, hidden in a flashy solution, to which a fix will be upsold 3 years from now.

I'm excited to come back to this comment in 3 years.

replies(10): >>44319579 #>>44319777 #>>44320017 #>>44320108 #>>44320322 #>>44320523 #>>44320547 #>>44320613 #>>44320728 #>>44320743 #
1. oc1 ◴[] No.44320017[source]
AI is all about the context window. If you figure out the context problem, you will see that all these "AI is bullshit, it doesn't work and can't produce working code" complaints go away. Same for everything else.
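
A rough sketch of what I mean by the context problem (a toy illustration with made-up names and a crude token estimate, not any particular tool): the hard part isn't the model, it's deciding what fits in the window.

```typescript
// Toy context packer: pick the most relevant snippets that fit a token budget
// before handing them to the model. Real tools add retrieval, ranking, and
// summarization, but the core constraint is the same: a finite window.
interface Snippet {
  path: string;
  text: string;
  relevance: number; // however you score it: embeddings, recency, etc.
}

function packContext(snippets: Snippet[], tokenBudget: number): Snippet[] {
  const estimateTokens = (s: string) => Math.ceil(s.length / 4); // crude heuristic
  const picked: Snippet[] = [];
  let used = 0;
  for (const s of [...snippets].sort((a, b) => b.relevance - a.relevance)) {
    const cost = estimateTokens(s.text);
    if (used + cost > tokenBudget) continue; // skip what doesn't fit
    picked.push(s);
    used += cost;
  }
  return picked;
}
```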
replies(2): >>44320121 #>>44320292 #
2. tudorizer ◴[] No.44320121[source]
Working code or not is irrelevant. Heck, even human-in-the-loop (Tony-in-the-Iron-Man) is not really the point. If we're going into "it's all about" territory, then it's all about:

- training data
- approximation of the desired outcome

Neither supports a good direction for the complexity of some of the systems around us, most of which require a dedicated language. Imagine doing calculus or quantum physics in English. Novels of words would barely suffice.
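
To make the contrast concrete, compare one line of standard notation with its English rendering (plain first-year calculus, purely as an illustration):

```latex
% The formal statement:
f'(x) = \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
% The English version: "the derivative of f at x is the value approached by the
% ratio of the change in f's output to the change in its input, as that change
% in the input shrinks toward zero" -- and this is one of the easy ones.
```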

So a context window as big as the training data itself?

What if the training data is faulty?

I'm confident you understand that working code or not doesn't matter in this analogy. Neither do LLMs reaching for the right tool.

LLMs have their merits. Replacing concrete systems that require a formal language and grammar is not one of them.

`1 + 1 = 2` because that's how maths works, not because of déjà vu.
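
That's the point of a formal system: the fact is checked, not recalled. In Lean, for instance (any proof assistant makes the same point):

```lean
-- Accepted because the kernel checks it, not because it resembles training data:
example : 1 + 1 = 2 := rfl
```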

replies(1): >>44320573 #
3. cobertos ◴[] No.44320292[source]
Untrue. I find problems involving niche knowledge, heavy math, and/or a lack of good online resources to be troublesome for AI. Consistent struggle points I've found so far are shaders, parsers, and streams (in Node.js, at least).
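
To give a flavour of the streams case, this is the sort of thing I mean (a toy example of my own; in practice the trouble tends to be backpressure and the _transform/callback contract):

```typescript
import { Transform, TransformCallback } from "node:stream";

// Minimal correct Transform: push the transformed chunk, then signal
// completion exactly once via the callback.
class UpperCase extends Transform {
  _transform(chunk: Buffer, _enc: BufferEncoding, callback: TransformCallback): void {
    this.push(chunk.toString("utf8").toUpperCase());
    callback();
  }
}

// Usage: pipe stdin through it and let pipe() handle backpressure.
process.stdin.pipe(new UpperCase()).pipe(process.stdout);
```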

Context window will solve a class of problems, but will not solve all problems with AI.

replies(1): >>44326648 #
4. gardenhedge ◴[] No.44320573[source]
Tony is Iron Man, not in him
replies(1): >>44320649 #
5. tudorizer ◴[] No.44320649{3}[source]
Sure, I wasn't sure what to call the robot layer. Is it the "Iron Man suit"?
replies(1): >>44327647 #
6. diggan ◴[] No.44326648[source]
I think the biggest help I've gotten from LLMs is with things that are "niche" knowledge for me. Things like "I need a math-heavy function that, when I give it X and Y, returns Z". When I'm writing games for fun, I could sometimes struggle with that for days, but with LLMs I can have it done and move on in a couple of minutes. The most time-consuming part is writing the tests and overall testing, but I no longer spend days just trying to understand enough math to actually write the thing.
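
To make it concrete, here's the flavour of thing I mean (a hypothetical example with made-up names; it's the classic ballistic-aim formula, where the LLM saves me the derivation):

```typescript
// Given launch speed, gravity (positive), and a target offset (x right, y up),
// return the flatter of the two launch angles that hits the target, in radians,
// or null if the target is out of range. Derived from the projectile equation:
// tan(theta) = (v^2 - sqrt(v^4 - g*(g*x^2 + 2*y*v^2))) / (g*x)
function launchAngle(speed: number, gravity: number, x: number, y: number): number | null {
  if (x <= 0 || gravity <= 0) return null; // only aiming forward, sane gravity
  const v2 = speed * speed;
  const disc = v2 * v2 - gravity * (gravity * x * x + 2 * y * v2);
  if (disc < 0) return null; // target unreachable at this speed
  return Math.atan((v2 - Math.sqrt(disc)) / (gravity * x));
}
```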
7. gardenhedge ◴[] No.44327647{4}[source]
It's just a suit of armour. There are many, and they're referred to as Mark I, II, III, etc.