    66 points zdw | 12 comments
    1. throwaway150 ◴[] No.46187883[source]
    > You can’t make anything truly radical with it. By definition, LLMs are trained on what has come before. In addition to being already-discovered territory, existing code is buggy and broken and sloppy and, as anyone who has ever written code knows, absolutely embarrassing to look at.

    I don't understand this argument. I mean the same applies to books. All books teach you what has come before. Nobody says "You can't make anything truly radical with books". Radical things are built by people after reading those books. Why can't people build radical things after learning or after being assisted by LLMs?

    replies(4): >>46188240 #>>46188962 #>>46189004 #>>46189038 #
    2. AdieuToLogic ◴[] No.46188240[source]
    >> You can’t make anything truly radical with it. By definition, LLMs are trained on what has come before. In addition to being already-discovered territory, existing code is buggy and broken and sloppy and, as anyone who has ever written code knows, absolutely embarrassing to look at.

    > I don't understand this argument. I mean the same applies to books. All books teach you what has come before. Nobody says "You can't make anything truly radical with books". Radical things are built by people after reading those books.

    Books share concepts expressed by people who understand those concepts (or purport to), in a manner the reader can relate to. This is achievable because both parties are human and share a largely common lived experience.

    In short, people reason, learn, remember, and can relate with each other.

    > Why can't people build radical things after learning ...

    They absolutely can and often do.

    > ... or after being assisted by LLMs?

    Therein lies the problem. LLMs are not assistants.

    They are statistical token (text) document generators. That's it.

    replies(2): >>46188396 #>>46188566 #
    3. bottd ◴[] No.46188396[source]
    > Therein lies the problem. LLMs are not assistants.

    Assisting a person and being an assistant are not synonymous. A cane assists a man while he walks. It is a stick. That's it.

    replies(1): >>46188572 #
    4. wiseowise ◴[] No.46188566[source]
    > They are statistical token (text) document generators. That's it.

    I don’t know why people post this as some kind of slam dunk.

    replies(2): >>46188930 #>>46189472 #
    5. AdieuToLogic ◴[] No.46188572{3}[source]
    > Assisting a person and being an assistant are not synonymous. A cane assists a man while he walks. It is a stick. That's it.

    The difference here is that no reasonable person claims a cane can teach someone to walk.

    6. urban_alien ◴[] No.46188930{3}[source]
    Not a slam dunk, just a factual statement
    7. altmanaltman ◴[] No.46188962[source]
    People can absolutely build radical things after learning or being assisted by LLMs. But that's not the meta they're selling. What they are selling with vibe-coding is that you can let the AI build things and don't even need to learn how to code. If someone truly believes that, then I would say their chances of building something actually useful or radical are close to 0.

    With books, the sell is not that the book will create your app so you don't even need to learn to code. The sell is that you will learn to code from the book and then use that skill to build the app (often through a painstaking process).

    > Jensen Huang says kids shouldn't learn to code — they should leave it up to AI

    https://www.tomshardware.com/tech-industry/artificial-intell...

    Which book tells you that you shouldn't learn to code but should instead leave it to the book?

    8. preommr ◴[] No.46189004[source]
    > Why can't people build radical things after learning or after being assisted by LLMs?

    Because that's not how this is being marketed.

    I agree with you completely. The best use case I've found for LLMs (and I say this as somebody who does generate a lot of code with them) is as a research tool: an instantaneous and powerful resource that fills the gap left by the long-gone heyday of communities like mailing lists or Stack Overflow, where you had the people - the experts and maintainers - who would seemingly answer within a few hours how something works.

    But that's not enough for all the money being fed into this monster. The AI leadership is hell-bent on trying to build a modern-day Tower of Babel (in more ways than one), where there is no thinking or learning - one click and you have an app! Now you can fire your entire software team, and then ask ChatGPT what to do next when this breaks the economy.

    9. zmmmmm ◴[] No.46189038[source]
    From my point of view, this is a really dumb argument.

    People vastly overrate the novelty of software work. The vast majority of the time, it at least has conceptual similarity to things created before. A lot of the time, being "radically new" is a huge negative: it's a recipe for something nobody can understand or maintain. Almost all software is mild variations on existing things, assembled to create something new in feature space, but it's nearly 100% mind-numbingly boring in the methodology of how it is built.

    10. latentsea ◴[] No.46189472{3}[source]
    At the end of the day, it's true, though? There are times when that suffices, and times when it doesn't.
    replies(1): >>46190140 #
    11. wiseowise ◴[] No.46190140{4}[source]
    It might be true, but it is extremely reductive and used pejoratively.

    > That’s not a book, that’s just a soup of letters! That’s not art, that’s just paint on a sheet! That’s not an instrument, that’s just a bunch of crap glued together!

    If you’re an edgy cynic and treat everything this way, that’s fine. But if you’re singling out LLMs just because you don’t like them, then you’re a hypocrite.

    replies(1): >>46193155 #
    12. brazukadev ◴[] No.46193155{5}[source]
    If you think calling a calculator a calculator is offensive to the point of calling someone a cynic and a hypocrite, you might be a bit too invested.