
LLM Inevitabilism

(tomrenner.com)
1611 points SwoopsFromAbove | 26 comments
mg ◴[] No.44568158[source]
In the 90s a friend told me about the internet. And that he knows someone who is in a university and has access to it and can show us. An hour later, we were sitting in front of a computer in that university and watched his friend surfing the web. Clicking on links, receiving pages of text. Faster than one could read. In a nice layout. Even with images. And links to other pages. We were shocked. No printing, no shipping, no waiting. This was the future. It was inevitable.

Yesterday I wanted to rewrite a program to use a large library that would have required me to dive deep down into the documentation or read its code to tackle my use case. As a first try, I just copy+pasted the whole library and my whole program into GPT 4.1 and told it to rewrite it using the library. It succeeded at the first attempt. The rewrite itself was small enough that I could read all code changes in 15 minutes and make a few stylistic changes. Done. Hours of time saved. This is the future. It is inevitable.

PS: Most replies seem to compare my experience to experiences that the responders have with agentic coding, where the developer is iteratively changing the code by chatting with an LLM. I am not doing that. I use a "One prompt one file. No code edits." approach, which I describe here:

https://www.gibney.org/prompt_coding
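The "one prompt one file" approach can be sketched roughly as follows. This is a hypothetical helper, not the author's actual tooling: the prompt wording, section markers, and the commented-out model call are all illustrative assumptions.

```python
from pathlib import Path

def build_prompt(library_src: str, program_src: str, instruction: str) -> str:
    """Assemble one self-contained prompt: the whole library, the whole
    program, and a single instruction. No follow-up edits are planned."""
    return (
        f"{instruction}\n\n"
        f"--- LIBRARY SOURCE ---\n{library_src}\n\n"
        f"--- PROGRAM SOURCE ---\n{program_src}\n\n"
        "Output the complete rewritten program as one file."
    )

def rewrite(library_path: str, program_path: str) -> str:
    """Read both files and produce the single prompt. The model call
    itself (e.g. via the OpenAI SDK) is sketched in comments only."""
    prompt = build_prompt(
        Path(library_path).read_text(),
        Path(program_path).read_text(),
        "Rewrite the program below to use the library below.",
    )
    # response = client.responses.create(model="gpt-4.1", input=prompt)
    # return response.output_text
    return prompt  # placeholder: return the assembled prompt
```

The point of the pattern is that the entire context goes into one request and the entire output replaces one file, so review is a single 15-minute read rather than an iterative chat.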

replies(58): >>44568182 #>>44568188 #>>44568190 #>>44568192 #>>44568320 #>>44568350 #>>44568360 #>>44568380 #>>44568449 #>>44568468 #>>44568473 #>>44568515 #>>44568537 #>>44568578 #>>44568699 #>>44568746 #>>44568760 #>>44568767 #>>44568791 #>>44568805 #>>44568823 #>>44568844 #>>44568871 #>>44568887 #>>44568901 #>>44568927 #>>44569007 #>>44569010 #>>44569128 #>>44569134 #>>44569145 #>>44569203 #>>44569303 #>>44569320 #>>44569347 #>>44569391 #>>44569396 #>>44569574 #>>44569581 #>>44569584 #>>44569621 #>>44569732 #>>44569761 #>>44569803 #>>44569903 #>>44570005 #>>44570024 #>>44570069 #>>44570120 #>>44570129 #>>44570365 #>>44570482 #>>44570537 #>>44570585 #>>44570642 #>>44570674 #>>44572113 #>>44574176 #
1. shaky-carrousel ◴[] No.44569732[source]
Hours of time saved, and you learned nothing in the process. You are slowly becoming a cog in the LLM process instead of an autonomous programmer. You are losing autonomy and depending more and more on external companies. And one day, with all that power, they'll set whatever price or conditions they want. And you will accept. That's the future. And it's not inevitable.
replies(4): >>44569772 #>>44569796 #>>44569813 #>>44569871 #
2. chii ◴[] No.44569772[source]
> and you learned nothing in the process.

Why do you presume the person wanted to learn something, rather than to get the work done asap? Maybe they're not interested in learning, or maybe they have something more important to do, and saving this time is a life saver?

> You are losing autonomy and depending more and more on external companies

do you also autonomously produce your own clean water, electricity, gas and food? Or do you rely on external companies to provision all of those things?

replies(1): >>44569963 #
3. bt1a ◴[] No.44569796[source]
these 3090s are mine. hands off!
4. 77pt77 ◴[] No.44569813[source]
> Hours of time saved, and you learned nothing in the process

Point and click "engineer" 2.0

We all know this.

Eventually someone has to fix the mess and it won't be him. He will be management by then.

replies(1): >>44571013 #
5. baxtr ◴[] No.44569871[source]
Did you build the house you live in? Did you weave your own clothes or grow your own food?

We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

replies(2): >>44569942 #>>44570007 #
6. shaky-carrousel ◴[] No.44569942[source]
Did you write your own letters? Did you write your own arguments? Did you write your own code? I do, and I don't depend on systems others built to do so. And losing the ability to keep doing so is a pretty big trade-off, in my opinion.
replies(3): >>44570262 #>>44570733 #>>44571102 #
7. shaky-carrousel ◴[] No.44569963[source]
The pretty big difference is that I'm not easily able to produce my own electricity or food. But I'm easily able to produce my own code. We are losing autonomy we already have, out of pure laziness, and it will bite us.
replies(1): >>44570348 #
8. Draiken ◴[] No.44570007[source]
We're talking about a developer here so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

> We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

I agree with this and that's exactly what OP is saying: you're now a cog in the LLM pipeline and nothing else.

If we lived in a saner world this would be purely a net positive, but in our current society it simply means we'll get replaced by the cheaper alternative the second it becomes viable, making any dependence on it extremely risky.

It's not only individuals, either. What happens when our governments depend on LLMs from these private corporations to function, and those corporations start the enshittification phase?

replies(1): >>44570753 #
9. djray ◴[] No.44570262{3}[source]
There seems to be a mistaken thought that having an AI (or indeed someone else) help you achieve a task means you aren't learning anything. This is reductionist. I suggest instead that it's about degrees of autonomy. The person you're responding to made a choice to get the AI to help integrate a library. They chose NOT to have the AI edit the files itself; they rather spent time reading through the changes and understanding the integration points, and tweaking the code to make it their own. This is much different to vibe coding.

I do a similar loop with my use of AI - I will upload code to Gemini 2.5 Pro, talk through options and assumptions, and maybe get it to write some or all of the next step, or to try out different approaches to a refactor. Integrating any code back into the original source is never copy-and-paste, and that's where the learning is. For example, I added Dexie (a library/wrapper for accessing IndexedDB) to a browser extension project the other day, and the AI helped me get started with a minimal amount of initial knowledge, yet I learned a lot about Dexie and have been able to expand upon the code myself since. If I were on my own, I would probably have barrelled ahead and just used IndexedDB directly, resulting in a lot more boilerplate code and time spent doing busywork.

It's this sort of friction reduction that I find most liberating about AI. Trying out a new library isn't a multi-hour slog; instead, you can sample it and possibly reject it as unsuitable almost immediately, without having to waste a lot of time on R&D. In my case, I didn't learn 'raw' IndexedDB, but instead I got the job done with a library offering a more suitable level of abstraction, and saved hours in the process.

This isn't lazy or giving up the opportunity to learn, it's simply optimising your time.

The "not invented here" syndrome is something I kindly suggest you examine, as you may find you are actually limiting your own innovation by rejecting everything that you can't do yourself.

replies(2): >>44570358 #>>44570987 #
10. hackinthebochs ◴[] No.44570348{3}[source]
Reducing friction, increasing the scope of what is possible given a unit of effort, that is just increasing autonomy.
replies(2): >>44570731 #>>44570941 #
11. shaky-carrousel ◴[] No.44570358{4}[source]
It's not reductionist, it's a fact. If you, instead of learning Python, ask an LLM to code you something in Python, you won't learn a line of Python in the process. Even if you read the produced code from beginning to end. Because (and honestly I'm surprised I have to point out this, here of all places) you learn by writing code, not by reading code.
replies(1): >>44570689 #
12. rybosome ◴[] No.44570689{5}[source]
I encourage you to try this yourself and see how you feel.

Recently I used an LLM to help me build a small application in Rust, having never used it before (though I had a few years of high performance C++ experience).

The LLM wrote most of the code, but it was no more than ~100 lines at a time, then I’d tweak, insert, commit, plan the next feature. I hand-wrote very little, but I was extremely involved in the design and layout of the app.

Without question, I learned a lot about Rust. I used tokio’s async runtime, their mpsc channels, and streams to make a high performance crawler that worked really well for my use case.

If I needed to write Rust without an LLM now, I believe I could do it - though it would be slower and harder.

There’s definitely a “turn my brain off and LLM for me” way to use these tools, but it is reductive to state that ALL usage of such tools is like this.
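The channel-based crawler design described above can be sketched as a rough Python analogue using asyncio, with a shared queue standing in for tokio's mpsc channel. The fetch function, worker count, and overall shape are illustrative assumptions, not the commenter's actual code:

```python
import asyncio

async def crawl(seed_urls, fetch, num_workers=4):
    """Channel-style crawler: workers pull URLs from a shared queue,
    fetch them, and push newly discovered links back onto the queue."""
    queue = asyncio.Queue()
    seen = set()
    results = {}

    for url in seed_urls:
        seen.add(url)
        queue.put_nowait(url)

    async def worker():
        while True:
            url = await queue.get()
            try:
                links = await fetch(url)  # fetch returns outgoing links
                results[url] = links
                for link in links:
                    if link not in seen:  # dedupe before enqueueing
                        seen.add(link)
                        queue.put_nowait(link)
            finally:
                queue.task_done()

    workers = [asyncio.create_task(worker()) for _ in range(num_workers)]
    await queue.join()  # wait until every queued URL has been processed
    for w in workers:
        w.cancel()
    await asyncio.gather(*workers, return_exceptions=True)
    return results
```

Whether this pattern is written in Rust with tokio or in Python with asyncio, the design decisions (dedupe before enqueue, bounded workers, join-then-cancel shutdown) are the part the commenter says they learned, which is the crux of the disagreement here.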

replies(1): >>44570761 #
13. shaky-carrousel ◴[] No.44570731{4}[source]
I'm afraid that "friction" is your brain learning. Depending on a few AI companies to save you the effort of learning is not increasing autonomy.
replies(1): >>44572324 #
14. sekai ◴[] No.44570733{3}[source]
> Did you write your own letters? Did you write your own arguments? Did you write your own code? I do, and don't depend on systems other built to do so. And losing the ability of keep doing so is a pretty big trade-off, in my opinion.

Gatekeeping at its finest: you're not a "true" software engineer if you're not editing the kernel on your own, locked in a cubicle, with no external help.

replies(1): >>44570852 #
15. sekai ◴[] No.44570753{3}[source]
> We're talking about a developer here so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

A problem solver

replies(1): >>44570860 #
16. shaky-carrousel ◴[] No.44570761{6}[source]
Of course you have learned a lot about Rust. What you haven't learned is to program in Rust. Try, a month from now, to write that application in Rust from scratch, without any LLM help. If you can, then you truly learned to program in Rust. If you can't, then what you learned is just generic trivia about Rust.
17. shaky-carrousel ◴[] No.44570852{4}[source]
That... doesn't even begin to make sense. Defending the ability to code without relying on three big corps is... absolutely unrelated to gatekeeping.
18. shaky-carrousel ◴[] No.44570860{4}[source]
More like a trouble maker.
19. jplusequalt ◴[] No.44570941{4}[source]
>Reducing friction, increasing the scope of what is possible given a unit of effort, that is just increasing autonomy

I, with a car, can drive to the other side of the US and back. I am able to travel to and from places in a way my ancestors never could.

However, the price our society had to pay for this newfound autonomy was that we needed to sacrifice land for highways, move further away from our workplaces, deal with traffic, poison our breathing air with smog, decrease investments into public transportation, etc.

I think people are too gung-ho on new technologies in the tech space without considering the negatives--in part because software developers are egotistical and like to think they know what's best for society. But I wish for once they'd consider the sacrifices we'll have to make as a society by adopting the shiny new toy.

20. bluefirebrand ◴[] No.44570987{4}[source]
> The "not invented here" syndrome is something I kindly suggest you examine

I think AI is leading to a different problem. The "nothing invented here" syndrome

Using LLMs is not the same as offloading the understanding of some code to external library maintainers.

It is offloading the understanding of your own code, the code you are supposed to be the steward of, to the LLM

21. bluefirebrand ◴[] No.44571013[source]
> We all know this

Unfortunately, reading this thread and many other comments on similar articles, it seems like many of us have no clue about this

We are in for a rough ride until we figure this out

22. danenania ◴[] No.44571102{3}[source]
Unless you're writing machine code, you aren't really writing your own code either. You're giving high level instructions, which depend on many complex systems built by thousands of engineers to actually run.
replies(1): >>44574926 #
23. hackinthebochs ◴[] No.44572324{5}[source]
Learning != autonomy. Increasing one's action-space increases autonomy. Learning is only indirectly related. Depending on private companies is limiting, but using LLMs isn't inherently tied to private companies.
replies(1): >>44574953 #
24. shaky-carrousel ◴[] No.44574926{4}[source]
Yes, and my computer is using electricity I'm not directly generating with a bike, but all that is beside the point.
replies(1): >>44576107 #
25. shaky-carrousel ◴[] No.44574953{6}[source]
Ah, learning == autonomy, and using competitive LLMs is very much tied to private companies.
26. danenania ◴[] No.44576107{5}[source]
Yeah you depend on many layers of infrastructure as well for that electricity. It’s exactly the point.

All the criticisms you level at people coding with LLMs apply just as much to your artisanal hand-sculpted code that you’re so proud of.