
LLM Inevitabilism

(tomrenner.com)
1612 points | SwoopsFromAbove | 3 comments
mg No.44568158
In the 90s a friend told me about the internet, and that he knew someone at a university who had access to it and could show us. An hour later, we were sitting in front of a computer at that university, watching his friend surf the web. Clicking on links, receiving pages of text. Faster than one could read. In a nice layout. Even with images. And links to other pages. We were shocked. No printing, no shipping, no waiting. This was the future. It was inevitable.

Yesterday I wanted to rewrite a program to use a large library, which would have required me to dive deep into its documentation or read its code to tackle my use case. As a first try, I just copy+pasted the whole library and my whole program into GPT 4.1 and told it to rewrite the program using the library. It succeeded on the first attempt. The rewrite itself was small enough that I could read all the code changes in 15 minutes and make a few stylistic changes. Done. Hours of time saved. This is the future. It is inevitable.

PS: Most replies seem to compare my experience to the responders' own experiences with agentic coding, where the developer iteratively changes the code by chatting with an LLM. I am not doing that. I use a "One prompt one file. No code edits." approach, which I describe here:

https://www.gibney.org/prompt_coding
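
Concretely, the whole workflow is a single API call: read both files, concatenate them into one prompt, and write out the single file that comes back. A minimal sketch, assuming the OpenAI Python SDK and an API key in the environment; the file names and prompt wording are illustrative, not the actual ones used:

    # One prompt, one file: send the library and the program in a single
    # request and write the rewritten program back out for manual review.
    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    library = Path("library.py").read_text()   # the whole library, verbatim
    program = Path("program.py").read_text()   # the whole program to rewrite

    prompt = (
        "Rewrite the following program so that it uses the library below.\n"
        "Return the complete rewritten program as a single file.\n\n"
        f"--- LIBRARY ---\n{library}\n\n--- PROGRAM ---\n{program}"
    )

    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=[{"role": "user", "content": prompt}],
    )

    # No iterative edits: the model's answer is the new file, reviewed by hand.
    Path("program_rewritten.py").write_text(response.choices[0].message.content)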

replies(58): >>44568182 #>>44568188 #>>44568190 #>>44568192 #>>44568320 #>>44568350 #>>44568360 #>>44568380 #>>44568449 #>>44568468 #>>44568473 #>>44568515 #>>44568537 #>>44568578 #>>44568699 #>>44568746 #>>44568760 #>>44568767 #>>44568791 #>>44568805 #>>44568823 #>>44568844 #>>44568871 #>>44568887 #>>44568901 #>>44568927 #>>44569007 #>>44569010 #>>44569128 #>>44569134 #>>44569145 #>>44569203 #>>44569303 #>>44569320 #>>44569347 #>>44569391 #>>44569396 #>>44569574 #>>44569581 #>>44569584 #>>44569621 #>>44569732 #>>44569761 #>>44569803 #>>44569903 #>>44570005 #>>44570024 #>>44570069 #>>44570120 #>>44570129 #>>44570365 #>>44570482 #>>44570537 #>>44570585 #>>44570642 #>>44570674 #>>44572113 #>>44574176 #
shaky-carrousel No.44569732
Hours of time saved, and you learned nothing in the process. You are slowly becoming a cog in the LLM process instead of an autonomous programmer. You are losing autonomy and depending more and more on external companies. And one day, with all that power, they'll set whatever price or conditions they want. And you will accept them. That's the future. And it's not inevitable.
replies(4): >>44569772 #>>44569796 #>>44569813 #>>44569871 #
baxtr No.44569871
Did you build the house you live in? Did you weave your own clothes or grow your own food?

We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

replies(2): >>44569942 #>>44570007 #
1. Draiken No.44570007
We're talking about a developer here, so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

> We all depend on systems others built. Determining when that trade-off is worthwhile and recognizing when convenience turns into dependence are crucial.

I agree with this and that's exactly what OP is saying: you're now a cog in the LLM pipeline and nothing else.

If we lived in a saner world, this would be purely a net positive, but in our current society it simply means we'll get replaced by the cheaper alternative the second it becomes viable, making any dependence on it extremely risky.

And it's not only individuals. What happens when our governments depend on LLMs from these private corporations to function, and those corporations start the enshittification phase?

replies(1): >>44570753 #
2. sekai No.44570753
> We're talking about a developer here so this analogy does not apply. If a developer doesn't actually develop anything, what exactly is he?

A problem solver

replies(1): >>44570860 #
3. shaky-carrousel No.44570860
More like a troublemaker.