
LLM Inevitabilism

(tomrenner.com)
1616 points by SwoopsFromAbove
mg ◴[] No.44568158[source]
In the 90s a friend told me about the internet, and that he knew someone at a university who had access to it and could show us. An hour later, we were sitting in front of a computer in that university, watching his friend surf the web. Clicking on links, receiving pages of text. Faster than one could read. In a nice layout. Even with images. And links to other pages. We were shocked. No printing, no shipping, no waiting. This was the future. It was inevitable.

Yesterday I wanted to rewrite a program to use a large library, which would have required me to dive deep into its documentation or read its code to tackle my use case. As a first try, I just copy+pasted the whole library and my whole program into GPT 4.1 and told it to rewrite the program using the library. It succeeded on the first attempt. The rewrite itself was small enough that I could read all the code changes in 15 minutes and make a few stylistic changes. Done. Hours of time saved. This is the future. It is inevitable.

PS: Most replies compare my experience to agentic coding, where the developer iteratively changes the code by chatting with an LLM. I am not doing that. I use a "One prompt one file. No code edits." approach, which I describe here:

https://www.gibney.org/prompt_coding
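
(For illustration, here is a minimal sketch of what such a one-shot, one-file workflow could look like in Python, assuming the official OpenAI client. The file names and prompt wording are assumptions, not mg's actual tooling.)

    # Sketch of a "one prompt, one file, no code edits" run.
    from pathlib import Path

    from openai import OpenAI  # official OpenAI Python client

    library = Path("library.py").read_text()
    program = Path("program.py").read_text()

    prompt = (
        "Rewrite the following program so that it uses the library below. "
        "Return the complete rewritten program as a single file.\n\n"
        f"--- library ---\n{library}\n\n--- program ---\n{program}"
    )

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4.1",
        messages=[{"role": "user", "content": prompt}],
    )

    # One output file; the human reviews the whole result by hand afterwards.
    Path("program_rewritten.py").write_text(response.choices[0].message.content)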

replies(58): >>44568182 #>>44568188 #>>44568190 #>>44568192 #>>44568320 #>>44568350 #>>44568360 #>>44568380 #>>44568449 #>>44568468 #>>44568473 #>>44568515 #>>44568537 #>>44568578 #>>44568699 #>>44568746 #>>44568760 #>>44568767 #>>44568791 #>>44568805 #>>44568823 #>>44568844 #>>44568871 #>>44568887 #>>44568901 #>>44568927 #>>44569007 #>>44569010 #>>44569128 #>>44569134 #>>44569145 #>>44569203 #>>44569303 #>>44569320 #>>44569347 #>>44569391 #>>44569396 #>>44569574 #>>44569581 #>>44569584 #>>44569621 #>>44569732 #>>44569761 #>>44569803 #>>44569903 #>>44570005 #>>44570024 #>>44570069 #>>44570120 #>>44570129 #>>44570365 #>>44570482 #>>44570537 #>>44570585 #>>44570642 #>>44570674 #>>44572113 #>>44574176 #
oblio ◴[] No.44568190[source]
The thing is: what is the steady state?

We kind of knew it for the internet, and we basically figured it out early (even if we knew it would take a long time to happen due to generational inertia; see the death of newspapers).

For LLMs, it looks a lot like deindustrialization, aka pain and suffering for a lot of people.

replies(2): >>44568489 #>>44568617 #
ankit219 ◴[] No.44568489{3}[source]
I would disagree that we figured it out early. Early visions for the internet were about things like the information superhighway (a centralized approach). What came to pass was the opposite. It's a good thing. There is a lesson here: we are not always accurate at predicting what the future will look like, but we can always identify trends that may shape it.
replies(2): >>44568564 #>>44569468 #
Nevermark ◴[] No.44568564{4}[source]
The Internet was specifically designed to be maximally decentralized to be robust even to war.

The first web browser was designed to be completely peer to peer.

But you are right about getting it wrong. The peer-to-peer capabilities still exist, but a remarkable amount of what we now consider basic infrastructure is owned by very large centralized corporations, despite long tails of hopeful or niche alternatives.

replies(2): >>44568802 #>>44569260 #
Karrot_Kream ◴[] No.44568802{5}[source]
> The Internet was specifically designed to be maximally decentralized to be robust even to war.

This is a bit naive. Until TLS, traffic from TCP on down was sent in the clear; most traffic used to be sent in the clear. This is what makes packet filtering and DPI possible. Moreover, things like DNS zones and IP address assignment are very centralized. There are cool projects out there that aim to be more decentralized internets, but unfortunately the original Internet was just not very good at being robust.
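
(A minimal sketch of why cleartext traffic makes "deep packet inspection" trivial, assuming Linux raw sockets; it needs root, and the header offsets assume plain IPv4/TCP with no options.)

    import socket

    # Raw AF_PACKET sockets are Linux-only; ETH_P_ALL captures every frame
    # on the interface.
    ETH_P_ALL = 0x0003
    sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_ALL))

    while True:
        frame, _ = sock.recvfrom(65535)
        # Crude header skip: 14B Ethernet + 20B IPv4 + 20B TCP, no options.
        payload = frame[54:]
        if payload.startswith((b"GET ", b"POST", b"HTTP")):
            # Without TLS, request lines, Host headers, and cookies are
            # all readable (and filterable) by any on-path middlebox.
            print(payload[:200])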

replies(1): >>44569453 #
degamad ◴[] No.44569453{6}[source]
It was robust against disruption, but it was not secure against attacks.

The threat model was bombs blowing up routers; intermediaries intercepting traffic was not considered at the time.

replies(1): >>44569861 #
skydhash ◴[] No.44569861[source]
I believe it was because they counted on securing the physical apparatus instead. Are memos secured? Are books secured? At the small scale of the networks at that time, few things were worth securing.