1. Similar cost structure to electricity, but non-essential utility (currently)?
2. Like an operating system, but with non-determinism?
3. Like programming, but ...?
Where does the programming analogy break down?
The programming analogy is convenient but off. The joke about logic bugs has always been "the computer only does exactly what you tell it to do!" Prompts and LLMs most certainly do not work like that.
I did love the parallels he presented between modern LLMs and time-sharing, though.
It quite literally works like that. The computer is now OS + user-land + LLM runner + ML architecture + weights + system prompt + user prompt.
Taken together, and since you're adding probabilities into the mix (by using ML/LLMs), you're quite literally getting "the computer only does exactly what you tell it to do!" It's just that we've sometimes added "but make slight variations to which token you select next" (temperature > 0.0), and that's still the same thing.
It's just like telling the computer to create encrypted content using some seed: you're getting exactly what you asked for.
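To make that concrete, here's a minimal sketch of temperature sampling with a fixed seed (the logits and the sample_next_token helper are made up for illustration, not from any real model or library). Run it twice and you get the same token, because the "randomness" is just another instruction you gave the machine:

```python
import numpy as np

def sample_next_token(logits, temperature=0.8, seed=42):
    """Sample a token index from logits using temperature scaling.

    With a fixed seed, the "randomness" is fully reproducible:
    the computer still does exactly what you tell it to do.
    """
    rng = np.random.default_rng(seed)
    scaled = np.array(logits, dtype=float) / temperature
    # Softmax (shifted by the max for numerical stability)
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    return rng.choice(len(probs), p=probs)

# Toy next-token logits, purely hypothetical (not from a real model)
logits = [2.0, 1.0, 0.5, -1.0]

# Same seed + same temperature -> same token, every run
print(sample_next_token(logits))
print(sample_next_token(logits))  # identical result
```

Temperature > 0.0 only reshapes the distribution you sample from; where the seed comes from is the part you control, same as with any other PRNG-driven program.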