
224 points | azhenley | 1 comment
mehulashah No.45074995
I do believe it’s time for systems folks to take a hard look at building systems abstractions on top of LLMs as they did 50 years ago on top of CPUs. LLMs are the new universal computing machines, but we build with them like we built computers in the 1950s - one computer at a time.
replies(6): >>45075043, >>45075096, >>45075104, >>45075134, >>45075169, >>45075192
csmpltn No.45075043
This is the wrong take.

There's a point beyond which LLMs are overkill, where a simple script or a "classic" program can outdo the LLM on speed, accuracy, scalability, price, and more. LLMs aren't supposed to solve "universal computing". They are another tool in the toolbox, and it's all about using the right tool for the problem.

replies(1): >>45075057
mehulashah No.45075057
I shared your opinion for a while. But that’s not what’s happening. People are using them for everything. When they do, expectations are set. Vendors will adjust, and so will the rest of the industry. It’s happening.
replies(5): >>45075136, >>45075186, >>45075244, >>45075250, >>45075366
baby_souffle No.45075186
> People are using them for everything

Let’s see if that continues to be the case after some time. On a long enough timeline, that deterministic 100-line Python script is going to beat the non-deterministic LLM.

They are a tool. They’re not an omnitool.

replies(2): >>45075321, >>45078056
IX-103 No.45078056
They’re pretty close to being an omnitool. But like any tool, you need to know how to use it.

Asking it to do some common task directly is probably silly when you could instead ask it to write a program that does the task (and add that program to its list of tools if it works).
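
Concretely, the loop might look something like this. It's only a rough sketch: ask_llm() is a hypothetical stand-in for whatever completion API you use, and the tools/ directory layout is made up for illustration, not taken from any real framework.

    # "Have the LLM write the tool once, then reuse it" pattern.
    # ask_llm() is a placeholder; wire it up to whatever LLM client you actually use.
    import subprocess
    import sys
    from pathlib import Path

    TOOLS_DIR = Path("tools")

    def ask_llm(prompt: str) -> str:
        """Placeholder: send the prompt to an LLM and return its raw text reply."""
        raise NotImplementedError("plug in your LLM client here")

    def get_tool(name: str, description: str) -> Path:
        """Return a script for the task, asking the LLM to write it only the first time."""
        script = TOOLS_DIR / f"{name}.py"
        if not script.exists():
            TOOLS_DIR.mkdir(exist_ok=True)
            code = ask_llm(
                f"Write a standalone Python script that {description}. "
                "Read input from stdin, write output to stdout, print nothing else."
            )
            script.write_text(code)
        return script

    def run_tool(name: str, description: str, data: str) -> str:
        """Later invocations are plain deterministic subprocess calls, no LLM involved."""
        script = get_tool(name, description)
        result = subprocess.run(
            [sys.executable, str(script)],
            input=data, capture_output=True, text=True, check=True,
        )
        return result.stdout

The LLM call happens once per task; every run after that is the cheap, deterministic 100-line Python script from upthread.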