There's a point beyond which LLMs are overkill, where a simple script or a "classic" program can outdo the LLM on speed, accuracy, scalability, price, and more. LLMs aren't supposed to solve "universal computing". They are another tool in the toolbox, and it's all about using the right tool for the problem.
Let’s see if that continues to be the case after some time. On a long enough timeline, that deterministic 100-line Python script is going to beat the non-deterministic LLM.
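To make that concrete, here's a toy illustration (the task and code are mine, not from the thread): a few lines of deterministic Python for a common job like pulling email addresses out of text. It returns the same answer on every run at effectively zero cost, whereas an LLM call for the same job is slower, costs tokens per call, and can produce different output each time.

    import re

    # Deterministic: same input -> same output, every run, at ~zero cost.
    # An LLM doing this job would be slower, cost tokens per call, and
    # could return different results across runs.
    EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

    def extract_emails(text: str) -> list[str]:
        """Return all email-like substrings, in order of appearance."""
        return EMAIL_RE.findall(text)

    if __name__ == "__main__":
        sample = "Contact alice@example.com or bob@example.org for details."
        print(extract_emails(sample))  # ['alice@example.com', 'bob@example.org']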
They are a tool. They’re not an omnitool.
The problem is not the tool; the problem is the people selling the hype, the people accepting the hype at face value, and this "everyone" person I keep hearing about. I wonder why his takes are so often wrong, yet he doesn't own up to them...
People can be wrong, and people are wrong in a lot of contexts; I don't think the world is efficient in this sense. We are emotional beings: sell us hype and we'll accept it and run with it.
Agreed that it's not an omnitool. Why are we moving towards an inferior product (AI for VMS??) when it doesn't make sense and we already have a superior product (I think)?
I guess some things don't make sense, and if everyone jumps in a well, most people will follow (maybe including myself, if I didn't want to be contrarian in that sense). My knowledge of AI is really limited, so all of the above statements could be wrong, and hey, I'm wrong, I usually am.
Yeah, and when I was in college, StackOverflow was full of questions like “how do I add 2 numbers with jQuery”. This is normal. The newbies don’t know what they’re doing, and with time they will get enough hard knocks to learn. We’ve all gone through this and still do in areas that are new to us.
LLMs aren’t gonna solve the fundamentals: Seniors still gotta senior and newbies still gotta learn.
Now, most arithmetic is done by computers, spreadsheets are done by computers, and almost all assembly language is written by computers. People have moved on to higher-level programming languages for communicating ideas to computers. Would it really be that surprising to learn that, in the future, people use natural language to speak with computers?
The company I work for went through a big AI push about a month ago. Before then, almost no code was written by AI. Now, it's the majority of code. I'm not saying that AI coders will be able to replace people, because the AI honestly is just not as capable - if it were interviewing for my company, I would not hire it. But the thing is that I can go from design doc to prototype in less than a day and for under $10 of tokens. Sure, I have to make corrections and rewrite some of the more fiddly bits, but it saves me a ton of typing.
And LLMs are not limited to programming. Any computer-based task that you would allow an unpaid intern to complete for you would be a reasonable fit for AI.
Asking it to do some common task is probably silly when you should be asking it to write a program to do the common task (and add the program to its list of tools if it works).
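A minimal sketch of that pattern, assuming a hypothetical llm_generate() standing in for whatever model API you use (here it returns canned source so the sketch runs end to end; the tool name and prompt are made up for illustration): pay for the LLM call once to produce the program, and every later use is an ordinary deterministic script.

    import importlib.util
    import os

    TOOL_PATH = "tools/slugify.py"  # illustrative tool; name is made up

    def llm_generate(prompt: str) -> str:
        # Hypothetical stand-in for a real model API call. It returns a
        # canned implementation here so the sketch runs end to end.
        return (
            "import re\n"
            "def slugify(title):\n"
            "    return '-'.join(re.findall(r'[a-z0-9]+', title.lower()))\n"
        )

    def get_tool():
        # Ask the LLM to WRITE the program once; afterwards it's just a
        # fast, free, repeatable script on disk.
        if not os.path.exists(TOOL_PATH):
            source = llm_generate(
                "Write a Python module with slugify(title) that lowercases, "
                "strips punctuation, and joins words with '-'."
            )
            os.makedirs(os.path.dirname(TOOL_PATH), exist_ok=True)
            with open(TOOL_PATH, "w") as f:
                f.write(source)  # review generated code before trusting it
        spec = importlib.util.spec_from_file_location("slugify_tool", TOOL_PATH)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)
        return module.slugify

    slugify = get_tool()
    print(slugify("Hello, World!"))  # hello-world

Running it a second time skips the LLM entirely and just loads the saved script, which is the whole point.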