As the years go by, I find I can rely on those less and less, because every time I do, I eventually get disappointed by them working against their user base.
I don't think we should be so quick to dismiss the holes LLMs are filling as unnecessary. By some measures, the only things "necessary" are food, water, and shelter.
For me this has been a pretty fundamental shift: before, I either had to figure out another way so I could move on, or had to spend weeks writing one function after learning the needed math; now it can take me 10-30 minutes to get it exactly right.
The solution is to break up monopolies....
Seriously, the idea that we need this is a joke. Some people need it to pretend they can do their job; the rest of us just enjoy having quick help from it. And we did without it for a very long time already.
That I don't know, and probably no one else does either; it's way too early to tell. I only responded to a comment stating "LLMs aren't a fundamental or core technology, they're an amusing party trick", which I obviously disagree with, as for me they've been a fundamental shift in what I'm able to do.
> From the energy consumption impact on climate change alone I would say the answer is clearly no.
Ok, I guess that's fair enough. So if someone happens to run local models at home, in a home powered by solar, then would you feel LLMs are starting to be a net positive for humanity?
> And tons of other issues like how Big Tech is behaving.
This is such a big thing in general (and one I agree with), but it has nothing to do with LLMs as a technology. Big Tech acts like they own the world and can do whatever they want with it, regardless of whether LLMs exist or not, so I'm not sure why anyone would expect anything else.
An LLM can give you a hazy picture, but it's your job to focus it.
Just think about the type of code these things are trained on and the fact you’re clearly some random non-specialist.
I can understand becoming reliant on a technology -- I expect most programmers today would be pretty lost with punch cards or line editors -- but LLM coding seems too new for true reliance to have formed yet...?
That's just a misunderstanding; I'm not "vibing" anything. The tests are written by me, the API interfaces are written by me, the usages are written by me, and the implementations of these functions are written by an LLM, then reviewed by me to make sure they're up to code standards and quality.
If a function gives me the right output for the inputs I have in mind, does anything beyond that really matter?
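To make that split concrete, here's a minimal sketch (entirely hypothetical names, not from any actual project): the test and the function signature are human-written, only the function body is LLM-generated, and that body gets reviewed like any other patch.

    # Hypothetical example: human-written signature and test, LLM-written body.
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two lat/lon points, in km."""
        # --- body generated by the LLM, then human-reviewed ---
        r = 6371.0  # mean Earth radius, km
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlmb = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # Human-written test: this is the contract I actually care about.
    def test_haversine_paris_to_london():
        assert abs(haversine_km(48.8566, 2.3522, 51.5074, -0.1278) - 343.5) < 2.0

The test pins down the inputs and outputs that matter to me, so the review of the generated body is mostly a check that it's sane and idiomatic.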
Sure, that would make a difference, but it's not gonna happen anytime soon outside of hacker hobbyists, because no one is making money off of that.
> This is such a big thing in general (that I agree with) but it has nothing to do with LLMs as a technology.
Correct -- I don't have any issue with the technology itself, but rather with how the technology is implemented and used, and with the resources put towards its use. And Big Tech is putting hundreds of billions of dollars into this -- to what end, exactly, besides potentially making tons of money off of consumer subscriptions or ads à la Meta or Google? If Big Tech were putting the same amount of money into technology that could actually benefit humanity (you know, like actually saving the world from potential future destruction by climate change), I'd have a much kinder view of them.