
LLM Inevitabilism

(tomrenner.com)
1612 points by SwoopsFromAbove | 3 comments | source
lsy ◴[] No.44568114[source]
I think two things can be true simultaneously:

1. LLMs are a new technology and it's hard to put the genie back in the bottle with that. It's difficult to imagine a future where they don't continue to exist in some form, with all the timesaving benefits and social issues that come with them.

2. Almost three years in, companies investing in LLMs have not yet discovered a business model that justifies the massive expenditure of training and hosting them, the majority of consumer usage is at the free tier, the industry is seeing the first signs of pulling back investments, and model capabilities are plateauing at a level where most people agree that the output is trite and unpleasant to consume.

There are many technologies that once seemed inevitable and then retreated for lack of commensurate business returns (the supersonic jetliner), and several that seemed poised to displace both old tech and labor but settled into specific use cases (the microwave oven). Given the lack of a sufficiently profitable business model, it feels as likely as not that LLMs settle somewhere a little less remarkable, and hopefully less annoying, than today's almost universally disliked attempts to cram them everywhere.

replies(26): >>44568145 #>>44568416 #>>44568799 #>>44569151 #>>44569734 #>>44570520 #>>44570663 #>>44570711 #>>44570870 #>>44571050 #>>44571189 #>>44571513 #>>44571570 #>>44572142 #>>44572326 #>>44572360 #>>44572627 #>>44572898 #>>44573137 #>>44573370 #>>44573406 #>>44574774 #>>44575820 #>>44577486 #>>44577751 #>>44577911 #
fendy3002 ◴[] No.44568145[source]
LLMs need significant optimization, or we need a significant improvement in computing power at the same energy cost. It's similar to smartphones: at the start they weren't feasible because of limited computing power, and now we have ones that can rival 2000s notebooks.

LLMs are too trivial to be expensive

EDIT: I presented the statement wrongly. What I mean is that the use cases for LLMs are trivial things; they shouldn't be expensive to operate

replies(4): >>44568305 #>>44568319 #>>44568610 #>>44570702 #
killerstorm ◴[] No.44568610[source]
An LLM can give you thousands of lines of perfectly working code for less than a dollar. How is that trivial, or too expensive?
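
Rough back-of-envelope, using assumed numbers rather than any particular provider's pricing: at roughly 10 output tokens per line of code, 2,000 lines is about 20,000 tokens, and at an assumed $10 per million output tokens that comes to around $0.20.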
replies(3): >>44568731 #>>44568734 #>>44569075 #
sgt101 ◴[] No.44569075[source]
Looking up a project on GitHub, downloading it, and using it can give you 10,000 lines of perfectly working code for free.

Also, when I use Cursor I have to watch it like a hawk or it deletes random bits of code that are needed, or adds extra code to repair imaginary issues. A good example: I used it to write a function that inverted the axis on some data I wanted to present differently, and then to add that call into one of the functions generating the data I needed.

Of course, somewhere in the pipeline it added the call into every data-generating function. Cue a very confused 20 minutes a week later when I was re-running some experiments.
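
To make the failure mode concrete, here is a minimal hypothetical sketch; the function and data names are invented for illustration, not taken from the actual project:

    # Hypothetical reconstruction -- names invented for illustration.
    def invert_axis(values):
        """Flip a series around its maximum so the chart reads top-down."""
        peak = max(values)
        return [peak - v for v in values]

    def generate_error_rates():
        rates = [0.12, 0.08, 0.31]
        return invert_axis(rates)    # the one call I actually asked for

    def generate_throughput():
        samples = [220, 310, 180]
        return invert_axis(samples)  # the agent slipped this in too; this
                                     # series was supposed to stay as-is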

replies(1): >>44569459 #
brulard ◴[] No.44569459[source]
Are you seriously comparing downloading static code from GitHub with bespoke code generated for your specific problem? LLMs don't keep you from coding; they assist it. Sometimes the output works, sometimes it doesn't (on the first try or after several). Dismissing the entire approach because it's not perfect yet is shortsighted.
replies(1): >>44569848 #
1. ozgrakkurt ◴[] No.44569848[source]
They didn't dismiss it; they just said it's not really that useful, which is correct?
replies(2): >>44572257 #>>44575667 #
2. Matticus_Rex ◴[] No.44572257[source]
Many obviously disagree that it's correct
3. brulard ◴[] No.44575667[source]
Obviously YMMV, but it is extremely useful for me and for many people out there.