600 points antirez | 3 comments
dakiol No.44625484
> Gemini 2.5 PRO | Claude Opus 4

Whether it's vibe coding, agentic coding, or copy-pasting from the web interface into your editor, it's still sad to see the normalization of private (i.e., paid) LLM models. I like the progress that LLMs bring and I see them as a powerful tool, but I cannot understand how programmers (whether complete nobodies or popular figures) don't mind adding a strong dependency on a third party in order to keep programming. Programming used to be (and still is, to a large extent) an activity that can be done with open and free tools. I am afraid that in a few years that will no longer be possible (as in: most programmers will be so tied to a paid LLM that not using one would be like not using an IDE or vim today), since everyone is using private LLMs. The excuse "but you earn six figures, what's $200/month to you?" doesn't really capture the issue here.

replies(46): >>44625521 #>>44625545 #>>44625564 #>>44625827 #>>44625858 #>>44625864 #>>44625902 #>>44625949 #>>44626014 #>>44626067 #>>44626198 #>>44626312 #>>44626378 #>>44626479 #>>44626511 #>>44626543 #>>44626556 #>>44626981 #>>44627197 #>>44627415 #>>44627574 #>>44627684 #>>44627879 #>>44628044 #>>44628982 #>>44629019 #>>44629132 #>>44629916 #>>44630173 #>>44630178 #>>44630270 #>>44630351 #>>44630576 #>>44630808 #>>44630939 #>>44631290 #>>44632110 #>>44632489 #>>44632790 #>>44632809 #>>44633267 #>>44633559 #>>44633756 #>>44634841 #>>44635028 #>>44636374 #
1. overgard No.44629019
I think it's an unlikely future.

What I think is more likely is that people will realize that every line of code written is, to an extent, a liability, and that generating massive amounts of sloppy, insecure, poorly performing code is a massive liability.

That's not to say that AIs will go away, obviously, but I think when the hype dies down and people get more accustomed to what these things can and can't do well, we'll have a more nuanced view of where they should be applied.

I suppose what's still not obvious to me is what happens if the investment money dries up. OpenAI and Anthropic, as far as I know, aren't anywhere near profitable, and they require record-breaking amounts of capital just to sustain what they have. If what we currently see is the limit of what LLMs and other generative techniques can do, then I can't see that capital getting a good return on its investment. If that's the case, I wonder whether, when the bubble bursts, these things become massively more expensive to use, or get pulled out of products entirely. (I won't be sad to see all the invasive Copilot buttons disappear.)

replies(1): >>44629416 #
2. kossae No.44629416
The point on investment is apt. Even if these models achieve twice as much as they're able to today (some doubt amongst experts here), we've seen what happens when the VC funding dries up. It's time to pay the piper: prices rise to Enterprise-plan levels, and companies start making much more hard-nosed ROI decisions about these tools once the hype bubble passes. It will be interesting to see how that angle plays out. I'm neither a denier nor a booster, but in a capitalist society these things inevitably balance out.
replies(1): >>44633490 #
3. MoreQARespect No.44633490
The same thing happened with the first internet bubble. It didn't prevent the rise of the internet; it just meant that some players who, for instance, overinvested in infrastructure ended up taking an L, while other players bought up their overbuilt assets for a song and capitalized on them later.