    600 points antirez | 21 comments
    dakiol ◴[] No.44625484[source]
    > Gemini 2.5 PRO | Claude Opus 4

    Whether it's vibe coding, agentic coding, or copy-pasting from the web interface to your editor, it's still sad to see the normalization of private (i.e., paid) LLM models. I like the progress that LLMs introduce and I see them as a powerful tool, but I cannot understand how programmers (whether complete nobodies or popular figures) don't mind adding a strong dependency on a third party in order to keep programming. Programming used to be (and still is, to a large extent) an activity that can be done with open and free tools. I am afraid that in a few years that will no longer be possible (as in: most programmers will be so tied to a paid LLM that not using one would be like not using an IDE or vim nowadays), since everyone is using private LLMs. The excuse "but you earn six figures, what's $200/month to you?" doesn't really capture the issue here.

    replies(46): >>44625521 #>>44625545 #>>44625564 #>>44625827 #>>44625858 #>>44625864 #>>44625902 #>>44625949 #>>44626014 #>>44626067 #>>44626198 #>>44626312 #>>44626378 #>>44626479 #>>44626511 #>>44626543 #>>44626556 #>>44626981 #>>44627197 #>>44627415 #>>44627574 #>>44627684 #>>44627879 #>>44628044 #>>44628982 #>>44629019 #>>44629132 #>>44629916 #>>44630173 #>>44630178 #>>44630270 #>>44630351 #>>44630576 #>>44630808 #>>44630939 #>>44631290 #>>44632110 #>>44632489 #>>44632790 #>>44632809 #>>44633267 #>>44633559 #>>44633756 #>>44634841 #>>44635028 #>>44636374 #
    1. ozgung ◴[] No.44626378[source]
    > The excuse "but you earn six figures, what's $200/month to you?" doesn't really capture the issue here.

    Just like every other subscription model, including the one in the Black Mirror episode, Common People. The value is too good to be true for the price at the beginning. But you become their prisoner in the long run, with increasing prices and degrading quality.

    replies(3): >>44626418 #>>44630789 #>>44633302 #
    2. lencastre ◴[] No.44626418[source]
    Can you expand on your argument?
    replies(6): >>44626510 #>>44626777 #>>44626945 #>>44626948 #>>44627096 #>>44627412 #
    3. jordanbeiber ◴[] No.44626510[source]
    The argument is perhaps ”enshittification”, and that becoming reliant on a specific provider or even set of providers for ”important thing” will become problematic over time.
    4. x______________ ◴[] No.44626777[source]
    Not op, but something from a few days ago that might be interesting for you:

      259. Anthropic tightens usage limits for Claude Code without telling users (techcrunch.com)
     395 points by mfiguiere 2 days ago | hide | 249 comments
    
    https://news.ycombinator.com/item?id=44598254
    5. nico ◴[] No.44626945[source]
    Currently in the front page of HN: https://news.ycombinator.com/item?id=44622953

    It isn’t specific to software/subscriptions, but there are plenty of examples of quality degradation in the comments.

    6. signa11 ◴[] No.44626948[source]
    enshittification/vendor-lockin/stickiness/… take your pick
    7. nicce ◴[] No.44627096[source]
    There is a reason companies throw billions into AI while still not being profitable. They must be the first to hook users for the long run and make the service a necessary part of users’ lives. And then increase the price.
    replies(1): >>44628048 #
    8. majormajor ◴[] No.44627412[source]
    I don't think it's subscriptions so much as consumer startup pricing strategies:

    Netflix/Hulu were "losing money on streaming"-level cheap.

    Uber was "losing money on rides"-level cheap.

    WeWork was "losing money on real-estate" level cheap.

    Until someone releases wildly profitable LLM company financials it's reasonable to expect prices to go up in the future.

    Of course, advances in compute are much more reasonable to expect than advances in cheap media production, taxi driver availability, or office space. So there's a possibility it could be different. But that might require capabilities to hit a hard plateau so that the compute can keep up. And that might make it hard to justify the valuations some of these companies have... which could also lead to price hikes.

    But I'm not as worried as others. None of these have lock-in. If the prices go up, I'm happy to cancel or stop using it.

    For a current student or new grad who has only ever used the LLM tools, this could be a rougher transition...

    Another thing that would change the calculation is if it becomes impossible to maintain large production-level systems competitively without these tools. That's presumably one of the things the companies are betting on. We'll see if they get there. At that point many of us probably have far bigger things to worry about.

    replies(2): >>44628124 #>>44628540 #
    9. mleo ◴[] No.44628048{3}[source]
    Or expect the price of delivering the service to become cheaper. Or both.
    10. bee_rider ◴[] No.44628124{3}[source]
    It isn’t even that unreasonable for the AI companies to not be profitable at the moment (they are probably betting they can decrease costs before they run out of money, and want to offer people something like what the final experience will be). But it’s totally bizarre that people are comparing the cost of running locally to the current investor-subsidized remote costs.

    Eventually, these things should get closer. Eventually the hosted solutions have to make money. Then we’ll see if the costs of securing everything and paying some tech company CEO’s wage are higher than the benefits of centrally locating the inference machines. I expect local running will win, but the future is a mystery.

    replies(1): >>44630507 #
    11. klabb3 ◴[] No.44628540{3}[source]
    > But I'm not as worried as others. None of these have lock-in.

    They will. And when they do it will hit hard, especially if you’re not just a consumer but relying on it for work.

    One vector is personalization. Your LLM gets to know you and your history. They will not release that to a different company.

    Another is integrations. Perhaps you’re using LLMs for assistance, but only Gemini has access to your calendar.

    Cloud used to be ”rent a server”. You could do it anywhere, but AWS was good & cheap. Now how easy is it to migrate? Can you even afford the egress? How easy is it to combine offerings from different cloud providers?
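
    On the egress question, a rough sketch of the arithmetic (the per-GB rate below is an assumed ballpark for large-cloud internet egress, not a quoted price; check current pricing before relying on it):

```python
# Back-of-envelope egress cost for moving data out of a cloud.
# EGRESS_USD_PER_GB is an assumed ballpark rate, not a real quote.
EGRESS_USD_PER_GB = 0.09

def migration_egress_cost(terabytes: float) -> float:
    """Rough USD cost to transfer `terabytes` out over the network."""
    return terabytes * 1024 * EGRESS_USD_PER_GB

for tb in (1, 10, 100):
    print(f"{tb:>4} TB -> ${migration_egress_cost(tb):,.0f}")
```

    Under that assumed rate, moving a hundred terabytes out is already a four-figure bill, which is exactly the kind of friction the parent is pointing at.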

    12. andyferris ◴[] No.44630507{4}[source]
    I think it’s the time slice problem.

    Locally I need to pay for my GPU hardware 24x7. Some electricity but mostly going to be hardware cost at my scale (plus I have excess free energy to burn).

    Remotely I probably use less than an hour of compute a day. And only workdays.

    Combined with batching being computationally more efficient it’s hard to see anything other than local inference ALWAYS being 10x more expensive than data centre inference.

    (Would hope and love to be proven wrong about this as it plays out - but that’s the way I see it now).
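
    To put rough numbers on the time-slice argument (every figure here is an illustrative assumption: hardware price, lifetime, utilization; none of it is real pricing):

```python
# Illustrative "time slice" arithmetic: amortized hardware cost per
# hour of inference actually used. All numbers are made-up assumptions.
GPU_COST_USD = 2000.0        # assumed local hardware price
LIFETIME_YEARS = 3           # assumed useful life of the card

# Local box: the card sits idle except ~1 hour per workday.
HOURS_USED_PER_DAY = 1
WORKDAYS_PER_YEAR = 250
total_used_hours = HOURS_USED_PER_DAY * WORKDAYS_PER_YEAR * LIFETIME_YEARS
local_cost_per_used_hour = GPU_COST_USD / total_used_hours

# Data centre: batching lets many users share the same card, so the
# hardware amortizes over far more busy hours (assume 50% utilization).
dc_utilized_hours = 24 * 365 * LIFETIME_YEARS * 0.5
dc_cost_per_used_hour = GPU_COST_USD / dc_utilized_hours

print(f"local:       ${local_cost_per_used_hour:.2f} per used hour")
print(f"data centre: ${dc_cost_per_used_hour:.2f} per used hour")
print(f"ratio:       {local_cost_per_used_hour / dc_cost_per_used_hour:.1f}x")
```

    Under these made-up numbers the hardware-only gap is already over an order of magnitude; electricity, cooling, and resale value would shift it one way or the other, but the utilization asymmetry is the dominant term.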

    13. Aurornis ◴[] No.44630789[source]
    I don’t get it. There are multiple providers. I cancel one provider and sign up for someone new in a few minutes when I feel like changing. I’ve been doing this every few months.

    I think the only people worried about lock-in or Black Mirror themes are the people who are thinking about these subscriptions in an abstract sense.

    It’s really easy to change providers. They’re all improving. Competition is intense.

    replies(2): >>44631113 #>>44634717 #
    14. dbingham ◴[] No.44631113[source]
    The same, in theory, applies to social media. But they've all enshittified in very similar ways now that they've captured their audiences. In theory there is intense competition between Meta, Twitter, TikTok, etc., but in actuality the same market forces drive the same enshittification across all of those platforms. They have convergent interests. If they all force more ads and suggested posts on you, they all make more money and you have nowhere to go.

    People are reasonably worried that the same will happen to AI.

    replies(1): >>44632421 #
    15. senko ◴[] No.44632421{3}[source]
    > The same, in theory, applies to social media.

    It absolutely does not.

    Your use of a social network derives value from your network. If you switch, you have to convince everyone else to switch as well.

    It's a tremendous barrier to switching.

    LLMs are, for the most part, an interchangeable commodity.

    replies(2): >>44634795 #>>44654964 #
    16. PeterStuer ◴[] No.44633302[source]
    There is that chance. In other instances commoditization occurs before market consolidation.

    As of now, specifically for coding assistance LLM, workflows remain generic enough to have relatively low switching costs between the different models.

    17. darkoob12 ◴[] No.44634717[source]
    In the early days of the Web, competition in the search engine market was intense, but eventually one of them won and became the only viable option. I expect this will happen to AI as well. In the future, only one AI company will dominate the market and people will have no choice but to use it.
    replies(1): >>44635260 #
    18. smokel ◴[] No.44634795{4}[source]
    Note that the comment you are replying to is speculating about the (not so distant) future. Be assured that companies will try their best to lock customers in.

    One option is to add adverts to the generated output and make the product cheaper than the competition. Another is to have all your cloud data preprocessed with LLMs in a non-portable way, so that changing will incur a huge cost.

    19. rwallace ◴[] No.44635260{3}[source]
    I'm seeing quite a few people on this site recently talking about their Kagi subscriptions, claiming it is sufficiently better than Google to be worth the money.
    replies(1): >>44654954 #
    20. abid786 ◴[] No.44654954{4}[source]
    This site is not at all representative of the average internet user, though.
    21. abid786 ◴[] No.44654964{4}[source]
    More and more of the social networks are just the algorithm though: TikTok, X, Facebook, etc. How much of their feed does the average user personally know now?