113 points sethkim | 8 comments
1. guluarte ◴[] No.44457853[source]
They are doing the WeWork approach: gain customers at all costs, even if that means losing money.
replies(2): >>44457901 #>>44466539 #
2. FirmwareBurner ◴[] No.44457901[source]
Aren't all LLMs losing money at this point?
replies(2): >>44457920 #>>44457926 #
3. simonw ◴[] No.44457920[source]
I don't believe that's true of inference: I think most if not all of the major providers sell inference at a (likely very small) margin over what it costs them to serve it (hardware + energy).

They likely lose money when you take into account the capital cost of training the model itself, but that cost is at least fixed: once it's trained, you can serve traffic from it for as long as you choose to keep the model running in production.

replies(2): >>44458459 #>>44466189 #
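The fixed-vs-marginal-cost argument above can be sketched with a few lines of arithmetic. All figures here are invented for illustration, not actual provider numbers: the point is only that a positive per-token inference margin eventually pays back the one-off training cost.

```python
# Hypothetical LLM unit economics (every number below is made up).
TRAINING_COST = 100_000_000       # one-off capital cost to train the model ($)
PRICE_PER_M_TOKENS = 10.0         # price charged per million tokens served ($)
SERVE_COST_PER_M_TOKENS = 8.0     # hardware + energy per million tokens ($)

# Inference itself is profitable on the margin...
margin_per_m = PRICE_PER_M_TOKENS - SERVE_COST_PER_M_TOKENS  # $2 per M tokens

def cumulative_profit(m_tokens_served: float) -> float:
    """Profit after serving traffic: inference margin minus fixed training cost."""
    return margin_per_m * m_tokens_served - TRAINING_COST

# ...but the fixed training cost dominates until enough traffic is served.
break_even_m_tokens = TRAINING_COST / margin_per_m
print(break_even_m_tokens)  # 50,000,000 M tokens, i.e. 50 trillion tokens
```

Under these made-up numbers the provider "loses money" overall until 50 trillion tokens have been served, even though every individual request is sold above its serving cost.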
4. throwawayoldie ◴[] No.44457926[source]
Yes, and the obvious endgame is to wait until most software development is effectively outsourced to them, then jack prices up to whatever they want. The Uber model.
replies(1): >>44459444 #
5. bungalowmunch ◴[] No.44458459{3}[source]
Yes, I would generally agree; although I don't have a source for this, I've heard whispers of Anthropic running at a much higher margin compared to the other labs.
6. FirmwareBurner ◴[] No.44459444{3}[source]
Good thing AI can't replace my drinking-during-work-time skills.
7. guluarte ◴[] No.44466189{3}[source]
Some companies, like Google, Facebook, Microsoft, and OpenAI, are definitely losing money providing free inference to millions of users daily. Companies whose traffic mostly comes through their API, like Anthropic, are probably seeing good margins, since most of their users are paying customers.
8. ethanpailes ◴[] No.44466539[source]
TPUs do give Google a unique structural advantage on inference cost, though.