507 points martinald | 3 comments

_sword ◴[] No.45055003[source]
I've done the modeling on this a few times and I always get to a place where inference can run at 50%+ gross margins, depending mostly on GPU depreciation and how good the host is at optimizing utilization. The real question for the margins is whether you count model training costs as part of the calculation. If model training isn't capitalized and amortized, margins are great. If it is amortized and has to be included... yikes
replies(7): >>45055030 #>>45055275 #>>45055536 #>>45055820 #>>45055835 #>>45056242 #>>45056523 #
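A minimal sketch of that margin math, with entirely made-up numbers (revenue, serving cost, training cost, and the amortization window are all illustrative assumptions, not figures from the thread):

```python
# Hypothetical inference-margin model: the same serving economics, with and
# without the training run amortized into cost of revenue.
revenue_per_month = 100_000_000       # inference revenue, $
serving_cost_per_month = 45_000_000   # depreciated GPUs, power, hosting, $

training_cost = 1_000_000_000         # one-off training run, $
amortization_months = 24              # write the run off over two years

gross_profit = revenue_per_month - serving_cost_per_month
margin_excl_training = gross_profit / revenue_per_month

amortized_training = training_cost / amortization_months
margin_incl_training = (gross_profit - amortized_training) / revenue_per_month

print(f"margin excluding training: {margin_excl_training:.0%}")  # ~55%
print(f"margin including training: {margin_incl_training:.0%}")  # ~13%
```

The gap between those two print lines is the whole "yikes": serving looks comfortably profitable until the training run has to be recovered over its useful life.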
trilogic ◴[] No.45055536[source]
I have to disagree. The biggest costs are still energy consumption, water, and maintenance. Not to mention the cost of keeping up with rivals at an incredibly high tempo (hence offers of billions, like Meta's recently). Then there's the hardware bill, which tracks Nvidia's skyrocketing share price :) No one should dare to talk about profit yet. Now is the time to grab the market, invest a lot, and work hard, hoping for a future profit. The equation is still a work in progress.
replies(3): >>45055568 #>>45055976 #>>45058036 #
1. wtallis ◴[] No.45055568[source]
> The biggest costs are still energy consumption, water, and maintenance.

Are you saying that the operating costs for inference exceed the costs of training?

replies(2): >>45055713 #>>45056170 #
2. umpalumpaaa ◴[] No.45055713[source]
No. But training an LLM is certainly very very expensive and a gamble every time you do it. I think of it a bit like a pharmaceutical company doing vaccine research…
3. trilogic ◴[] No.45056170[source]
The global cost of inference at both OpenAI and Anthropic surely exceeds the training cost by now. The reason is simple: inference cost grows with the number of requests, not with the size of the dataset. My simplified math (worked out with an AI's help):

Suppose training a GPT-like model costs C_T = $10,000,000 and each query costs C_I = $0.002.

Break-even: N > C_T / C_I = 10,000,000 / 0.002 = 5,000,000,000 queries.

So after 5 billion queries, cumulative inference cost surpasses the training cost.

OpenAI claims it has 100 million users; multiply by queries per user and judge for yourself.
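A minimal sketch of that break-even arithmetic, reusing the $10,000,000 / $0.002 figures above; the queries-per-user-per-day value is a hypothetical assumption, not a number from the thread:

```python
# When does cumulative inference spend pass the one-off training cost?
training_cost = 10_000_000      # C_T, $
cost_per_query = 0.002          # C_I, $ per inference

break_even_queries = training_cost / cost_per_query
print(f"break-even after {break_even_queries:,.0f} queries")  # 5,000,000,000

# Scale by the claimed user base, assuming (hypothetically) 10 queries/user/day.
users = 100_000_000
queries_per_user_per_day = 10   # assumption for illustration only
daily_queries = users * queries_per_user_per_day
print(f"days to pass training cost: {break_even_queries / daily_queries:.0f}")  # 5
```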