
507 points martinald | 2 comments
simonw No.45054022
https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat... quotes Sam Altman saying:

> Most of what we're building out at this point is the inference [...] We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.

drob518 No.45054101
Which is like saying, “If all we did is charge people money and didn’t have any COGS, we’d be a very profitable company.” That’s a truism of every business and therefore basically meaningless.
dcre No.45054231
The Amodei quote in my other reply explains why this is wrong. The point is not to compare training the current model against inference on the current model. What makes them lose so much money is that they are training the next model while still earning back the training cost of the current one. So training the next model isn't COGS at all.
prasadjoglekar No.45054361
Well, only if that one trained model could sustain the business on its own. Their amortization window for the training cost is about two months. They can't just stop training and collect $.

They have to build the next model, or else people will go to someone else.

dcre No.45055004
Why two months? It was almost a year between Claude 3.5 and 4. (Not sure how much it costs to go from 3.5 to 3.7.)
Jalad No.45055861
Even being generous and saying it's a year, most capital expenditures depreciate over a period of five to seven years. To state the obvious, training one model a year is not a saving grace.
dcre No.45055983
I don't understand why the absolute time period matters. All that matters is whether you get enough time making money on inference to make up for the cost of training.
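
The disagreement above reduces to simple break-even arithmetic: a model "pays for itself" only if its cumulative inference margin over its useful life exceeds its training cost. A minimal sketch (all dollar figures and function names are hypothetical, invented for illustration, not any company's actual numbers):

```python
def months_to_break_even(training_cost, monthly_inference_margin):
    """Months of inference profit needed to recoup one training run."""
    return training_cost / monthly_inference_margin

def model_is_profitable(training_cost, monthly_inference_margin, lifetime_months):
    """True if the model earns back its training cost before it is replaced."""
    return monthly_inference_margin * lifetime_months >= training_cost

# Illustrative numbers only: a $500M training run earning $50M/month on inference.
print(months_to_break_even(500, 50))       # 10.0 months to break even
print(model_is_profitable(500, 50, 2))     # False: replaced after 2 months
print(model_is_profitable(500, 50, 12))    # True: a year of inference covers it
```

This is why the replacement cadence (two months vs. a year) matters in the thread: shortening the model's lifetime below the break-even point makes each training run a loss, regardless of how profitable inference is per month.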