507 points by martinald

simonw:
https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat... quotes Sam Altman saying:

> Most of what we're building out at this point is the inference [...] We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.

drob518:
Which is like saying, “If all we did is charge people money and didn’t have any COGS, we’d be a very profitable company.” That’s a truism of every business and therefore basically meaningless.

gomox:
I can't imagine the hoops an accountant would have to go through to argue training cost is COGS. In the most obvious stick-figures-for-beginners interpretation, as in, "If I had to explain how a P&L statement works to an AI engineer", training is R&D cost and inference cost is COGS.
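
Roughly, in code: a toy sketch under that classification, with made-up numbers (not anyone's actual figures):

    # Classify costs the stick-figures way: inference -> COGS, training -> R&D.
    revenue        = 10_000  # $M, assumed
    inference_cost = 6_000   # $M, assumed (COGS)
    training_cost  = 7_000   # $M, assumed (R&D, below the gross-margin line)

    gross_profit     = revenue - inference_cost      # "profitable on inference"
    operating_profit = gross_profit - training_cost  # what the company actually earns

    print(f"gross margin:     {gross_profit / revenue:.0%}")      # 40%
    print(f"operating margin: {operating_profit / revenue:.0%}")  # -30%

A company can truthfully report a healthy gross margin while the operating line is deeply negative; the two statements aren't in tension.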

drob518:
I wasn’t using COGS in a GAAP sense, but rather as a synonym for unspecified “costs.” My bad. I suppose you would classify training as development and ongoing datacenter and GPU costs as actual GAAP COGS. My point was, if all you focus on is revenue and ignore the costs of creating your business and keeping it running, it’s pretty easy for any business to be “profitable.”

DenisM:
It’s generally useful to consider unit economics separately from the whole company. If your unit economics are negative, things are very bleak. If they’re positive, your chances go up a lot: scaling the business amortizes fixed (non-unit) costs, such as admin and R&D, and slightly improves unit margins as well.

However, this does not work as well if your fixed (non-unit) costs are growing exponentially. You can’t get out of that hole unless your user base or the value (and price) per user grows exponentially too.

I think this is what Altman is saying: it’s an unusual situation where unit economics are positive but fixed costs are exploding faster than economies of scale can absorb them.

You can say it’s splitting hairs, but an insightful perspective often requires teasing things apart.
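
To make the dynamic concrete, a sketch with assumed numbers: the per-user margin stays positive the whole time, but fixed costs compound faster than the user base can amortize them.

    users            = 1_000_000
    revenue_per_user = 20.0   # $/month, assumed
    cost_per_user    = 12.0   # $/month of inference, assumed
    fixed_costs      = 5e6    # $/month of training + admin, assumed

    for year in range(5):
        contribution = users * (revenue_per_user - cost_per_user)
        profit = contribution - fixed_costs
        print(f"year {year}: profit ${profit / 1e6:,.1f}M "
              f"(fixed costs ${fixed_costs / 1e6:,.1f}M)")
        users = int(users * 1.5)   # user base grows 50%/year
        fixed_costs *= 2.5         # training spend grows 150%/year

Every individual user is profitable in every year, yet total profit goes from about +$3M to roughly -$155M, because the fixed-cost line compounds faster than scale.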

drob518:
It’s splitting a hair, but a pretty important hair. Does anyone think that models won’t need continuous retraining? Does anyone think labs won’t keep trying to scale? Personally, I think we’re reaching diminishing returns with scaling, which is probably good because we’ve basically run out of content to train on, so perhaps that does stop, or at least slow down drastically. But I don’t see a scenario where constant retraining isn’t the norm, even if the amount of content we train on grows only slightly.
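
If retraining recurs, it stops being a one-off and behaves like a fixed cost per model generation, which has to be amortized over the queries that generation serves. A back-of-the-envelope sketch (all numbers assumed):

    training_cost_per_gen = 1e9     # $ per model generation, assumed
    months_per_gen        = 12      # retraining cadence, assumed
    queries_per_month     = 100e9   # assumed
    inference_cost        = 0.002   # $/query, assumed
    price                 = 0.003   # $/query of revenue, assumed

    amortized_training = training_cost_per_gen / (months_per_gen * queries_per_month)
    all_in_cost = inference_cost + amortized_training

    print(f"inference-only margin:   {(price - inference_cost) / price:.0%}")  # 33%
    print(f"margin incl. retraining: {(price - all_in_cost) / price:.0%}")     # 6%

Under these made-up numbers, “profitable on inference” survives the amortization, but the margin shrinks by most of its width; with a shorter retraining cadence or a bigger training run it flips negative.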

gomox:
Well, models are definitely good enough for some things in their current state, without needing to be retrained (machine translation, for example, was largely a solved problem as of GPT-3).

drob518:
That’s true but irrelevant. No AI company is stopping training and further model development. OpenAI didn’t stop with GPT-3, and they won’t stop with GPT-5. No company, AI or otherwise, stops innovating in its market segment; you have to keep innovating to stay competitive.