
507 points martinald | 2 comments | source
gitremote ◴[] No.45052117[source]
These numbers are off.

> $20/month ChatGPT Pro user: Heavy daily usage but token-limited

ChatGPT Pro is $200/month and Sam Altman already admitted that OpenAI is losing money from Pro subscriptions in January 2025:

"insane thing: we are currently losing money on openai pro subscriptions!

people use it much more than we expected."

- Sam Altman, January 6, 2025

https://xcancel.com/sama/status/1876104315296968813

replies(9): >>45052594 #>>45052618 #>>45053078 #>>45053278 #>>45053311 #>>45053620 #>>45053859 #>>45055188 #>>45055732 #
Topfi ◴[] No.45053078[source]
That doesn't seem compatible with what he stated more recently:

> We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.

Source: https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat...

His possible incentives, and the fact that OpenAI isn't a public company, make it hard for us to gauge which of these statements is closer to the truth.

replies(4): >>45053121 #>>45053373 #>>45053429 #>>45053432 #
1. re-thc ◴[] No.45053121[source]
> That doesn't seem compatible with what he stated more recently:

Profitable on inference doesn't mean they aren't losing money on pro plans. What's not compatible?

The API requests are likely making more money.
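Both statements can be true at once. A toy sketch with purely hypothetical segment numbers (none of these figures come from OpenAI) shows how a loss-making flat-rate segment can hide inside an overall-profitable inference business:

```python
# Hypothetical monthly inference P&L by segment. All numbers are
# illustrative assumptions, not actual OpenAI figures.
segments = {
    "api": {"revenue": 100.0, "inference_cost": 40.0},  # usage-based, high margin
    "pro": {"revenue": 20.0,  "inference_cost": 35.0},  # flat-rate, loss-making
}

pro_margin = segments["pro"]["revenue"] - segments["pro"]["inference_cost"]
total_margin = sum(s["revenue"] - s["inference_cost"] for s in segments.values())

print(pro_margin)    # -15.0: the Pro segment loses money...
print(total_margin)  # 45.0: ...while inference overall is profitable
```

The high-margin usage-based API revenue subsidizes the flat-rate losses, so "profitable on inference" and "losing money on Pro subscriptions" are not in conflict.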

replies(1): >>45053561 #
2. gitremote ◴[] No.45053561[source]
Yes, API pricing is usage-based, but ChatGPT Pro pricing is a flat rate per billing period.

The question then is whether SaaS companies paying usage-based GPT API prices can stay profitable while charging their own users a flat rate per billing period. If their users trigger too much inference, those companies lose money the same way.
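The flat-rate-over-usage-based risk can be made concrete with a break-even calculation. The prices below are illustrative assumptions, not actual OpenAI or SaaS rates:

```python
# Break-even sketch for a flat-rate subscription built on top of
# usage-based API pricing. All numbers are illustrative assumptions.

FLAT_RATE = 20.00           # $/month charged to the end user
COST_PER_1M_TOKENS = 10.00  # $ paid to the API provider per 1M tokens

def monthly_margin(tokens_used: int) -> float:
    """Profit (or loss) on one subscriber for one month."""
    api_cost = tokens_used / 1_000_000 * COST_PER_1M_TOKENS
    return FLAT_RATE - api_cost

# Usage level at which the subscriber's API cost equals the flat rate.
break_even_tokens = int(FLAT_RATE / COST_PER_1M_TOKENS * 1_000_000)

print(monthly_margin(500_000))    # light user: +$15.00
print(monthly_margin(5_000_000))  # heavy user: -$30.00
print(break_even_tokens)          # 2,000,000 tokens/month
```

Under these made-up numbers, any subscriber past two million tokens a month is sold at a loss, which is exactly the exposure flat-rate pricing creates on top of usage-based costs.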