
507 points martinald | 2 comments
simonw ◴[] No.45054022[source]
https://www.axios.com/2025/08/15/sam-altman-gpt5-launch-chat... quotes Sam Altman saying:

> Most of what we're building out at this point is the inference [...] We're profitable on inference. If we didn't pay for training, we'd be a very profitable company.

replies(6): >>45054061 #>>45054069 #>>45054101 #>>45054102 #>>45054593 #>>45054858 #
ugh123 ◴[] No.45054069[source]
That might be the case, but inference times have only gone up since GPT-3 (GPT-5 is regularly 20+ seconds for me).
replies(1): >>45054137 #
1. asabla ◴[] No.45054137[source]
And by GPT-5, do you mean through their API? Directly through Azure OpenAI services? Or are you talking about ChatGPT set to use GPT-5?

All of these alternatives mean different things when you say it takes 20+ seconds for a full response.

replies(1): >>45054301 #
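The distinction here matters because "20+ seconds for a full response" conflates time-to-first-token (what a chat UI user perceives as responsiveness) with total generation time. A minimal sketch of separating the two measurements, using a simulated token stream in place of any real API (the `fake_model` generator and its delays are hypothetical stand-ins, not a real endpoint):

```python
import time
from typing import Iterable, Iterator, Tuple

def timed_stream(tokens: Iterable[str]) -> Tuple[str, float, float]:
    """Consume a token stream; return (text, time_to_first_token, total_time) in seconds."""
    start = time.monotonic()
    ttft = None
    parts = []
    for tok in tokens:
        if ttft is None:
            # Latency until the first token arrives: what a UI user feels.
            ttft = time.monotonic() - start
        parts.append(tok)
    total = time.monotonic() - start  # full-response latency
    return "".join(parts), (ttft if ttft is not None else total), total

def fake_model(first_delay: float, per_token: float, tokens: list) -> Iterator[str]:
    # Hypothetical stand-in for a streaming model response.
    time.sleep(first_delay)
    for t in tokens:
        time.sleep(per_token)
        yield t

text, ttft, total = timed_stream(fake_model(0.05, 0.01, ["Hello", ",", " world"]))
print(f"ttft={ttft:.3f}s total={total:.3f}s text={text!r}")
```

Against a real streaming API the same wrapper would apply unchanged; only `fake_model` would be replaced by the response iterator.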
2. ugh123 ◴[] No.45054301[source]
Sure, apologies. I mean the ChatGPT UI.