
321 points jhunter1016 | 6 comments
Roark66 ◴[] No.41878594[source]
>OpenAI plans to lose $5 billion this year

Let that sink in for anyone who has incorporated ChatGPT into their work routines to the point that their normal skills start to atrophy. Imagine that in two years OpenAI goes bust and MS gets all the IP. Now you can't really do your work without ChatGPT, but its cost has been raised to what it really costs to run. Maybe $2k per month per person? And you get about 1h of use per day for that money, too...

I've been saying for ages that being a Luddite and abstaining from using AI is not the answer (no one is tilling the fields with oxen anymore either). But it is crucial to retain, at the very least, 50% of the capability of hosted models like ChatGPT locally.

replies(20): >>41878631 #>>41878635 #>>41878683 #>>41878699 #>>41878717 #>>41878719 #>>41878725 #>>41878727 #>>41878813 #>>41878824 #>>41878984 #>>41880860 #>>41880934 #>>41881556 #>>41881938 #>>41882059 #>>41883046 #>>41883088 #>>41883171 #>>41885425 #
sebzim4500 ◴[] No.41878719[source]
The marginal cost of inference per token is lower than what OpenAI charges you (IIRC by about 2x); they make a loss because of the enormous costs of R&D and of training new models.
replies(4): >>41878823 #>>41878875 #>>41878927 #>>41879029 #
1. tempusalaria ◴[] No.41878927[source]
It’s not clear this is true, because the reported numbers don’t disaggregate paid subscription revenue (certainly massively gross-profit positive) from free usage (certainly negative) and API revenue (probably GP negative).

Most of their revenue is the subscription business, which makes it highly likely they lose money per token on the API (not surprising, since they are in a price war with Google et al.)

If you have an enterprise ChatGPT sub, you'd have to consume around 5mln tokens a month to match the cost of using GPT-4o via the API. At 100 words per minute that’s 35 days of continuous typing, which shows how absurd the gap between API and subscription pricing is.
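The arithmetic behind that claim can be sketched as follows. Both inputs (the 5mln-token break-even and 100 wpm) are the comment's own assumptions, and the 1-token-per-word conversion is a crude simplification; this is a back-of-envelope check, not official pricing math.

```python
# Back-of-envelope check: how long would a person typing nonstop
# at 100 words/minute take to produce 5 million tokens,
# assuming roughly 1 token per word?
TOKENS_PER_MONTH = 5_000_000   # break-even usage from the comment above
WORDS_PER_MINUTE = 100
TOKENS_PER_WORD = 1            # crude simplification; real ratio is ~0.75 words/token

minutes = TOKENS_PER_MONTH * TOKENS_PER_WORD / WORDS_PER_MINUTE
days = minutes / 60 / 24
print(f"{days:.1f} days of continuous typing")  # -> 34.7 days
```

Which rounds to the ~35 days quoted above; with a more realistic words-per-token ratio the figure shrinks somewhat, but the conclusion (no human conversational use approaches the break-even volume) holds either way.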

replies(1): >>41881150 #
2. seizethecheese ◴[] No.41881150[source]
In summary, the original point of this thread is wrong. There’s essentially no future where these tools disappear or become unavailable to consumers at reasonable cost. Much more likely, they get way better.
replies(2): >>41883125 #>>41884310 #
3. jazzyjackson ◴[] No.41883125[source]
I mean, it used to be I could get an Uber across Manhattan for $5.

From my view, chatbots are still in the "selling dollars for 90 cents" category of product; of course it sells like discounted hotcakes...

replies(2): >>41883329 #>>41887013 #
4. seizethecheese ◴[] No.41883329{3}[source]
… this is conflating two things: marginal and average cost/revenue. They are very, very different.
5. tempusalaria ◴[] No.41884310[source]
Definitely they will.

OpenAI’s potential issue is that if Google offers tokens at a 10% gross margin, OpenAI won’t be able to offer API tokens at a positive gross margin at all. Their only real chance is building a big subscription business; there's no way they can compete with a hyperscaler on API cost in the long run.

6. sebzim4500 ◴[] No.41887013{3}[source]
The difference is that Uber was making a loss on those journeys, whereas OpenAI isn't making a loss on ChatGPT subscriptions.

They make a loss overall because they spend a ton on R&D.