
321 points jhunter1016 | 4 comments
Roark66 No.41878594
>OpenAI plans to lose $5 billion this year

Let that sink in for anyone who has incorporated ChatGPT into their work routines to the point that their normal skills start to atrophy. Imagine in two years' time OpenAI goes bust and MS gets all the IP. Now you can't really do your work without ChatGPT, but its price has been brought up to what it really costs to run. Maybe $2k per month per person? And you only get about 1h of use per day for that money too...

I've been saying for ages that being a Luddite and abstaining from using AI is not the answer (no one is tilling the fields with oxen anymore either). But it is crucial to at the very least retain locally 50% of the capability that hosted models like ChatGPT offer.

hmottestad No.41878683
Cost tends to go down with time as compute becomes cheaper. And as long as there is competition in the AI space it's likely that other companies would step in and fill the void created by OpenAI going belly up.
ToucanLoucan No.41878929
> Cost tends to go down with time as compute becomes cheaper.

This is generally true but seems, if anything, to be inverted for AI. These models cost billions in compute to train, and OpenAI has so far needed to put out a brand new one roughly annually in order to stay relevant. This would be akin to Apple putting out a new iPhone that cost billions to engineer year over year, but giving the things away for free on the corner and only asking for money for the versions with more storage and what have you.

The vast majority of AI-adjacent companies, too, are just repackaging OpenAI's LLMs. The exceptions are ones like Meta, which certainly has a more solid basis, being tied to an incredibly profitable product in Facebook, but also... it's Meta, and I'm sure as shit not using their AI for anything, because it's Meta.

I did some back-of-napkin math in a comment a while back and landed on this: in order to break even merely on training costs, not including the rest of the company's expenditure, they would need to charge all of their current subscribers $150 per month, up from... I think the most expensive plan right now is about $20? So nearly an 8-fold price increase, with no attrition, just to break even. And I'm guessing all the investors they've had are not interested in a zero-sum outcome.
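
That back-of-napkin math reduces to one division. A minimal sketch, where the burn rate and subscriber count are purely hypothetical placeholders (the original comment doesn't show its inputs, and these are not OpenAI's real figures):

```python
# Break-even sketch: what monthly price covers a given annual cost?
# Both inputs below are illustrative assumptions, not real OpenAI numbers.

def breakeven_price(annual_cost_usd: float, subscribers: int) -> float:
    """Monthly price per subscriber needed to exactly cover an annual cost."""
    return annual_cost_usd / subscribers / 12

# Hypothetical: ~$5B/year burn spread over ~10M paying subscribers.
price = breakeven_price(5_000_000_000, 10_000_000)
print(f"required: ${price:.2f}/mo vs $20/mo today -> {price / 20:.1f}x increase")
```

The point of the exercise is the shape of the function, not the exact output: the required price scales linearly with the burn rate and inversely with paying subscribers.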

1. Mistletoe No.41881018
The closest analog seems to be Bitcoin mining, which continually increases in difficulty. And if you've ever researched how many Bitcoin miners go under...
2. lukeschlather No.41881146
It's nothing like Bitcoin mining. Bitcoin mining is intentionally designed so that it gets harder as more mining power joins, no matter what.
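
That "harder no matter what" property comes from Bitcoin's retarget rule: every 2016 blocks, difficulty is rescaled so blocks keep averaging ~10 minutes, so more total hashpower just means harder puzzles. A simplified sketch of the rule (real consensus code works on compact targets, not a float):

```python
# Simplified Bitcoin difficulty retarget: every 2016 blocks, scale
# difficulty so that block production returns to ~10-minute averages.

TARGET_BLOCK_TIME = 600   # seconds per block the network aims for
RETARGET_INTERVAL = 2016  # blocks between difficulty adjustments

def next_difficulty(current_difficulty: float, actual_seconds: float) -> float:
    """New difficulty after a retarget window that took actual_seconds."""
    expected = TARGET_BLOCK_TIME * RETARGET_INTERVAL
    ratio = expected / actual_seconds
    # Consensus clamps any single adjustment to a factor of 4 either way.
    ratio = max(0.25, min(4.0, ratio))
    return current_difficulty * ratio

# If miners found 2016 blocks in half the expected time, difficulty doubles.
print(next_difficulty(1.0, TARGET_BLOCK_TIME * RETARGET_INTERVAL / 2))
```

LLM inference has no such feedback loop: serving the same model doesn't get harder just because more people are serving it.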

With LLMs, if you have a use case that can run on an H100 or whatever at $4/hour, and the LLM has acceptable performance, it's going to be cheaper in a couple of years.

Now, all these companies are improving their models, but they're doing that in search of magical new applications that the $4/hour model I'm using today can't handle. If the $4/hour model works today, you don't have to worry about the cost going up; it will work at the same price or cheaper in the future.
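
The "fixed workload gets cheaper" argument can be made concrete with unit economics. A sketch where the GPU rental price and token throughput are illustrative assumptions, not measured figures:

```python
# Sketch: inference cost per million tokens from a GPU's hourly rental
# price and its sustained token throughput. All figures are illustrative.

def cost_per_million_tokens(gpu_hourly_usd: float, tokens_per_second: float) -> float:
    """Dollars per 1M generated tokens on a rented GPU."""
    tokens_per_hour = tokens_per_second * 3600
    return gpu_hourly_usd / tokens_per_hour * 1_000_000

today = cost_per_million_tokens(4.0, 1000)  # hypothetical $4/hr card, 1k tok/s
later = cost_per_million_tokens(2.0, 2000)  # hypothetical cheaper, faster hardware
print(f"${today:.2f} -> ${later:.2f} per 1M tokens")
```

Holding the model fixed, any drop in rental price or rise in throughput moves the per-token cost down, which is the comment's point: a workload that clears the bar today only gets cheaper.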

3. Mistletoe No.41881381
But OpenAI has to keep releasing new ever-increasing models to justify it all. There is a reason they are talking about nuclear reactors and Sam needing 7 trillion dollars.

One other difference from Bitcoin is that the price of Bitcoin rises to make it all worth it. With AI we have the opposite expectation: users will eventually need to pay much more than they do now, yet people only use it now because it is free or heavily subsidized. I agree that current models are pretty good, and their price may come down over time, but that should be even more concerning to OpenAI.

4. kergonath No.41882295
> But OpenAI has to keep releasing new ever-increasing models to justify it all.

There seems to be some renewed interest in smaller, possibly better-designed LLMs. I don’t know whether this really lowers training costs, but it makes inference cheaper. I suspect at some point we’ll have clusters of smaller models, possibly activated when needed like in MoE LLMs, rather than ever-increasing humongous models with 3T parameters.
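
The "cluster of small models activated when needed" idea can be sketched with a toy router. Everything here is a stand-in: real MoE LLMs use a learned softmax gate over expert sub-networks inside the model, not keyword matching over whole models, but the shape is the same: only the selected expert runs, so inference cost stays per-expert rather than per-ensemble.

```python
# Toy sketch of MoE-style routing: a gate picks which small "expert"
# handles an input, instead of one giant model processing everything.
# The experts are trivial placeholder functions, and the keyword gate
# stands in for a learned router.

EXPERTS = {
    "code": lambda q: f"[code expert] {q}",
    "math": lambda q: f"[math expert] {q}",
    "chat": lambda q: f"[chat expert] {q}",
}

def route(query: str) -> str:
    """Naive gate: send the query to exactly one expert."""
    if "def " in query or "bug" in query:
        return EXPERTS["code"](query)
    if any(ch.isdigit() for ch in query):
        return EXPERTS["math"](query)
    return EXPERTS["chat"](query)

print(route("what is 2+2"))
```

Only one expert's cost is paid per query, which is why MoE-style designs decouple total parameter count from per-token inference cost.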