
197 points baylearn | 7 comments
1. JunkDNA
I keep seeing this charge that AI companies have an "Uber problem," meaning the business is heavily subsidized by VC money. Has anyone published an analysis of how the economics actually break down (training vs. inference costs against current pricing)? At least with Uber you had a cab fare as a benchmark. But what should, for example, ChatGPT actually cost me per month without the VC subsidy? How far off are we?
2. cratermoon
https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the...
3. JunkDNA
This article isn't particularly helpful. It focuses on specific OpenAI business decisions that aren't necessarily generalizable to the rest of the industry. OpenAI itself might be out over its skis, but what I'm asking about is the broader accusation that AI in general is heavily subsidized. When the music stops, what does the price of AI look like? The going rate for chatbots like ChatGPT is $20/month. Does that go to $40 a month? $400? $4,000?
4. handfuloflight
How much would OpenAI be burning per month if each monthly active user cost them $40? $400? $4,000?

The numbers would bankrupt them within weeks.
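
To put rough numbers on that, here's a minimal sketch; the user count is hypothetical and chosen only for scale, and it ignores that most ChatGPT users are on the free tier and pay nothing:

    # Back-of-the-envelope burn rate; all figures hypothetical, for scale only.
    monthly_price = 20          # $/month a paying subscriber contributes
    active_users = 100_000_000  # hypothetical monthly active users

    for cost_per_user in (40, 400, 4000):
        burn = (cost_per_user - monthly_price) * active_users
        print(f"${cost_per_user}/user/month -> burn ${burn / 1e9:.0f}B/month")

Even at $40 of true cost per user, the shortfall is billions per month; at $400 or $4,000 it is tens to hundreds of billions.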

5. fragmede
It depends on how far behind you believe the open-weight LLMs are. Say I buy $10k of hardware and run a sufficiently equivalent LLM at home. Amortized over 5 years, that's $2k/yr plus electricity; at 40 hours a week for 50 weeks (2,000 hours/yr), that's $1/hr plus electricity. Electrical cost varies by location, but let's handwave another $1/hr (which should be high). So roughly $2/hr, versus ChatGPT's $0.11/hr if you pay $20/month and use it 174 hours per month.
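
Spelled out as a quick sketch, using the same numbers as above:

    # The comment's amortization arithmetic, step by step.
    hardware_cost = 10_000               # $ of local inference hardware
    years = 5
    hours_per_year = 40 * 50             # 40 h/week, 50 weeks = 2,000 h/year

    hw_per_hour = hardware_cost / (years * hours_per_year)  # $1.00/h
    electricity = 1.00                   # $/h, deliberately high handwave
    local = hw_per_hour + electricity    # $2.00/h all-in

    chatgpt = 20 / 174                   # $20/month over ~174 h (~40 h/week)
    print(f"local ${local:.2f}/h vs ChatGPT ${chatgpt:.2f}/h")  # $2.00 vs $0.11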

Feel free to challenge these numbers, but it's a starting place. What's not accounted for is the cost of training (compute time, but also employees and everything else), which needs to be amortized over the length of time a model is used; that raises ChatGPT's true costs significantly, though they do have the advantage that hardware is shared across many users.
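
As an illustration of that amortization, every figure below is hypothetical:

    # Spreading one training run's cost across its users over its lifetime.
    training_cost = 100_000_000   # $ per frontier training run (made-up figure)
    service_months = 12           # how long the model stays in production
    paying_users = 10_000_000     # hypothetical subscriber count

    per_user = training_cost / service_months / paying_users
    print(f"training adds ~${per_user:.2f}/user/month")  # ~$0.83

With enough users, even a nine-figure training run adds well under a dollar per user per month; with few users or a short model lifetime, it dominates.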

6. nbardy
These estimates are way off. With the right serving infrastructure, additional concurrent requests are nearly free: on a fully saturated node, the cost per token is 1/100th to 1/1000th of the single-user price.
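
A toy model of that claim; all numbers are hypothetical, and real batching gains taper off as the node saturates rather than scaling linearly:

    # A node costs roughly the same per hour whether it serves 1 user or
    # hundreds, so cost per token falls with concurrency.
    node_cost_per_hour = 10.0     # $/h for a GPU node (hypothetical)
    tokens_per_sec_one_user = 50  # one interactive user's decode rate (hypothetical)

    for concurrency in (1, 10, 100, 1000):
        # idealized: linear batching gains up to saturation
        tokens_per_hour = tokens_per_sec_one_user * concurrency * 3600
        dollars_per_mtok = node_cost_per_hour / tokens_per_hour * 1e6
        print(f"{concurrency:>5} concurrent -> ${dollars_per_mtok:.3f}/Mtok")

Under these assumptions, going from 1 to 1000 concurrent users cuts the per-token cost by a factor of 1000, which is where the 1/100 to 1/1000 range comes from.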
7. cratermoon
OK, how about another article, one that covers the other big players, including Anthropic, Microsoft, and Google: https://www.wheresyoured.at/reality-check/